Nondestructive measurement of the tritium (T) content of solid materials is important for safe and cost-effective disposal of contaminated waste, and beta-ray induced X-ray spectrometry (BIXS) has been developed for this purpose. A common way to obtain T depth profiles in solids with BIXS is to simulate X-ray spectra for assumed depth profiles and to find the profile that best reproduces the observation. Interpreting and simulating the measured spectra requires a detailed understanding of how low-energy X-rays (≤18.6 keV) are attenuated by detector components such as the window material. In this study, BIXS spectra of a tungsten reference sample with a known T depth profile were measured with two different semiconductor detectors and simulated with the Monte Carlo simulation toolkit Geant4. In the low-energy region (<2 keV), the difference in the internal structure of the two detectors produced a noticeable difference between their BIXS spectra, and the disagreement between the measured and the simulated spectra was also significant in this region. Above 2 keV, in contrast, the BIXS spectra were insensitive to the internal structure of the detector, and the simulated spectra agreed well with the measured ones. The mechanism underlying the difference in the low-energy region is discussed.
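
As a rough illustration of the profile-fitting step described above, the sketch below compares a measured BIXS spectrum with spectra simulated (e.g., by Geant4) for several assumed T depth profiles and keeps the profile giving the smallest chi-square. This is a minimal sketch under stated assumptions, not the authors' actual procedure: the file names, the candidate profile labels, and the use of a Pearson chi-square as the agreement criterion are all illustrative.

```python
# Minimal sketch: pick the assumed T depth profile whose simulated BIXS
# spectrum best matches the measured one. The simulated spectra would
# normally come from Geant4 runs; here they are read from placeholder
# files whose names are purely illustrative.
import numpy as np

def chi_square(measured, simulated):
    """Pearson chi-square between a measured and a simulated count spectrum.

    Bins with zero measured counts are skipped to avoid division by zero.
    """
    mask = measured > 0
    return np.sum((measured[mask] - simulated[mask]) ** 2 / measured[mask])

# Hypothetical inputs: one measured spectrum and simulated spectra for
# several assumed depth profiles (e.g., uniform, surface-enriched, exponential).
measured = np.loadtxt("measured_spectrum.txt")   # counts per energy bin
candidates = {
    "uniform": np.loadtxt("sim_uniform.txt"),
    "surface": np.loadtxt("sim_surface.txt"),
    "exponential": np.loadtxt("sim_exponential.txt"),
}

# Scale each simulation to the measured total counts before comparison,
# then keep the profile with the smallest chi-square.
scores = {}
for name, sim in candidates.items():
    scaled = sim * measured.sum() / sim.sum()
    scores[name] = chi_square(measured, scaled)

best = min(scores, key=scores.get)
print(f"Best-fitting assumed profile: {best} (chi-square = {scores[best]:.1f})")
```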