The Inverse Depletion Theory (INDEPTH) code is one of the tools used to analyze traditional nondestructive assay (NDA) measurements and verify the initial enrichment, burnup, and cooling time of spent nuclear fuel (SNF) declared by facilities. INDEPTH attempts to reconstruct the initial enrichment and operating history by using the Oak Ridge Isotope Generation (ORIGEN) code to simulate irradiation and cooling of the fuel. This work examined the sensitivity of INDEPTH results to variations in irradiation conditions: three types of measured data were simulated to identify possible sources of systematic error. In most cases, an absolute gamma measurement combined with a gross neutron count produced more accurate results than either a relative gamma measurement or an absolute gamma measurement alone. However, long shutdown times between irradiation cycles were found to greatly degrade accuracy, with the absolute gamma plus gross neutron case losing the most accuracy. In these cases, the added neutron data either did not significantly improve the results or made them worse.
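The inverse-analysis idea described above can be illustrated with a minimal sketch: a forward model predicts measurable signatures from assumed fuel parameters, and the inverse step searches for the parameters whose predictions best match the measurement. This is a hypothetical toy, not INDEPTH itself; the real code drives ORIGEN as its forward model, whereas here the Cs-137 activity and a burnup-sensitive Cs-134/Cs-137-style ratio are replaced by simple assumed proportionalities, and the optimizer is a plain grid search.

```python
import math

# Toy forward model for inverse depletion analysis (illustrative only;
# INDEPTH uses ORIGEN simulations, not these assumed proportionalities).
L_CS137 = math.log(2) / 30.1  # Cs-137 decay constant, 1/yr
L_CS134 = math.log(2) / 2.06  # Cs-134 decay constant, 1/yr

def forward(burnup, cool_yr):
    """Predict two 'measured' signatures from burnup (GWd/tU) and cooling time (yr)."""
    cs137 = burnup * math.exp(-L_CS137 * cool_yr)              # taken ~linear in burnup
    ratio = burnup * math.exp(-(L_CS134 - L_CS137) * cool_yr)  # decays faster with cooling
    return cs137, ratio

def invert(meas, burnups, cool_times):
    """Grid search: the (burnup, cooling time) whose prediction best matches meas."""
    best, best_err = None, float("inf")
    for b in burnups:
        for t in cool_times:
            pred = forward(b, t)
            # Sum of squared relative residuals over the measured signatures.
            err = sum((p - m) ** 2 / m**2 for p, m in zip(pred, meas))
            if err < best_err:
                best, best_err = (b, t), err
    return best

# Synthesize a "measurement" at 40 GWd/tU and 5 yr cooling, then recover it.
truth = (40.0, 5.0)
meas = forward(*truth)
grid_b = [b / 2 for b in range(20, 121)]  # 10-60 GWd/tU in 0.5 steps
grid_t = [t / 2 for t in range(0, 41)]    # 0-20 yr in 0.5 steps
print(invert(meas, grid_b, grid_t))       # → (40.0, 5.0)
```

The sketch also hints at the systematic-error mechanism studied in this work: if the forward model's assumed operating history (e.g., shutdown intervals between cycles) differs from reality, the best-fit parameters absorb that mismatch as bias.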