ANS is committed to advancing, fostering, and promoting the development and application of nuclear sciences and technologies to benefit society.
Latest News
Powering the future: How the DOE is fueling nuclear fuel cycle research and development
As global interest in nuclear energy surges, the United States must remain at the forefront of research and development to ensure national energy security, advance nuclear technologies, and promote international cooperation on safety and nonproliferation. A crucial step in achieving this is analyzing how funding and resources are allocated to better understand how to direct future research and development. The Department of Energy has spearheaded this effort by funding hundreds of research projects across the country through the Nuclear Energy University Program (NEUP). This initiative has empowered dozens of universities to collaborate toward a nuclear-friendly future.
B. L. Broadhead, B. T. Rearden, C. M. Hopper, J. J. Wagschal, C. V. Parks
Nuclear Science and Engineering | Volume 146 | Number 3 | March 2004 | Pages 340–366
Technical Paper | doi.org/10.13182/NSE03-2
Articles are hosted by Taylor and Francis Online.
The theoretical basis for the application of sensitivity and uncertainty (S/U) analysis methods to the validation of benchmark data sets for use in criticality safety applications is developed. Sensitivity analyses produce energy-dependent sensitivity coefficients that give the relative change in the system multiplication factor keff as a function of relative changes in the cross-section data by isotope, reaction, and energy. Integral indices are then developed that use the sensitivity information to quantify similarities between pairs of systems, typically a benchmark experiment and a design system. Uncertainty analyses provide an estimate of the uncertainties in the calculated values of the system keff due to cross-section uncertainties, as well as the correlation in the keff uncertainties between systems. These uncertainty correlations provide an additional measure of system similarity. The use of the similarity measures from both S/U analyses in the formal determination of areas of applicability for benchmark experiments is developed. Furthermore, the use of these similarity measures as a trending parameter for the estimation of the computational bias and uncertainty is explored. The S/U analysis results, along with the calculated and measured keff values and estimates of uncertainties in the measurements, were used in this work to demonstrate application of the generalized linear-least-squares methodology (GLLSM) to data validation for criticality safety studies. An illustrative example is used to demonstrate the application of these S/U analysis procedures to actual criticality safety problems.
Computational biases, uncertainties, and the upper subcritical limit for the example applications are determined with the new methods and compared to those obtained through traditional criticality safety analysis validation techniques. The GLLSM procedure is also applied to determine cutoff values for the similarity indices such that the applicability of a benchmark experiment to a criticality safety design system can be assured. Additionally, the GLLSM procedure is used to determine how many applicable benchmark experiments exceeding a certain degree of similarity are necessary for an accurate assessment of the computational bias.
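The two quantities at the heart of the abstract can be illustrated with a small numerical sketch: an energy-group sensitivity coefficient S_g = (dk/k)/(dσ_g/σ_g), estimated here by direct perturbation, and the uncertainty-correlation similarity index c_k = s1ᵀC s2 / sqrt((s1ᵀC s1)(s2ᵀC s2)) built from two systems' sensitivity vectors and a shared cross-section covariance matrix C. This is a toy illustration with synthetic data, not the paper's method; the function names and the stand-in keff model are hypothetical, and production analyses obtain sensitivities from adjoint-based codes rather than repeated perturbation.

```python
import numpy as np

def sensitivity_by_perturbation(keff_func, sigma, rel_step=0.01):
    """Central-difference estimate of the relative sensitivity of keff
    to each energy-group cross section: S_g = (dk/k) / (dsigma_g/sigma_g).
    `keff_func` is a stand-in for a full transport calculation."""
    base = keff_func(sigma)
    sens = np.zeros_like(sigma)
    for g in range(len(sigma)):
        up = sigma.copy(); up[g] *= 1 + rel_step
        dn = sigma.copy(); dn[g] *= 1 - rel_step
        # Relative change in keff per relative change in sigma_g
        sens[g] = (keff_func(up) - keff_func(dn)) / (2 * rel_step * base)
    return sens

def ck_index(s1, s2, cov):
    """Correlation of keff uncertainties between two systems sharing
    the cross-section covariance matrix `cov`; c_k = 1 means the two
    systems' keff uncertainties are fully correlated."""
    v12 = s1 @ cov @ s2
    return v12 / np.sqrt((s1 @ cov @ s1) * (s2 @ cov @ s2))

# Synthetic two-group example: keff modeled as linear in the cross sections.
toy_keff = lambda s: 1.0 + 0.1 * s.sum()
sigma = np.array([1.0, 2.0])
s_bench = sensitivity_by_perturbation(toy_keff, sigma)
print(ck_index(s_bench, s_bench, np.eye(2)))  # identical systems -> 1.0
```

In this picture, the paper's cutoff determination amounts to asking how large c_k between a benchmark and a design system must be before the benchmark counts toward the bias assessment.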