B. L. Broadhead, B. T. Rearden, C. M. Hopper, J. J. Wagschal, C. V. Parks
Nuclear Science and Engineering | Volume 146 | Number 3 | March 2004 | Pages 340-366
Technical Paper | doi.org/10.13182/NSE03-2
The theoretical basis for the application of sensitivity and uncertainty (S/U) analysis methods to the validation of benchmark data sets for use in criticality safety applications is developed. Sensitivity analyses produce energy-dependent sensitivity coefficients that give the relative change in the system multiplication factor keff as a function of relative changes in the cross-section data by isotope, reaction, and energy. Integral indices are then developed that utilize the sensitivity information to quantify similarities between pairs of systems, typically a benchmark experiment and a design system. Uncertainty analyses provide an estimate of the uncertainties in the calculated values of the system keff due to cross-section uncertainties, as well as correlations in the keff uncertainties between systems. These uncertainty correlations provide an additional measure of system similarity. The use of the similarity measures from both S/U analyses in the formal determination of areas of applicability for benchmark experiments is developed. Furthermore, the use of these similarity measures as a trending parameter for the estimation of the computational bias and uncertainty is explored. The S/U analysis results, along with the calculated and measured keff values and estimates of uncertainties in the measurements, were used in this work to demonstrate application of the generalized linear-least-squares methodology (GLLSM) to data validation for criticality safety studies.

An illustrative example is used to demonstrate the application of these S/U analysis procedures to actual criticality safety problems.
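As a rough illustration of the covariance-weighted similarity idea described above, the sketch below computes a correlation-style index between two systems from their energy-dependent keff sensitivity vectors and a shared cross-section covariance matrix. The function name, the three-group structure, and all numerical values are invented for illustration; this is a simplified stand-in, not the paper's exact formulation.

```python
import numpy as np

def similarity_index(s1, s2, cov):
    """Covariance-weighted correlation between two sensitivity vectors:
    (s1' C s2) / sqrt((s1' C s1)(s2' C s2)). Values near 1 indicate
    systems whose keff uncertainties are driven by the same data."""
    num = s1 @ cov @ s2
    den = np.sqrt((s1 @ cov @ s1) * (s2 @ cov @ s2))
    return num / den

# Toy 3-group example (hypothetical data): relative cross-section
# covariance on the diagonal, sensitivities dk/k per dsigma/sigma.
cov = np.diag([0.04, 0.01, 0.02])
s_benchmark = np.array([0.3, 0.5, 0.2])
s_design = np.array([0.6, 0.2, 0.2])

print(similarity_index(s_benchmark, s_benchmark, cov))  # identical systems: ~1
print(similarity_index(s_benchmark, s_design, cov))     # dissimilar: < 1
```

Identical sensitivity profiles give an index of one; the index falls as the two systems' sensitivities diverge in the energy groups that carry the largest cross-section uncertainty.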
Computational biases, uncertainties, and the upper subcritical limit for the example applications are determined with the new methods and compared to those obtained through traditional criticality safety analysis validation techniques. The GLLSM procedure is also applied to determine cutoff values for the similarity indices such that applicability of a benchmark experiment to a criticality safety design system can be assured. Additionally, the GLLSM procedure is used to determine how many applicable benchmark experiments exceeding a certain degree of similarity are necessary for an accurate assessment of the computational bias.
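The trending-parameter idea the abstract mentions can be sketched as a weighted fit of the computational bias (calculated minus measured keff) against a similarity index, extrapolated to full similarity. A weighted linear least-squares line is used here as a simplified stand-in for the paper's GLLSM treatment, and every data value below is invented for illustration.

```python
import numpy as np

# Hypothetical benchmark data: similarity index of each experiment to the
# design application, the observed bias keff(calc) - keff(meas), and the
# measurement uncertainty used to weight the fit.
ck = np.array([0.80, 0.85, 0.90, 0.95, 0.98])
bias = np.array([0.004, 0.003, 0.003, 0.002, 0.001])
sigma = np.array([0.002, 0.002, 0.001, 0.001, 0.001])

# Weighted least-squares fit: bias ~ a + b * ck, weights 1/sigma^2.
A = np.vstack([np.ones_like(ck), ck]).T
W = np.diag(1.0 / sigma**2)
a, b = np.linalg.solve(A.T @ W @ A, A.T @ W @ bias)

# Bias estimate for a system fully similar to the benchmarks (index -> 1).
bias_at_full_similarity = a + b * 1.0
print(f"trend slope: {b:.4f}, extrapolated bias: {bias_at_full_similarity:.5f}")
```

In this toy data set the bias shrinks as similarity grows, so the fitted slope is negative and the extrapolated bias at an index of one is smaller than any observed bias; a criticality safety validation would then fold this bias and its fit uncertainty into the upper subcritical limit.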