Watch the full webinar here.
Background and stats: RP3C is a special committee created by the ANS Standards Board and chaired by Steven Krahn that provides guidance to ANS standards committees on the use of risk-informed, performance-based (RIPB) methods. The CoP series is part of RP3C’s charter, which includes training on and knowledge sharing of RIPB principles, giving practitioners a venue to exchange ideas outside of normal management and project processes. Over the past five years, CoPs have frequently been used by organizations to help break down barriers that impede the flow of information.
Benson’s presentation was the final installment of the year and one of 10 CoP sessions held in 2025. Those presentations ranged in topic from digital frameworks to advanced reactors to external hazards and more, and they were given by leaders from universities, national laboratories, and private industry. On average, each presentation was attended by about 50 people, and the online recordings have each been viewed an average of 220 times.
The presentation: This most recent presentation was a bit different from the average CoP. For one, Benson does not hold a degree in a nuclear discipline. His expertise is in hydrology and geological and environmental engineering, and he works on the back end of radioactive waste operations, largely in mill tailings disposal, low-level waste, and mixed low-level waste facilities.
However, his presentation fits right in with the broader CoP programming because, as he put it, “all of those disposal facilities for [waste] materials are designed and implemented based on a risk-informed, performance-based approach.”
Benson’s presentation specifically focused on the details of a 22-year-long field monitoring program at the Monticello Uranium Mill Tailings Disposal Site in Utah. It particularly focused on the hydrologic monitoring of the water balance final cover that was placed over the waste at the site.
According to Benson, “The waste disposal unit at the site relies on a natural system type of design for the containment.” He explained how his team demonstrated that the design functioned as their performance assessment modeling had predicted.
Proving the validity and value of that modeling was particularly important in the early 1990s, when this project began. At that time, Benson explained, “There was a great deal of skepticism about whether we could actually make predictions and develop conceptual designs that would be borne out in the field.” He continued, “There was a real need, then, to validate what we had in our performance assessment with full-scale field data.”