ANS is committed to advancing, fostering, and promoting the development and application of nuclear sciences and technologies to benefit society.
Conference Spotlight
2026 ANS Annual Conference
May 31–June 3, 2026
Denver, CO | Sheraton Denver
Latest News
Hanford begins removing waste from 24th single-shell tank
The Department of Energy’s Office of Environmental Management said crews at the Hanford Site near Richland, Wash., have started retrieving radioactive waste from Tank A-106, a 1-million-gallon underground storage tank built in the 1950s.
Tank A-106 will be the 24th single-shell tank that crews have cleaned out at Hanford, which is home to 177 underground waste storage tanks: 149 single-shell tanks and 28 double-shell tanks. Ranging from 55,000 gallons to more than 1 million gallons in capacity, the tanks hold around 56 million gallons of chemical and radioactive waste resulting from plutonium production at the site.
Pengfei Fu, Licao Dai
Nuclear Technology | Volume 212 | Number 2 | February 2026 | Pages 476–489
Regular Research Article | doi.org/10.1080/00295450.2025.2472526
Articles are hosted by Taylor & Francis Online.
In human reliability analysis (HRA) for nuclear power plants, cognitive modeling–based approaches require the development of an operator knowledge base to simulate the cognitive processes of operators. However, existing automatic extraction methods fail to provide knowledge that meets the granularity requirements of cognitive modeling for the development of the operator knowledge base.
To address this gap, this paper proposes a deep learning–based extraction method. Specifically, the method uses a bidirectional encoder representations from transformers (BERT)–bidirectional long short-term memory (Bi-LSTM)–conditional random field (CRF) model to perform sequence labeling, extracting fine-grained knowledge such as entities and their corresponding states, as well as the causal relationships between these pieces of knowledge. Additionally, we define mapping rules to structure the extracted causal knowledge and facilitate the integration of additional knowledge.
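In a BERT–Bi-LSTM–CRF tagger, the CRF layer scores whole tag sequences rather than individual tokens, and decoding picks the highest-scoring sequence with the Viterbi algorithm. The sketch below shows only that decoding step, assuming the per-token emission scores from the BERT–Bi-LSTM encoder are already computed; the BIO tag set here is an illustrative assumption, not the paper's actual label scheme.

```python
import numpy as np

# Hypothetical BIO tag set for entity/state labeling (the paper's exact
# scheme is not given; this is an illustrative assumption).
TAGS = ["O", "B-ENT", "I-ENT", "B-STATE", "I-STATE"]

def viterbi_decode(emissions, transitions):
    """CRF decoding: return the highest-scoring tag sequence.

    emissions   : (seq_len, n_tags) per-token tag scores, e.g. output of
                  a BERT -> Bi-LSTM encoder (assumed precomputed here)
    transitions : (n_tags, n_tags) score of moving from tag i to tag j
    """
    seq_len, n_tags = emissions.shape
    score = emissions[0].copy()                    # best score ending in each tag
    backptr = np.zeros((seq_len, n_tags), dtype=int)
    for t in range(1, seq_len):
        # candidate[i, j] = best score ending in tag i, then moving to tag j
        candidate = score[:, None] + transitions
        backptr[t] = candidate.argmax(axis=0)
        score = candidate.max(axis=0) + emissions[t]
    # Follow back-pointers from the best final tag to recover the path.
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t][best[-1]]))
    return [TAGS[i] for i in reversed(best)]

# Toy example: emissions strongly favor B-ENT, I-ENT, O for three tokens.
emissions = np.zeros((3, len(TAGS)))
emissions[0, 1] = 5.0   # token 0 -> B-ENT
emissions[1, 2] = 5.0   # token 1 -> I-ENT
emissions[2, 0] = 5.0   # token 2 -> O
transitions = np.zeros((len(TAGS), len(TAGS)))
print(viterbi_decode(emissions, transitions))
```

With uniform (zero) transition scores the decoder simply follows the per-token maxima; a trained CRF's transition matrix is what discourages invalid moves such as `O -> I-ENT`.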
To validate the extraction effectiveness of the BERT-Bi-LSTM-CRF model, experiments were conducted on a data set constructed from licensee event reports. The experimental results showed that the model achieved a macro-F1 score of 0.876 on the test set, indicating that the model is capable of effectively extracting the required knowledge and relationships from unstructured text. This method is expected to be applied in the development of operator knowledge bases, potentially reducing the workload involved.
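The macro-F1 score reported above averages per-class F1 with equal weight, so rare tag or relation classes count as much as frequent ones. A minimal sketch of that metric (hypothetical labels, not the paper's data set):

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute F1 per class, then take the unweighted mean."""
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for label in labels:
        tp = sum(t == p == label for t, p in zip(y_true, y_pred))
        pred_n = sum(p == label for p in y_pred)   # predicted positives
        true_n = sum(t == label for t in y_true)   # actual positives
        precision = tp / pred_n if pred_n else 0.0
        recall = tp / true_n if true_n else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        f1s.append(f1)
    return sum(f1s) / len(f1s)

# Illustrative labels only: class A gets F1 = 2/3, class B gets F1 = 0.8,
# so the macro average weights both equally despite class imbalance.
print(macro_f1(["A", "A", "B", "B"], ["A", "B", "B", "B"]))
```

Unlike micro-F1, this average does not let a dominant class (such as the `O` tag in sequence labeling) mask poor performance on minority classes.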