
Primary calibration of two gaseous radiation detectors is conducted in a 14-inch pipe. (Photo: RSCS)

Transfer calibration sources show wear of the gelatin and cesium salt surfaces from improper handling. (Photo: RSCS)
Events related to radiation monitoring systems continue to be observed across the industry in which plant operators have inappropriately extended calibration frequencies for particular detector types without accounting for the degradation and failure mechanisms associated with those detectors. For some radiation detectors, such as Geiger-Müller tubes, the calibration frequency could easily be extended to 10 years or even run-to-failure, which is acceptable because of those detectors’ failure mechanisms. For others, such as scintillation and solid-state designs, extended calibration frequencies and improper detector maintenance have unknowingly and adversely affected monitor operability on several occasions. Unfortunately, these are not isolated instances; improper discriminator and gain adjustments have led to degraded or inoperable monitors repeatedly over the past 30 years.

Other issues involve the traceability of transfer calibration sources to the primary calibration and the problem of transfer source decay.

A common contributing factor to the improper calibration and maintenance of radiation monitors is a loss of expertise in the principles of radiation detection. Too often, the nuclear industry delegates calibration and maintenance of monitors solely to maintenance workers without the benefit of oversight by a radiation detection expert. The past 40 years have brought a significant talent drain: the pool of radiation detection experts has dwindled, and industry-wide experience in the proper maintenance, calibration, and application of radiation monitoring systems has been challenged along with it.
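To illustrate the transfer source decay issue noted above, the minimal sketch below decay-corrects the certified activity of a Cs-137 transfer source. The half-life is a published physical constant, but the source activity, elapsed time, and function names are hypothetical and shown only for illustration; an actual calibration would follow the plant’s approved procedure and calibration basis.

```python
# Illustrative sketch only: decay correction of a Cs-137 transfer calibration source.
import math

CS137_HALF_LIFE_YEARS = 30.08  # published Cs-137 half-life (approximate)

def decay_corrected_activity(initial_activity_uci: float, years_elapsed: float,
                             half_life_years: float = CS137_HALF_LIFE_YEARS) -> float:
    """Return A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return initial_activity_uci * math.exp(-math.log(2) * years_elapsed / half_life_years)

if __name__ == "__main__":
    # Hypothetical example: a 100 uCi source certified 12 years ago has decayed
    # noticeably; using the as-certified activity would bias the transfer calibration.
    a0 = 100.0   # certified activity, microcuries (illustrative)
    t = 12.0     # years since certification (illustrative)
    a_now = decay_corrected_activity(a0, t)
    print(f"Decay-corrected activity: {a_now:.1f} uCi ({a_now / a0:.1%} of certified value)")
```

In this illustrative case the source retains only about three-quarters of its certified activity, which is why transfer source response must remain traceable to the primary calibration with decay accounted for.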
Technology obsolescence is also a significant problem: several suppliers of installed radiation monitors no longer exist. Considering the age of some existing radiation monitors (from the 1970s to the 1990s), wholesale monitor replacement is often seen as the only way to ensure monitor reliability. The cost of complete replacement is largely driven by the difficulty of positioning a new monitor near, or in place of, the old one, especially when replacement was never considered in the plant design. Logistical and space limitations can result in tremendous costs. However, full-scale monitor replacement may not always be required; simpler electronic upgrades, and perhaps detector and pump substitutions, could easily extend a monitor’s lifetime.
Nuclear power plants should undertake independent assessments, perhaps with the aid of external experts, of the health of their radiation monitoring systems. These assessments should examine calibration bases and the traceability of the transfer source response to the primary calibration; calibration frequency and methodology, especially gain and discriminator adjustments; system failure history; the availability of substitute parts to address obsolescence; and the monitor’s performance capability. Improvements could range from procedure modifications with protected references, through electronic cabinet, pump, and detector retrofits, up to full monitor replacement. Training should be provided to maintenance personnel performing radiation monitor work so that they are familiar with radiation detection principles and the actual effects of calibration adjustments. Finally, because the Maintenance Rule prioritizes maintenance activities and most radiation monitoring systems fall outside its scope, the importance of these systems tends to be understated.
Billy Cox is the manager of technical services at Radiation Safety and Control Services. Eric Darois is one of the principals and executive directors at RSCS.