Nuclear Science and Engineering / Volume 160 / Number 1 / September 2008 / Pages 98-107
Technical Paper / dx.doi.org/10.13182/NSE07-38
To increase power generation efficiency, utilities operating light water reactors have opted for power uprates over the past decades. Upon a power uprate, the power density and coolant flow rate of a nuclear reactor change immediately, followed by water chemistry variations due to enhanced radiolysis of water and shortened coolant residence times. If a boiling water reactor (BWR) has adopted hydrogen water chemistry (HWC) for corrosion mitigation, the optimal hydrogen injection rate may therefore require adjustment. Because measurable water chemistry data are limited, the well-developed computer code DEMACE was used in the current study to investigate the impact of various power levels (ranging from 100 to 120%) on the redox species concentrations and the electrochemical corrosion potential (ECP) behavior of components in the primary coolant circuit of a domestic BWR operating under either normal water chemistry or HWC. Our analyses indicated that the chemical species concentrations and the ECP did not vary monotonically with increasing reactor power level at a fixed feedwater hydrogen concentration. In particular, the upper plenum and upper downcomer regions exhibited distinctly higher ECPs at the 104 and 114% power levels than at the other evaluated power levels. Accordingly, the impact of a power uprate on HWC effectiveness in a BWR is expected to vary from location to location, and ultimately from plant to plant, because of differing degrees of radiolysis and differing physical dimensions.
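For readers unfamiliar with how an ECP value follows from the radiolysis species concentrations, the quantity is commonly obtained from a mixed-potential model: the ECP is the electrode potential at which the anodic and cathodic partial currents of the relevant redox couples (e.g., H2 oxidation versus O2/H2O2 reduction) sum to zero. The sketch below is a toy illustration of that balance only; it is not the DEMACE formulation, and every kinetic parameter (equilibrium potentials, exchange current densities, transfer coefficients) is an illustrative placeholder rather than a value from the paper.

```python
import math

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol*K)
T = 561.0     # approximate BWR coolant temperature, K

def partial_current(E, E_eq, i0, alpha_a, alpha_c):
    """Butler-Volmer partial current density (A/m^2) for one redox couple."""
    eta = E - E_eq  # overpotential relative to the couple's equilibrium potential
    return i0 * (math.exp(alpha_a * F * eta / (R * T))
                 - math.exp(-alpha_c * F * eta / (R * T)))

def ecp(couples, lo=-1.5, hi=1.5, tol=1e-9):
    """Bisect for the potential (V) at which the net current over all
    couples vanishes; valid because the net current is monotonic in E."""
    def net(E):
        return sum(partial_current(E, *c) for c in couples)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if net(lo) * net(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Illustrative couples: (E_eq [V], i0 [A/m^2], alpha_a, alpha_c)
couples = [(-0.6, 1e-2, 0.5, 0.5),   # H2 oxidation (placeholder kinetics)
           ( 0.1, 1e-3, 0.5, 0.5)]   # O2/H2O2 reduction (placeholder kinetics)
E_corr = ecp(couples)
```

Under this picture, shifting the dissolved H2, O2, and H2O2 concentrations (as a power uprate does through radiolysis and residence-time changes) shifts the equilibrium potentials and exchange currents of the couples, and hence the balance point, which is why the ECP need not respond monotonically to power level.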