Certain reactor transients cause a reduction in moderator temperature and, hence, an increase in coolant density; the denser coolant attenuates more of the neutrons traveling from the core to the excore detectors, reducing detector response. This decreased response is of concern because credit is taken for a detector-initiated reactor trip to terminate the transient. Explicit modeling of this phenomenon is difficult: the neutrons must be transported through dense, optically thick absorbing media, yet the detector response characteristics must be known precisely in order to account for the effect. The approach adopted in this study was to use Monte Carlo techniques coupled with robust variance reduction to accelerate problem convergence; a simple illustrative sketch follows. A fresh discussion of the motivation for variance reduction is included, followed by separate accounts of manual and automated applications of variance-reduction techniques. Finally, the results of the manual and automated approaches are presented and compared.
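To illustrate why variance reduction is essential for deep-penetration problems of this kind, the sketch below compares analog Monte Carlo with geometry splitting and Russian roulette for a toy one-dimensional "rod model" slab. This is not the model or data used in the study; the cross sections, slab thickness, cell count, and factor-of-two splitting scheme are all assumptions chosen only to show how an analog tally of a rare transmission event carries a large relative error, while splitting keeps the particle population roughly flat across the shield and sharpens the same estimate.

```python
import math
import random

# Toy 1-D rod-model transport through a thick scattering/absorbing slab.
# All parameters are illustrative assumptions, not values from the study.
SIGMA_T = 1.0     # total macroscopic cross section (1/cm)
C = 0.5           # scattering probability per collision (Sigma_s / Sigma_t)
L = 12.0          # slab thickness: 12 mean free paths -> rare transmission
N_HIST = 20_000   # number of source histories

def transmission(rng, use_splitting, n_cells=12):
    """Estimate the leakage probability out the far face (x = L).

    With use_splitting=True the slab is divided into n_cells equal importance
    regions: a particle crossing a cell boundary toward the detector is split
    in two (half weight each); one crossing backward plays Russian roulette.
    """
    dx = L / n_cells
    mean = sq = 0.0
    for _ in range(N_HIST):
        bank = [(0.0, +1, 1.0)]          # (position, direction, weight)
        score = 0.0
        while bank:
            x, mu, w = bank.pop()
            while True:
                s = -math.log(rng.random()) / SIGMA_T   # free-flight length
                if use_splitting:
                    # Distance to the next cell boundary along the flight.
                    k = int(x / dx)
                    b = (k + 1) * dx if mu > 0 else k * dx
                    if b == x:                    # sitting exactly on a boundary
                        b += mu * dx
                    if s > abs(b - x):
                        x = b                     # stop at boundary, resample flight
                        if x >= L:                # leaked out the far face: tally
                            score += w
                            break
                        if x <= 0.0:              # leaked out the near face: lost
                            break
                        if mu > 0:                # toward detector: 2-for-1 split
                            w *= 0.5
                            bank.append((x, mu, w))
                        else:                     # away from detector: roulette
                            if rng.random() < 0.5:
                                break
                            w *= 2.0
                        continue
                x += mu * s
                if x >= L:                        # transmission: tally weight
                    score += w
                    break
                if x <= 0.0:                      # backscattered out: lost
                    break
                # Collision: absorbed with probability 1 - C, otherwise
                # scattered isotropically in the rod model (direction +1 or -1).
                if rng.random() > C:
                    break
                mu = +1 if rng.random() < 0.5 else -1
        mean += score
        sq += score * score
    mean /= N_HIST
    var_of_mean = (sq / N_HIST - mean * mean) / N_HIST
    return mean, math.sqrt(max(var_of_mean, 0.0))

rng = random.Random(12345)
for label, flag in (("analog", False), ("with splitting", True)):
    est, err = transmission(rng, flag)
    print(f"{label:15s}: leakage = {est:.3e} +/- {err:.1e}")
```

The factor-of-two split per mean-free-path-wide cell is chosen so that it roughly offsets the attenuation through the slab, keeping the particle population nearly constant from source to detector; that balance between splitting and attenuation is the usual goal of importance-based splitting and of the weight-window schemes discussed later, whether set by hand or generated automatically.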