In this paper, the concept of information divergence, based on Kullback’s information measure, is introduced into reactor noise analysis. Information divergence, as defined by Kullback, is the total average information measuring the separation, or dissimilarity, between two statistical populations. A new form of information divergence is proposed that extends the theory to stochastic processes in general and to the reactor noise process in particular. Using this divergence, the pattern discrimination of reactor noise from a subcritical reactor is studied. The results show that the new information divergence provides a direct quantitative measure of the difference between two noise patterns in cases where such discrimination is not possible by direct comparison of conventional correlation functions. Functions based on both the new information divergence and conventional correlations are proposed for potential applications; they are presented as alternative approaches to the pattern recognition methodologies of reactor noise used in reactor diagnostics.
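
To fix ideas, Kullback's symmetric divergence between two discrete distributions p and q is J(p, q) = KL(p‖q) + KL(q‖p), where KL is the one-sided Kullback-Leibler divergence. The sketch below computes this classical quantity for two illustrative probability vectors; the example distributions are hypothetical stand-ins for two noise patterns and do not come from the paper, whose divergence is defined over stochastic processes rather than simple histograms.

```python
import math

def kl_divergence(p, q):
    """One-sided Kullback-Leibler divergence KL(p || q) in nats.

    Terms with p_i = 0 contribute nothing; q_i is assumed
    nonzero wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Kullback's symmetric divergence J(p, q) = KL(p||q) + KL(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Hypothetical amplitude histograms of two noise records.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(j_divergence(p, q))  # positive when p != q
print(j_divergence(p, p))  # zero for identical patterns
```

J is symmetric and vanishes only when the two distributions coincide, which is what makes it usable as a direct quantitative measure of pattern dissimilarity.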