It is now common practice in nuclear engineering to base extensive studies on numerical computer models. These studies require running computer codes in potentially thousands of numerical configurations, without individual expert control over the computational and physical aspects of each simulation. In this paper, we compare different statistical metamodeling techniques and show how metamodels can help improve the global behavior of codes in such extensive studies. We consider the metamodeling of the Germinal thermomechanical code by Kriging, kernel regression, and neural networks. Kriging provides the most accurate predictions, while neural networks yield the fastest metamodel functions. All three metamodels can conveniently detect strong computation failures. Detecting code instabilities, that is, groups of computations that are all valid but numerically inconsistent with one another, is more challenging; for this task, we find that Kriging provides a useful tool.
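
To make the Kriging metamodeling idea concrete, the following is a minimal sketch using scikit-learn's Gaussian-process regressor (Kriging under another name). Since the Germinal code itself is not available here, a cheap analytic function stands in for the expensive simulator; the function, design sizes, and kernel choice are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def simulator(x):
    # Hypothetical stand-in for one run of a thermomechanical code:
    # a smooth function of two scalar inputs in [0, 1]^2.
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(40, 2))   # design of experiments
y_train = simulator(X_train)                     # "expensive" code runs

# A Matern kernel is a common default for simulator metamodels.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_train, y_train)

X_test = rng.uniform(0.0, 1.0, size=(200, 2))
y_pred, y_std = gp.predict(X_test, return_std=True)

# The predictive standard deviation y_std is what makes Kriging handy
# for diagnostics: a code output lying many sigmas away from the
# Kriging mean is a candidate computation failure or instability.
```

The key design point is that Kriging returns both a prediction and an uncertainty, so flagging suspect runs reduces to thresholding a standardized residual rather than requiring expert inspection of each simulation.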