A controlled-calorimetric in-core instrument that directly measures nuclear energy deposition has been developed and tested. The instrument heats an element of reactor fuel to a constant temperature with an electric heater, so that at the regulated setpoint the electrical input power decreases one-for-one as the deposited nuclear power increases. Tests on first-generation sensor prototypes and subsequent modeling revealed three problems: a lack of proportionality between the neutron and gamma (photon) responses, relatively low bandwidth, and drift. A model of the sensor has been developed and used to optimize the design of second-generation prototypes with respect to these three problems. Study of the predicted relative neutron and gamma response showed that a nonuniform distribution of nuclear and electrical energy deposition causes the temperature distribution within the sensor to change as the ratio of the two energy components varies; this degrades the proportionality of the sensor's power response and increases its response time. Heat transfer through the sensor power leads was shown to cause most of the observed drift. The proposed second-generation design forces almost all of the temperature gradient into a thin axial metal region, which yields a uniform energy distribution from all sources and better control of thermal leakage and contact resistances. The model predicts increased bandwidth and improved proportionality for this design.
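As a sketch of the compensation principle (the symbols here are illustrative and not taken from the paper): at the regulated temperature the total heating power balances the heat lost to the coolant, so

    P_{\mathrm{elec}} + P_{\mathrm{nuc}} = P_{\mathrm{loss}} \approx \mathrm{const.}, \qquad P_{\mathrm{nuc}} = P_{\mathrm{loss}} - P_{\mathrm{elec}},

and the deposited nuclear power is read off directly from the drop in electrical input power. Any uncontrolled change in P_{\mathrm{loss}}, such as heat conduction through the power leads, appears as drift in the inferred P_{\mathrm{nuc}}.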
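A minimal lumped-parameter reading of the bandwidth claim, assuming a single sensor body of heat capacity C coupled to its heat sink through thermal conductance G (both illustrative quantities, not values from the paper): the thermal response is then first-order, with

    \tau = C/G, \qquad f_c = \frac{1}{2\pi\tau} = \frac{G}{2\pi C},

so confining the temperature gradient to a thin, low-heat-capacity metal region reduces C and raises the effective G, increasing the corner frequency f_c in line with the predicted bandwidth improvement.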