Burnup measurement is an important step in material control and accountancy at nuclear reactors and may be performed by examining gamma spectra of fuel samples. Traditional approaches rely on known correlations between burnup and specific photopeaks (e.g., Cs) and use standard linear regression. However, the accuracy of these regression methods is limited even in the best case and degrades significantly at short fuel cooldown times, owing to the elevated radiation background from short-lived isotopes and to self-shielding effects in the fuel. For practical operation of pebble bed reactors (PBRs), quick measurements (minutes) and short cooling times (hours) are required from a safety and security perspective. We investigated the efficacy and performance of machine learning (ML) methods that predict pebble fuel burnup from full gamma spectra (rather than from specific discrete photopeaks) and found that a full-spectrum ML approach far outperforms baseline regression predictions under all measurement and cooling conditions, including operational-like measurement conditions. We also performed model and data ablation experiments to determine the relative performance impact of our ML methods' capacity to model data nonlinearities and of the additional information inherent in full spectra. Applying our ML methods, we found several surprising results, including improved accuracy at shorter fuel cooling times (the opposite of the usual trend), remarkable robustness to spectrum compression (via rebinning), and competitive burnup predictions even when using the background signal only (i.e., explicitly omitting known isotope photopeaks).
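To make the rebinning operation mentioned above concrete, the sketch below compresses a multichannel gamma spectrum by summing groups of adjacent channels. This is a generic illustration, not the authors' pipeline; the function name, channel count, and compression factor are assumptions for the example.

```python
import numpy as np

def rebin_spectrum(counts: np.ndarray, factor: int) -> np.ndarray:
    """Compress a gamma spectrum by summing groups of `factor` adjacent channels.

    Trailing channels that do not fill a complete group are dropped. Summing
    (rather than averaging) preserves total counts, which keeps the rebinned
    channels consistent with Poisson counting statistics.
    """
    n = (len(counts) // factor) * factor          # trim to a whole number of groups
    return counts[:n].reshape(-1, factor).sum(axis=1)

# Example: a simulated 4096-channel spectrum compressed 16x to 256 channels
rng = np.random.default_rng(0)
spectrum = rng.poisson(lam=50, size=4096)
compressed = rebin_spectrum(spectrum, 16)
print(compressed.shape)  # (256,)
```

A full-spectrum ML model would then take the (possibly rebinned) channel vector as its input features, in contrast to a baseline regression fit to one or a few photopeak areas.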