Cryogenic distillation is the only technique capable of handling the hydrogen isotope separation requirements of a fusion power plant. However, the considerable tritium inventory that can accumulate in such an isotope separation system (ISS) raises safety and cost concerns. The ISS must reliably produce specified products while responding to varying input streams. To design an ISS that balances all of these considerations, and to operate it reliably, a computer model of the system is essential. Such a model provides a better understanding of the system and permits the exploration of parameter regions that would otherwise require very expensive experimentation. The value of the model is questionable, however, until it has been validated against actual experiments. Recently, as part of the Annex IV US/Japan collaboration, a series of tests was conducted on the ISS at the Tritium Systems Test Assembly (TSTA) at Los Alamos National Laboratory (LANL). This system has a capacity of 6 SLPM (standard liters per minute), which is relevant to a fusion power plant. The experiments employed light hydrogen (protium), deuterium, and tritium. Measurements were taken at five distinct steady-state operating conditions and included concentrations at the column feed, top, and bottom, as well as at intermediate points. These measurements served as a benchmark for DYNSIM, the model that has been in use at LANL for many years. Given the measured pressure, temperature, reboiler heat, feed composition, and flows, the model accurately predicted the column concentration profile across a set of significantly different operating conditions. These results give confidence that the model is useful for future ISS design and for better understanding the operation of existing systems.
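As an illustration of the kind of concentration-profile calculation such a column model performs, the sketch below steps stage by stage through a distillation column using the classical McCabe-Thiele construction. It is not DYNSIM and is far simpler than a multicomponent hydrogen-isotope simulation: it assumes a binary light/heavy mixture (e.g., H2/D2), a constant relative volatility, constant molar overflow, a saturated-liquid feed, a total condenser, and a partial reboiler. All numerical values (feed, product compositions, reflux ratio, relative volatility) are hypothetical, chosen only to make the example run.

```python
# Minimal stage-by-stage sketch of a binary distillation column profile,
# in the spirit of the concentration-profile calculations described above.
# This is NOT DYNSIM: it is an illustrative McCabe-Thiele model under
# strong simplifying assumptions (binary mixture, constant relative
# volatility, constant molar overflow, saturated-liquid feed, q = 1).
# All numbers below are hypothetical.

def equilibrium_x(y, alpha):
    """Liquid mole fraction in equilibrium with vapor mole fraction y
    (light key), for constant relative volatility alpha."""
    return y / (alpha - (alpha - 1.0) * y)

def stage_profile(z_feed=0.50,   # light-key mole fraction in the feed
                  x_dist=0.98,   # specified distillate composition
                  x_bott=0.02,   # specified bottoms composition
                  reflux=8.0,    # external reflux ratio R = L/D (hypothetical)
                  alpha=1.5,     # assumed relative volatility (illustrative)
                  feed_rate=1.0):
    # Overall material balance fixes the distillate and bottoms flows.
    dist = feed_rate * (z_feed - x_bott) / (x_dist - x_bott)
    bott = feed_rate - dist

    # Constant-molar-overflow internal flows (saturated-liquid feed).
    liq = reflux * dist          # rectifying-section liquid
    vap = liq + dist             # rectifying-section vapor
    liq_s = liq + feed_rate      # stripping-section liquid
    vap_s = vap                  # stripping-section vapor

    profile = []                 # liquid composition on each stage, top down
    y = x_dist                   # total condenser: y1 = x_D
    rectifying = True
    for _ in range(200):         # safety cap on the stage count
        x = equilibrium_x(y, alpha)
        profile.append(x)
        if x <= x_bott:          # reached the reboiler composition
            break
        if rectifying and x <= z_feed:
            rectifying = False   # cross the feed stage; switch operating line
        if rectifying:
            y = (liq / vap) * x + (dist / vap) * x_dist
        else:
            y = (liq_s / vap_s) * x - (bott / vap_s) * x_bott
    return profile

if __name__ == "__main__":
    for n, x in enumerate(stage_profile(), start=1):
        print(f"stage {n:2d}: x_light = {x:.4f}")
```

Even this toy model exhibits the behavior the benchmark tests probe: the computed liquid-composition profile along the column responds to the feed composition, product specifications, and internal flows, so comparing predicted and measured profiles at several operating points exercises the model across its input space.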