Commercial inertial fusion energy (IFE) power plants will require fusion target injection rates of 5–20 Hz for utility-scale power production. To mitigate damage from target emission, some designs include a buffer gas in the chamber to reduce heat and particle fluxes to the chamber wall. The evolution of the chamber environment between shots is an important issue, as residual heat and eddies in the gas pose a serious threat to target survival and trajectory during injection.

We have simulated the evolution of a direct-drive IFE chamber with helium, deuterium, and xenon buffer gases at several densities. To evaluate the link between these simulations and the risk posed to a direct-drive target, we modify an analytical expression for the free-molecular heat flux on a surface element to account for possible condensation of chamber gas on the target. We show that this expression compares favorably with Monte Carlo simulations in the same gas regime. These results are used to estimate the risk to target survival for several target-heating failure modes. Although lower-density chamber gas would improve target survival, experimental quantification of several key gas–surface interaction coefficients for cryogenic targets could widen the chamber-gas design window.
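For context, the unmodified starting point is the textbook free-molecular heat flux from a stationary Maxwellian gas onto a surface element. The sketch below is a standard kinetic-theory baseline, not the paper's condensation-corrected expression: it assumes a monatomic gas at number density \(n\) and temperature \(T_g\), a target surface at temperature \(T_s\), and an energy accommodation coefficient \(\alpha\) (all symbols introduced here for illustration).

```latex
% Effusive particle flux onto the surface (Maxwellian gas at rest):
%   \Gamma = n \bar{v} / 4, with mean speed \bar{v} = \sqrt{8 k T_g / (\pi m)}.
% Each incident particle carries mean energy 2 k T_g; with accommodation
% coefficient \alpha, particles are re-emitted carrying energy 2 k T_s:
\[
  q \;=\; \alpha\,\Gamma\, 2k\,(T_g - T_s),
  \qquad
  \Gamma \;=\; \frac{n\bar{v}}{4},
  \qquad
  \bar{v} \;=\; \sqrt{\frac{8 k T_g}{\pi m}} .
\]
```

A condensation correction of the kind described in the abstract would, among other things, alter the energy balance of the re-emitted population (condensed particles deposit their latent heat and are not re-emitted), which is why the sticking and accommodation coefficients flagged in the abstract matter for cryogenic targets.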