Nuclear Science and Engineering / Volume 198 / Number 7 / July 2024 / Pages 1426-1438
Research Article / dx.doi.org/10.1080/00295639.2023.2180987
Traditional one-dimensional system thermal-hydraulic analysis has been widely applied in the nuclear industry for licensing purposes because of its numerical efficiency. However, such tools are inherently limited in modeling multiscale, multidimensional flows in large reactor enclosures. Recent interest in three-dimensional coarse-grid (CG) simulations has shown their potential to improve the predictive capability of system-level analysis. At the same time, CGs cannot accurately resolve and capture turbulent mixing and stratification, and the relatively simple turbulence models implemented in CG solvers exhibit large model-form uncertainties. Therefore, there is strong interest in further advances in CG modeling techniques. In this work, two high-to-low data-driven (DD) methodologies, and their combination, are explored to reduce grid- and model-induced errors using a case study based on the Texas A&M high-temperature gas-cooled reactor upper-plenum facility. The first approach relies on a DD turbulence closure: eddy viscosity predicted by a neural network (NN). A novel training framework is proposed to account for the influence of grid cell size on the closure. The second methodology uses an NN to predict velocity errors and thereby correct low-fidelity results. Both methodologies, and their combination, have shown the potential to improve CG simulation results by leveraging higher-fidelity data.
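The first methodology can be illustrated with a minimal sketch: a small neural network maps local coarse-grid flow features to a positive eddy viscosity, with the grid cell size included as an input so the closure can adapt to grid resolution. The feature set (strain-rate magnitude, turbulent kinetic energy, cell size), layer sizes, and random weights below are illustrative assumptions only, not the architecture or training data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Random weights for a small fully connected network (untrained sketch)."""
    return [(rng.normal(0.0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def predict_eddy_viscosity(params, x):
    """tanh hidden layers; softplus output keeps eddy viscosity positive."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return np.log1p(np.exp(x @ W + b))  # softplus, always > 0

# Hypothetical features per CG cell:
# [strain-rate magnitude (1/s), turbulent kinetic energy (m^2/s^2), cell size (m)]
features = np.array([
    [50.0, 0.30, 0.02],
    [10.0, 0.05, 0.05],
])

params = init_mlp([3, 16, 16, 1], rng)
nu_t = predict_eddy_viscosity(params, features)  # one eddy viscosity per cell
```

In practice, such a network would be trained on high-fidelity (e.g., fine-grid CFD) data filtered to the coarse grid; including the cell size as a feature is what lets a single trained closure be applied across different grid resolutions.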