DOI: 10.1615/TSFP6
NUMERICAL ERRORS IN SCALAR VARIANCE MODELS FOR LARGE EDDY SIMULATION
ABSTRACT
The subfilter scalar variance is an important quantity in conserved scalar models for large eddy simulation of turbulent combustion, where it indicates the degree of small-scale mixing between fuel and oxidizer. Simulation predictions of chemical species concentrations are sensitive to the accuracy of the variance model, including the numerical accuracy with which the model is evaluated. A priori analysis shows that both dynamic algebraic models and transport equation models for the variance incur significant numerical error when evaluated using finite difference methods. Furthermore, the magnitude of this error cannot be reliably characterized by the order of accuracy of the finite difference scheme.
Errors in the filtered scalar field also contribute to the error in the variance estimation. In particular, the variance values predicted by a dynamic model are highly dependent on the evolution of the smallest resolved scales, which in turn are greatly influenced by the numerical treatment of the diffusive term. In a posteriori tests, a simple expansion of the term is found to improve the accuracy of the scalar evolution and, thereby, increase the accuracy of dynamic model predictions of the variance. In contrast to the a priori results, the combined numerical errors of the scalar equation and variance model evaluation cause the magnitude of modeled variance values to increase as the order of accuracy of the finite difference scheme decreases.
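As an illustration of the kind of model evaluation discussed above, the following sketch computes an algebraic subfilter variance of the commonly used form Z_var = C Δ² |∇Z̃|² (a Pierce–Moin-style gradient model) with a second-order central finite difference. The coefficient value, grid, and test field are illustrative assumptions, not values from this paper; the point is only that the modeled variance inherits the discretization error of the gradient.

```python
import numpy as np

def central_gradient(f, dx):
    """Second-order central difference on a periodic 1D grid."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

def algebraic_variance(z_filt, dx, delta, coeff=0.1):
    """Illustrative algebraic model: Z_var = coeff * delta^2 * |dZ/dx|^2.

    `coeff` is a placeholder; in a dynamic model it would be computed
    from test-filtered fields rather than fixed a priori.
    """
    grad = central_gradient(z_filt, dx)
    return coeff * delta**2 * grad**2

n = 64
dx = 2.0 * np.pi / n
x = np.arange(n) * dx
z = 0.5 * (1.0 + np.sin(x))            # smooth "filtered" scalar in [0, 1]
var = algebraic_variance(z, dx, delta=2.0 * dx)
```

Because the gradient here is only second-order accurate, the modeled variance carries an O(dx²) error relative to the exact-gradient value, which is the sensitivity to the finite difference scheme examined in the abstract.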