Begell House
International Journal for Uncertainty Quantification
2152-5080
5
6
2015
AN ERROR SUBSPACE PERSPECTIVE ON DATA ASSIMILATION
Two families of methods are widely used in data assimilation: the four-dimensional variational (4D-Var) approach and the ensemble Kalman filter (EnKF) approach. The two families have been developed largely through parallel research efforts, and each has its advantages and disadvantages. It is therefore of interest to develop hybrid data assimilation algorithms that combine the relative strengths of the two approaches. This paper proposes a subspace approach to investigate the theoretical equivalence between the suboptimal 4D-Var method (where only a small number of optimization iterations are performed) and the practical EnKF method (where only a small number of ensemble members are used) in a linear setting. The analysis motivates a new hybrid algorithm: the optimization directions obtained from a short-window 4D-Var run are used to construct the EnKF initial ensemble. The proposed hybrid method is computationally less expensive than a full 4D-Var, as only short assimilation windows are considered, and it has the potential to perform better than the regular EnKF due to its look-ahead property. Numerical results show that the proposed hybrid ensemble filter method outperforms the regular EnKF method on the test problem considered.
Adrian
Sandu
Computational Science Laboratory, Department of Computer Science, Virginia Polytechnic Institute and State University, 2201 Knowledgeworks II, 2202 Kraft Drive, Blacksburg, Virginia 24060, USA
Haiyan
Cheng
Department of Computer Science, Willamette University, 900 State Street, Salem, Oregon 97301, USA
491-510
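The analysis step of a stochastic EnKF, on which hybrid methods like the one above build, can be sketched generically. The following is a textbook perturbed-observations formulation with an illustrative linear observation operator; all names, dimensions, and numbers are hypothetical and this is not the paper's specific hybrid algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(X, y, H, R):
    """One stochastic EnKF analysis step (perturbed observations).

    X : (n, m) ensemble of m state vectors of dimension n
    y : (p,)   observation vector
    H : (p, n) linear observation operator
    R : (p, p) observation-error covariance
    """
    n, m = X.shape
    xm = X.mean(axis=1, keepdims=True)
    A = X - xm                                 # ensemble anomalies
    Pf = A @ A.T / (m - 1)                     # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
    # perturb observations so the analysis spread is statistically consistent
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return X + K @ (Y - H @ X)

# tiny illustration: 2-state system, observe only the first component
X = rng.normal(size=(2, 20)) + np.array([[1.0], [0.0]])
Xa = enkf_analysis(X, np.array([2.0]), np.array([[1.0, 0.0]]), np.array([[0.1]]))
```

The analysis ensemble mean is pulled from the forecast mean toward the observation, by an amount set by the ratio of forecast to observation uncertainty.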
BAYESIAN APPROACH TO THE STATISTICAL INVERSE PROBLEM OF SCATTEROMETRY: COMPARISON OF THREE SURROGATE MODELS
Scatterometry provides a fast, indirect optical method for determining the grating geometry parameters of photomasks and is used in mask metrology. To obtain a desired parameter, inverse methods such as least squares or maximum likelihood are frequently used. A different method, the Bayesian approach, has many advantages over the others, but it is often not used for scatterometry because of its large computational cost. In this paper, we introduce different surrogate models that approximate computationally expensive calculations by fast function evaluations, which makes the Bayesian approach to scatterometry feasible. We introduce nearest-neighbor interpolation, the response surface methodology, and a method based on a polynomial chaos expansion. For every surrogate model, we discuss the approximation error and the convergence. Moreover, we apply Markov chain Monte Carlo sampling to determine the desired geometry parameters and their uncertainties from simulated measurement values, based on Bayesian inference. We show that the surrogate model involving polynomial chaos is the most effective.
Sebastian
Heidenreich
Physikalisch-Technische Bundesanstalt, Abbestr 2-12, 10587 Berlin, Germany
Hermann
Gross
Physikalisch-Technische Bundesanstalt, Abbestr 2-12, 10587 Berlin, Germany
Markus
Bär
Physikalisch-Technische Bundesanstalt, Abbestr 2-12, 10587 Berlin, Germany
511-526
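The surrogate idea is generic enough to sketch: fit a cheap approximation to the expensive forward model once, then run Metropolis sampling against the surrogate only. The following is a minimal one-parameter illustration with an invented forward model (an exponential), a plain polynomial fit standing in for the paper's surrogates, and a uniform prior; it does not reproduce the scatterometry setup or any of the three surrogate models compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "expensive" forward model, replaced by a cheap polynomial
# surrogate fitted once from a handful of exact evaluations.
f_exact = lambda t: np.exp(t)
train_t = np.linspace(-1.0, 1.0, 9)
coeffs = np.polyfit(train_t, f_exact(train_t), deg=6)
f_surr = lambda t: np.polyval(coeffs, t)       # fast surrogate evaluation

# Bayesian inversion: recover theta from a measurement via random-walk
# Metropolis, calling only the surrogate inside the likelihood.
theta_true, sigma = 0.4, 0.05
y_obs = f_exact(theta_true)

def log_post(t):
    if abs(t) > 1.0:                           # uniform prior on [-1, 1]
        return -np.inf
    return -0.5 * ((y_obs - f_surr(t)) / sigma) ** 2

chain, t = [], 0.0
lp = log_post(t)
for _ in range(5000):
    prop = t + 0.1 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
        t, lp = prop, lp_prop
    chain.append(t)
post_mean = np.mean(chain[1000:])              # discard burn-in
```

The posterior mean concentrates near the true parameter; the entire chain costs only polynomial evaluations, which is the point of the surrogate approach.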
A BLOCK CIRCULANT EMBEDDING METHOD FOR SIMULATION OF STATIONARY GAUSSIAN RANDOM FIELDS ON BLOCK-REGULAR GRIDS
We propose a new method for sampling from a stationary Gaussian random field on a grid that is not regular but has a regular block structure, which is often the case in applications. The introduced block circulant embedding method (BCEM) can outperform the classical circulant embedding method (CEM), which requires a regularization of the irregular grid before it can be applied. BCEM and CEM are compared on typical model problems.
Min Ho
Park
School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, United Kingdom
M.V.
Tretyakov
School of Mathematical Sciences, University of Nottingham, Nottingham, NG7 2RD, United Kingdom
527-544
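For context, the classical CEM that BCEM generalizes can be sketched in one dimension: the covariance on a regular grid is embedded in a circulant matrix, whose eigenvalues come from a single FFT, so that exact samples cost O(m log m). This is the standard textbook construction, not the block method of the paper; the exponential covariance and grid size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def cem_sample_1d(cov, n):
    """Draw one sample of a stationary Gaussian field at n regular grid
    points by embedding its covariance in an m x m circulant matrix."""
    m = 2 * (n - 1)                                   # minimal even embedding
    lag = np.minimum(np.arange(m), m - np.arange(m))  # circulant distances
    lam = np.fft.fft(cov(lag)).real                   # circulant eigenvalues
    if lam.min() < -1e-10 * lam.max():
        raise ValueError("embedding not PSD; enlarge the embedding size")
    lam = np.clip(lam, 0.0, None)
    z = rng.normal(size=m) + 1j * rng.normal(size=m)  # complex white noise
    y = np.fft.fft(np.sqrt(lam) * z) / np.sqrt(m)
    return y.real[:n]                                 # real part ~ N(0, C)

# exponential covariance with correlation length 4 (illustrative choice)
cov = lambda h: np.exp(-np.abs(h) / 4.0)
sample = cem_sample_1d(cov, 33)
```

For covariances whose minimal embedding is not positive semidefinite, the standard remedy is to enlarge m until all eigenvalues are nonnegative; the irregular grids targeted by BCEM are exactly where this classical construction stops applying directly.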
OPTIMAL SENSOR PLACEMENT FOR THE ESTIMATION OF TURBULENCE MODEL PARAMETERS IN CFD
The optimal placement of sensors for the estimation of turbulence model parameters in computational fluid dynamics is presented. The information entropy (IE) of the posterior uncertainty of the model parameters, inferred from Bayesian analysis, is used as a scalar measure of uncertainty. Under an asymptotic approximation, the IE depends only on nominal values of the CFD model and prediction-error model parameters, and is computed from the sensitivities of the flow quantities predicted by the flow model with respect to the model parameters. A stochastic optimization algorithm is used to minimize the IE over the continuous design space. Robustness to uncertainties in the nominal model parameters and flow conditions is addressed. Information redundancy due to sensor clustering is handled by introducing spatially correlated prediction-error models. The algorithm is applied to the turbulent flow through a backward-facing step, where the optimal locations of velocity and Reynolds shear stress sensor profiles are sought for the estimation of the parameters of the Spalart-Allmaras turbulence model.
Dimitrios I.
Papadimitriou
Department of Mechanical Engineering, University of Thessaly, Pedion Areos 38334, Volos, Greece
Costas
Papadimitriou
Department of Mechanical Engineering, University of Thessaly, Pedion Areos 38334, Volos, Greece
545-568
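The entropy criterion has a simple Gaussian intuition: under the asymptotic approximation, minimizing the IE amounts to maximizing the log-determinant of the Fisher information assembled from the sensitivities. The greedy selection below is a generic D-optimal-design sketch over a discrete candidate set, with i.i.d. prediction errors and made-up sensitivities; the paper instead uses stochastic optimization over a continuous design space and spatially correlated error models.

```python
import numpy as np

def greedy_sensor_placement(S, sigma2, k, prior_prec=1e-6):
    """Greedily pick k sensor locations (rows of S) that maximize
    log det of the Fisher information, i.e. minimize the information
    entropy of the Gaussian posterior over the parameters.

    S      : (n_loc, n_par) sensitivities d(prediction)/d(parameter)
    sigma2 : prediction-error variance (i.i.d. assumption here)
    """
    n_loc, n_par = S.shape
    chosen = []
    F = prior_prec * np.eye(n_par)             # weak prior keeps F invertible
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in range(n_loc):
            if i in chosen:
                continue
            Fi = F + np.outer(S[i], S[i]) / sigma2  # rank-1 information update
            gain = np.linalg.slogdet(Fi)[1]
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
        F = F + np.outer(S[best], S[best]) / sigma2
    return chosen
```

With sensitivities that are informative about different parameters, the greedy rule naturally avoids the information redundancy of clustered sensors: a location nearly collinear with an already chosen one adds little to the determinant.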
ROBUSTNESS OF WILKS' CONSERVATIVE ESTIMATE OF CONFIDENCE INTERVALS
The striking generality and simplicity of Wilks' method have made it popular for quantifying modeling uncertainty. A conservative estimate of the confidence interval is obtained from a very limited set of randomly drawn model sample values, with probability set by the assigned so-called stability. In contrast, the reproducibility of the estimated limits, or robustness, is beyond our control, as it depends strongly on the probability distribution of the model results. The combination of random sampling and faithful estimation inherent in Wilks' approach is here shown to often result in poor robustness. The estimated confidence interval is consequently not a well-defined measure of modeling uncertainty. To remedy this deficiency, adjustments of Wilks' approach, as well as novel, effective but less well-known alternative approaches based on deterministic sampling, are suggested. For illustration, the robustness of Wilks' estimate is compared for uniform and normal model distributions.
Jan Peter
Hessling
SP Technical Research Institute of Sweden, Measurement Technology, Box 857, SE-50115 Boras, Sweden
Jeffrey
Uhlmann
University of Missouri−Columbia, Department of Computer Science, 201 EBW, Columbia, Missouri 65211, USA
569-583
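For reference, the sample-size side of Wilks' first-order, one-sided formula is easy to state: with n random draws from any model distribution, the largest value exceeds the γ-quantile with probability 1 − γⁿ, so the required n for a prescribed stability β follows directly. The formula itself is the standard distribution-free result; the helper name below is ours, and higher-order and two-sided variants need larger n.

```python
import math

def wilks_n(gamma, beta):
    """Smallest n such that the maximum of n i.i.d. draws bounds the
    gamma-quantile with probability (stability) at least beta,
    i.e. the smallest n with 1 - gamma**n >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

# the classical 95%/95% case requires 59 samples
n95 = wilks_n(0.95, 0.95)
```

Note that the formula fixes only the coverage probability; as the abstract stresses, the spread of the resulting limit across repeated experiments (its robustness) still depends entirely on the model distribution.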
TABLE OF CONTENTS FOR VOLUME 5
585-589