International Journal for Uncertainty Quantification

Impact factor: 1.000

ISSN Print: 2152-5080
ISSN Online: 2152-5099

Open Access


DOI: 10.1615/Int.J.UncertaintyQuantification.2012004713
pages 445-474

ASYMPTOTICALLY INDEPENDENT MARKOV SAMPLING: A NEW MARKOV CHAIN MONTE CARLO SCHEME FOR BAYESIAN INFERENCE

James L. Beck
Division of Engineering and Applied Science, California Institute of Technology, Pasadena, California 91125, USA
Konstantin M. Zuev
Department of Computing and Mathematical Sciences, Division of Engineering and Applied Science, 1200 E California Blvd., California Institute of Technology, Pasadena, California 91125, USA

ABSTRACT

In Bayesian inference, many problems can be expressed as the evaluation of the expectation of an uncertain quantity of interest with respect to the posterior distribution based on relevant data. The standard Monte Carlo method is often not applicable because the encountered posterior distributions cannot be sampled directly. In such cases, the most popular strategies are importance sampling, Markov chain Monte Carlo, and annealing. In this paper, we introduce a new scheme for Bayesian inference, called asymptotically independent Markov sampling (AIMS), which combines these methods. We derive important ergodic properties of AIMS; in particular, it is shown that, under certain conditions, the AIMS algorithm produces a uniformly ergodic Markov chain. The choice of the free parameters of the algorithm is discussed, and both theoretically and heuristically based recommendations are provided for this choice. The efficiency of AIMS is demonstrated with three numerical examples, which include both multimodal and higher-dimensional target posterior distributions.
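To illustrate the general idea of combining annealing with Markov chain Monte Carlo that the abstract refers to, the sketch below runs a plain Metropolis sampler through a sequence of tempered distributions π_j(x) ∝ prior(x) · likelihood(x)^β_j with β rising from 0 (the prior) to 1 (the posterior). This is a generic tempered-Metropolis toy, not the AIMS algorithm itself; the model (standard normal prior, Gaussian likelihood centered at 2) and all parameter values are hypothetical choices for demonstration only.

```python
import numpy as np

# Toy one-dimensional Bayesian model (hypothetical, for illustration only):
# prior N(0, 1), likelihood N(2, 1)  =>  posterior N(1, 1/2).
rng = np.random.default_rng(0)

def log_prior(x):
    return -0.5 * x**2

def log_likelihood(x):
    return -0.5 * (x - 2.0)**2

def tempered_metropolis(n_levels=5, n_steps=200, step=1.0):
    """Anneal from the prior (beta = 0) to the posterior (beta = 1),
    running Metropolis moves on each intermediate distribution."""
    betas = np.linspace(0.0, 1.0, n_levels + 1)
    x = rng.standard_normal()  # exact draw from the prior (beta = 0)
    for beta in betas[1:]:
        log_target = lambda z: log_prior(z) + beta * log_likelihood(z)
        for _ in range(n_steps):
            prop = x + step * rng.standard_normal()
            # Standard Metropolis accept/reject on the tempered target
            if np.log(rng.uniform()) < log_target(prop) - log_target(x):
                x = prop
    return x

# Independent annealing runs give (approximately) posterior samples
samples = np.array([tempered_metropolis() for _ in range(200)])
```

The sample mean and standard deviation should approximate the exact posterior values 1 and √(1/2) ≈ 0.71; AIMS refines this generic annealing scheme by reusing the samples at each level to build asymptotically exact proposals for the next.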