Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations

We construct a new framework for accelerating Markov chain Monte Carlo in posterior sampling problems where standard methods are limited by the computational cost of the likelihood, or of numerical models embedded therein. Our approach introduces local approximations of these models into the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. Previous efforts at integrating approximate models into inference typically sacrifice either the sampler's exactness or efficiency; our work seeks to address these limitations by exploiting useful convergence characteristics of local approximations. We prove the ergodicity of our approximate Markov chain, showing that it samples asymptotically from the exact posterior distribution of interest. We describe variations of the algorithm that employ either local polynomial approximations or local Gaussian process regressors. Our theoretical results reinforce the key observation underlying this paper: when the likelihood has some local regularity, the number of model evaluations per MCMC step can be greatly reduced without biasing the Monte Carlo average. Numerical experiments demonstrate multiple order-of-magnitude reductions in the number of forward model evaluations used in representative ODE and PDE inference problems, with both synthetic and real data.


Bibliographic Details
Main Authors: Conrad, Patrick R. (Author), Marzouk, Youssef M. (Author), Pillai, Natesh S. (Author), Smith, Aaron (Author)
Other Authors: Massachusetts Institute of Technology. Department of Aeronautics and Astronautics (Contributor)
Format: Article
Language: English
Published: American Statistical Association, 2015-11-20T12:41:58Z.
Subjects:
Online Access: Get fulltext
LEADER 02426 am a22002413u 4500
001 99937
042 |a dc 
100 1 0 |a Conrad, Patrick R.  |e author 
100 1 0 |a Massachusetts Institute of Technology. Department of Aeronautics and Astronautics  |e contributor 
100 1 0 |a Marzouk, Youssef M.  |e contributor 
100 1 0 |a Conrad, Patrick R.  |e contributor 
700 1 0 |a Marzouk, Youssef M.  |e author 
700 1 0 |a Pillai, Natesh S.  |e author 
700 1 0 |a Smith, Aaron  |e author 
245 0 0 |a Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations 
260 |b American Statistical Association,   |c 2015-11-20T12:41:58Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/99937 
520 |a We construct a new framework for accelerating Markov chain Monte Carlo in posterior sampling problems where standard methods are limited by the computational cost of the likelihood, or of numerical models embedded therein. Our approach introduces local approximations of these models into the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. Previous efforts at integrating approximate models into inference typically sacrifice either the sampler's exactness or efficiency; our work seeks to address these limitations by exploiting useful convergence characteristics of local approximations. We prove the ergodicity of our approximate Markov chain, showing that it samples asymptotically from the exact posterior distribution of interest. We describe variations of the algorithm that employ either local polynomial approximations or local Gaussian process regressors. Our theoretical results reinforce the key observation underlying this paper: when the likelihood has some local regularity, the number of model evaluations per MCMC step can be greatly reduced without biasing the Monte Carlo average. Numerical experiments demonstrate multiple order-of-magnitude reductions in the number of forward model evaluations used in representative ODE and PDE inference problems, with both synthetic and real data. 
520 |a United States. Dept. of Energy. Office of Advanced Scientific Computing Research. Scientific Discovery through Advanced Computing Program (Award DE-SC0007099) 
546 |a en_US 
655 7 |a Article 
773 |t Journal of the American Statistical Association
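To make the idea in the abstract (field 520 above) concrete, the following is a minimal, hypothetical Python sketch of a Metropolis-Hastings sampler that queries a local polynomial surrogate of the log-likelihood instead of running the expensive model at every step, occasionally refining the surrogate with exact evaluations. It illustrates the general technique only, not the authors' algorithm: the paper combines error-driven and random refinement, while this sketch uses only a fixed random-refinement probability; all names (expensive_log_likelihood, LocalQuadraticSurrogate, approximate_mh) and parameter values are invented for the example, and a flat prior is assumed so the target is just the likelihood.

```python
import numpy as np

# Hypothetical stand-in for an expensive forward model wrapped in a
# log-likelihood; cheap here so the sketch runs end to end.
def expensive_log_likelihood(theta):
    return -0.5 * np.sum((theta - 1.0) ** 2)

class LocalQuadraticSurrogate:
    """Caches exact evaluations and fits a local (diagonal) quadratic
    approximation of the log-likelihood around each query point."""

    def __init__(self, log_like, n_neighbors=10):
        self.log_like = log_like
        self.n_neighbors = n_neighbors
        self.points = []   # evaluated parameter values
        self.values = []   # corresponding exact log-likelihoods

    def refine(self, theta):
        # Run the true (expensive) model and cache the result.
        val = self.log_like(theta)
        self.points.append(np.asarray(theta, dtype=float))
        self.values.append(val)
        return val

    def __call__(self, theta):
        # Too few cached samples: fall back to an exact evaluation.
        if len(self.points) < self.n_neighbors:
            return self.refine(theta)
        pts = np.array(self.points)
        vals = np.array(self.values)
        # Select the nearest cached evaluations to the query point.
        idx = np.argsort(np.linalg.norm(pts - theta, axis=1))[: self.n_neighbors]
        X, y = pts[idx], vals[idx]
        # Least-squares fit of a local quadratic centered at theta; the
        # constant term is the approximation at the query point itself.
        design = np.hstack([np.ones((len(X), 1)), X - theta, (X - theta) ** 2])
        coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
        return coeffs[0]

def approximate_mh(surrogate, theta0, n_steps=2000, step=0.5,
                   refine_prob=0.05, rng=None):
    """Random-walk Metropolis using the local surrogate, with occasional
    random refinement so the approximation keeps improving along the chain."""
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.shape)
        # Occasionally pay for an exact model run at the proposal.
        if rng.random() < refine_prob:
            surrogate.refine(prop)
        log_alpha = surrogate(prop) - surrogate(theta)
        if np.log(rng.random()) < log_alpha:
            theta = prop
        chain.append(theta.copy())
    return np.array(chain)

if __name__ == "__main__":
    surr = LocalQuadraticSurrogate(expensive_log_likelihood)
    samples = approximate_mh(surr, theta0=np.zeros(2), rng=0)
    print("posterior mean estimate:", samples.mean(axis=0))
    print("exact model evaluations:", len(surr.points))
```

In this sketch the expensive model is run only inside refine, so the count of exact evaluations printed at the end is typically far smaller than the number of MCMC steps; that gap is the source of the reduction in forward model evaluations that the abstract describes.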