Stochastic gradient Markov chain Monte Carlo

Nemeth, Christopher and Fearnhead, Paul (2020) Stochastic gradient Markov chain Monte Carlo. Journal of the American Statistical Association. ISSN 0162-1459 (In Press)

Text (Accepted Version): sgmcmc_unblinded.pdf (4MB)
Restricted to Repository staff only until 1 January 2050.
Available under License Creative Commons Attribution-NonCommercial.

Abstract

Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold standard technique for Bayesian inference. They are theoretically well understood and conceptually simple to apply in practice. The drawback of MCMC is that performing exact inference generally requires all of the data to be processed at each iteration of the algorithm. For large data sets, the computational cost of MCMC can be prohibitive, which has led to recent developments in scalable Monte Carlo algorithms with a significantly lower computational cost than standard MCMC. In this paper, we focus on a particular class of scalable Monte Carlo algorithms, stochastic gradient Markov chain Monte Carlo (SGMCMC), which uses data subsampling techniques to reduce the per-iteration cost of MCMC. We provide an introduction to some popular SGMCMC algorithms, review the supporting theoretical results, and compare the efficiency of SGMCMC algorithms against MCMC on benchmark examples. The supporting R code is available online.
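To make the subsampling idea concrete, the sketch below implements stochastic gradient Langevin dynamics (SGLD), the best-known SGMCMC algorithm, on a toy conjugate model (inferring a Gaussian mean). This is an illustrative Python sketch, not the paper's R code: the model, step size, and batch size are assumptions chosen for the example. At each iteration the full-data gradient of the log-posterior is replaced by an unbiased estimate computed from a small minibatch, rescaled by N/n, so the per-iteration cost no longer grows with the data set size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x_i ~ N(2, 1); we infer the mean theta (hypothetical example).
N = 10_000
data = rng.normal(2.0, 1.0, size=N)

def grad_log_prior(theta):
    # Assumed N(0, 10) prior on theta: d/dtheta log p(theta) = -theta / 10
    return -theta / 10.0

def grad_log_lik(theta, batch):
    # Unit-variance Gaussian likelihood: d/dtheta log p(x | theta) = x - theta
    return np.sum(batch - theta)

def sgld(n_iter=5_000, batch_size=100, step=1e-4):
    """Stochastic gradient Langevin dynamics with a fixed step size."""
    theta = 0.0
    samples = np.empty(n_iter)
    for t in range(n_iter):
        batch = rng.choice(data, size=batch_size, replace=False)
        # Unbiased estimate of the full-data log-posterior gradient:
        # the minibatch sum is rescaled by N / batch_size.
        grad = grad_log_prior(theta) + (N / batch_size) * grad_log_lik(theta, batch)
        # Langevin update: half-step along the gradient plus Gaussian
        # noise whose variance matches the step size.
        theta += 0.5 * step * grad + rng.normal(0.0, np.sqrt(step))
        samples[t] = theta
    return samples

samples = sgld()
# After burn-in, the samples concentrate near the posterior mean (close to 2.0 here).
print(samples[2000:].mean())
```

Note that each iteration touches only `batch_size` of the `N` observations; the fixed step size trades off discretisation bias against mixing speed, which is one of the theoretical issues the paper reviews.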

Item Type: Journal Article
Journal or Publication Title: Journal of the American Statistical Association
Uncontrolled Keywords: /dk/atira/pure/subjectarea/asjc/1800/1804
Subjects:
ID Code: 148741
Deposited By:
Deposited On: 02 Nov 2020 12:10
Refereed?: Yes
Published?: In Press
Last Modified: 07 Nov 2020 07:11