Breyer, L. A. and Roberts, G. O. (2000) *From Metropolis to diffusions: Gibbs states and optimal scaling.* Stochastic Processes and their Applications, 90 (2). pp. 181-206.

## Abstract

This paper investigates the behaviour of the random walk Metropolis algorithm in high-dimensional problems. Here we concentrate on the case where the target density is a spatially homogeneous Gibbs distribution with finite range. The performance of the algorithm is strongly linked to the presence or absence of phase transition for the Gibbs distribution, with the convergence time being approximately linear in dimension when no phase transition is present. Related to this, there is an optimal way to scale the variance of the proposal distribution in order to maximise the speed of convergence of the algorithm. This turns out to involve scaling the variance of the proposal as the reciprocal of dimension (at least in the phase transition-free case). Moreover, the optimal scaling can be characterised in terms of the overall acceptance rate of the algorithm, the maximising value being 0.234, as predicted by studies on simpler classes of target density. The results are proved in the framework of a weak convergence result, which shows that the algorithm behaves like an infinite-dimensional diffusion process in high dimensions.
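The scaling described above can be illustrated with a minimal sketch (not taken from the paper): a random-walk Metropolis sampler on a d-dimensional product of standard normals, with proposal standard deviation scaled as ℓ/√d so that the proposal variance is ℓ²/d. The choice ℓ ≈ 2.38 is the value known to produce an average acceptance rate near the optimal 0.234 for such product targets; the function names and parameter values here are illustrative.

```python
import math
import random

def log_target(x):
    # Log-density of independent standard normals, up to an additive constant.
    # This is a toy stand-in for a more general high-dimensional target.
    return -0.5 * sum(xi * xi for xi in x)

def rwm_acceptance_rate(d, l=2.38, n_steps=20000, seed=0):
    """Run random-walk Metropolis in dimension d and return the
    observed acceptance rate, with proposal variance l**2 / d."""
    rng = random.Random(seed)
    sigma = l / math.sqrt(d)          # proposal std dev scales as 1/sqrt(d)
    x = [rng.gauss(0, 1) for _ in range(d)]
    lp = log_target(x)
    accepted = 0
    for _ in range(n_steps):
        # Gaussian random-walk proposal centred at the current state.
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        lq = log_target(y)
        # Metropolis accept/reject step (symmetric proposal).
        if math.log(rng.random()) < lq - lp:
            x, lp = y, lq
            accepted += 1
    return accepted / n_steps

if __name__ == "__main__":
    rate = rwm_acceptance_rate(d=50)
    print(round(rate, 3))   # empirically close to 0.234 for moderate-to-large d
```

Doubling ℓ makes the proposals too bold (the acceptance rate drops well below 0.234), while halving it makes them too timid; in both cases the chain explores the target more slowly, which is the trade-off the 0.234 result quantifies.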

| Item Type: | Journal Article |
|---|---|
| Journal or Publication Title: | Stochastic Processes and their Applications |
| Uncontrolled Keywords: | Markov chain Monte Carlo ; Hamiltonians ; Hybrid algorithms |
| Subjects: | ?? qa ?? |
| Departments: | Faculty of Science and Technology > Mathematics and Statistics; Faculty of Science and Technology > Lancaster Environment Centre |
| ID Code: | 19331 |
| Deposited By: | ep_ss_importer |
| Deposited On: | 21 Nov 2008 10:08 |
| Refereed?: | Yes |
| Published?: | Published |
| Last Modified: | 17 Jul 2018 02:27 |
| Identification Number: | |
| URI: | http://eprints.lancs.ac.uk/id/eprint/19331 |
