Preferential Subsampling for Stochastic Gradient Langevin Dynamics

Putcha, Srshti and Nemeth, Christopher and Fearnhead, Paul (2023) Preferential Subsampling for Stochastic Gradient Langevin Dynamics. Proceedings of Machine Learning Research, 206. pp. 8837-8856. ISSN 2640-3498

Full text not available from this repository.

Abstract

Stochastic gradient MCMC (SGMCMC) offers a scalable alternative to traditional MCMC by constructing an unbiased estimate of the gradient of the log-posterior from a small, uniformly-weighted subsample of the data. While efficient to compute, the resulting gradient estimator may exhibit high variance and degrade sampler performance. The problem of variance control has traditionally been addressed by constructing a better stochastic gradient estimator, often using control variates. We propose instead to use a discrete, nonuniform probability distribution to preferentially subsample data points that have a greater impact on the stochastic gradient. In addition, we present a method for adaptively adjusting the subsample size at each iteration of the algorithm, increasing the subsample size in regions of the sample space where the gradient is harder to estimate. We demonstrate that such an approach can maintain the same level of accuracy while substantially reducing the average subsample size used.
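To make the idea concrete, here is a minimal sketch of SGLD with preferential (nonuniform) subsampling on a toy 1-D Gaussian mean model with a flat prior. The weighting scheme (weights proportional to each point's centred magnitude) and all names here are illustrative assumptions, not the paper's exact construction; the key point is that sampling indices with probabilities `p` and reweighting each term by `1/p_i` keeps the gradient estimate unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
data = rng.normal(2.0, 1.0, size=N)  # y_i ~ N(theta_true, 1), theta_true = 2

# Hypothetical preferential weights: points far from the sample mean are
# drawn more often, as a simple surrogate for per-point gradient influence.
eps = 1e-3
w = np.abs(data - data.mean()) + eps
p = w / w.sum()

def sgld_step(theta, step, n=10):
    """One SGLD update using an importance-weighted stochastic gradient."""
    idx = rng.choice(N, size=n, p=p)  # nonuniform subsample, with replacement
    # Unbiased estimate of the full-data gradient sum_i (y_i - theta):
    # each sampled term is reweighted by 1 / p_i.
    grad_est = np.mean((data[idx] - theta) / p[idx])
    return theta + 0.5 * step * grad_est + np.sqrt(step) * rng.normal()

theta, step = 0.0, 1e-4
samples = []
for t in range(5000):
    theta = sgld_step(theta, step)
    if t >= 3000:          # discard burn-in
        samples.append(theta)

post_mean = float(np.mean(samples))
print(post_mean)  # should sit near the sample mean of the data
```

With a flat prior the posterior over the mean is approximately N(ȳ, 1/N), so the chain's sample average should settle close to the data mean; swapping `p` for the uniform distribution `np.full(N, 1/N)` recovers standard uniform-subsampling SGLD for comparison.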

Item Type:
Journal Article
Journal or Publication Title:
Proceedings of Machine Learning Research
Uncontrolled Keywords:
Research Output Funding: yes, externally funded
Subjects:
artificial intelligence; software; control and systems engineering; statistics and probability
ID Code:
203529
Deposited By:
Deposited On:
27 Sep 2023 08:25
Refereed?:
Yes
Published?:
Published
Last Modified:
16 Jul 2024 00:09