On sparse variational methods and the Kullback-Leibler divergence between stochastic processes

Matthews, Alexander G. de G. and Hensman, James and Turner, Richard and Ghahramani, Zoubin (2016) On sparse variational methods and the Kullback-Leibler divergence between stochastic processes. Journal of Machine Learning Research, 51. pp. 231-239. ISSN 1532-4435

PDF (sparseKL_AISTATS.pdf), Accepted Version, 267 kB. Available under a Creative Commons Attribution-NonCommercial license.

Abstract

The variational framework for learning inducing variables (Titsias, 2009a) has had a large impact on the Gaussian process literature. The framework may be interpreted as minimizing a rigorously defined Kullback-Leibler divergence between the approximating and posterior processes. To our knowledge this connection has thus far gone unremarked in the literature. In this paper we give a substantial generalization of the literature on this topic. We give a new proof of the result for infinite index sets which allows inducing points that are not data points and likelihoods that depend on all function values. We then discuss augmented index sets and show that, contrary to previous works, marginal consistency of augmentation is not enough to guarantee consistency of variational inference with the original model. We then characterize an extra condition where such a guarantee is obtainable. Finally we show how our framework sheds light on interdomain sparse approximations and sparse approximations for Cox processes.
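
As a brief sketch of the relationship described in the abstract, in standard sparse Gaussian process notation that is assumed here rather than given in the record (\(f\) the latent function, \(u = f(Z)\) the inducing variables at inducing inputs \(Z\), \(y\) the observations): the approximating process takes the form \(q(f) = p(f_{\neq u} \mid u)\, q(u)\), and the bound maximized in Titsias (2009a) is

\[
\mathcal{L}(q) \;=\; \mathbb{E}_{q(f)}\!\left[\log p(y \mid f)\right] \;-\; \mathrm{KL}\!\left[\,q(u)\,\|\,p(u)\,\right].
\]

The connection the paper makes rigorous is that maximizing \(\mathcal{L}(q)\) is equivalent to minimizing \(\mathrm{KL}\!\left[\,q(f)\,\|\,p(f \mid y)\,\right]\), a Kullback-Leibler divergence between the approximating and posterior processes themselves, rather than between finite-dimensional marginals at the data or inducing points.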

Item Type:
Journal Article
Journal or Publication Title:
Journal of Machine Learning Research
Subjects:
Artificial Intelligence; Software; Statistics and Probability; Control and Systems Engineering
ID Code:
83543
Deposited By:
Deposited On:
14 Dec 2016 09:06
Refereed?:
Yes
Published?:
Published
Last Modified:
21 Sep 2023 02:10