Papamarkou, Theodore and Skoularidou, Maria and Palla, Konstantina and Aitchison, Laurence and Arbel, Julyan and Dunson, David and Filippone, Maurizio and Fortuin, Vincent and Hennig, Philipp and Hernández-Lobato, José Miguel and Hubin, Aliaksandr and Immer, Alexander and Karaletsos, Theofanis and Khan, Mohammad Emtiyaz and Kristiadi, Agustinus and Li, Yingzhen and Mandt, Stephan and Nemeth, Christopher and Osborne, Michael A. and Rudner, Tim G. J. and Rügamer, David and Teh, Yee Whye and Welling, Max and Wilson, Andrew Gordon and Zhang, Ruqi (2024) Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI. Proceedings of Machine Learning Research. ISSN 1938-7228 (In Press)
Abstract
In the current landscape of deep learning research, there is a predominant emphasis on achieving high predictive accuracy in supervised tasks involving large image and language datasets. However, a broader perspective reveals a multitude of overlooked metrics, tasks, and data types, such as uncertainty, active and continual learning, and scientific data, that demand attention. Bayesian deep learning (BDL) constitutes a promising avenue, offering advantages across these diverse settings. This paper posits that BDL can elevate the capabilities of deep learning. It revisits the strengths of BDL, acknowledges existing challenges, and highlights some exciting research avenues aimed at addressing these obstacles. Looking ahead, the discussion focuses on possible ways to combine large-scale foundation models with BDL to unlock their full potential.