Learning-Rate-Free Stochastic Optimization over Riemannian Manifolds

Dodd, Daniel and Sharrock, Louis and Nemeth, Christopher (2024) Learning-Rate-Free Stochastic Optimization over Riemannian Manifolds. Proceedings of Machine Learning Research. ISSN 1938-7228 (In Press)

riemannian_dog_icml.pdf - Accepted Version
Available under License Creative Commons Attribution.


Abstract

In recent years, interest in gradient-based optimization over Riemannian manifolds has surged. However, a significant challenge lies in the reliance on hyperparameters, especially the learning rate, which requires meticulous tuning by practitioners to ensure convergence at a suitable rate. In this work, we introduce innovative learning-rate-free algorithms for stochastic optimization over Riemannian manifolds, eliminating the need for hand-tuning and providing a more robust and user-friendly approach. We establish high probability convergence guarantees that are optimal, up to logarithmic factors, compared to the best-known optimally tuned rate in the deterministic setting. Our approach is validated through numerical experiments, demonstrating competitive performance against learning-rate-dependent algorithms.
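To illustrate the idea of a learning-rate-free method on a manifold, the sketch below applies a DoG-style ("distance over gradients") step-size rule to Riemannian gradient descent on the unit sphere. This is a hedged toy implementation, not the paper's algorithm: the function names, the sphere-specific projection/retraction, and the choice of initial distance parameter `eps` are all illustrative assumptions. The step size at iteration t is the maximum distance travelled from the start divided by the square root of the accumulated squared gradient norms, so no learning rate is ever supplied by the user.

```python
import numpy as np

def sphere_project(x):
    """Normalize a vector onto the unit sphere."""
    return x / np.linalg.norm(x)

def riemannian_grad(x, egrad):
    """Project a Euclidean gradient onto the tangent space at x (sphere)."""
    return egrad - np.dot(egrad, x) * x

def retract(x, v):
    """Simple retraction on the sphere: move in the tangent direction, renormalize."""
    return sphere_project(x + v)

def geodesic_dist(x, y):
    """Geodesic (great-circle) distance between two unit vectors."""
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

def rdog_optimize(grad_fn, x0, steps=500, eps=1e-2):
    """Learning-rate-free Riemannian descent with a DoG-style rule (illustrative):
    eta_t = max_dist_from_start / sqrt(sum of squared Riemannian gradient norms).
    `eps` seeds the initial movement scale; it is NOT a tuned learning rate."""
    x = sphere_project(np.asarray(x0, dtype=float))
    x_init = x.copy()
    max_dist = eps        # running maximum distance from the starting point
    grad_sq_sum = 0.0     # running sum of squared gradient norms
    for _ in range(steps):
        g = riemannian_grad(x, grad_fn(x))
        grad_sq_sum += np.dot(g, g)
        eta = max_dist / np.sqrt(grad_sq_sum + 1e-12)
        x = retract(x, -eta * g)
        max_dist = max(max_dist, geodesic_dist(x, x_init))
    return x
```

As a usage example, minimizing f(x) = ||x - t||^2 over the sphere with `grad_fn = lambda x: 2 * (x - t)` drives the iterate toward the unit vector t without any step-size tuning; only the crude initial scale `eps` is specified, and the DoG-style rule adapts the effective learning rate as the iterate moves away from its starting point.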

Item Type:
Journal Article
Journal or Publication Title:
Proceedings of Machine Learning Research
Additional Information:
In: Proceedings of the 41st International Conference on Machine Learning (ICML), Vienna, Austria.
ID Code:
221030
Deposited By:
Deposited On:
05 Jun 2024 15:30
Refereed?:
Yes
Published?:
In Press
Last Modified:
14 Nov 2024 01:30