Dodd, Daniel and Sharrock, Louis and Nemeth, Christopher (2024) Learning-Rate-Free Stochastic Optimization over Riemannian Manifolds. Proceedings of Machine Learning Research. ISSN 1938-7228 (In Press)
riemannian_dog_icml.pdf - Accepted Version
Available under License Creative Commons Attribution.
Download (8MB)
Abstract
In recent years, interest in gradient-based optimization over Riemannian manifolds has surged. However, a significant challenge lies in the reliance on hyperparameters, especially the learning rate, which requires meticulous tuning by practitioners to ensure convergence at a suitable rate. In this work, we introduce innovative learning-rate-free algorithms for stochastic optimization over Riemannian manifolds, eliminating the need for hand-tuning and providing a more robust and user-friendly approach. We establish high-probability convergence guarantees that are optimal, up to logarithmic factors, relative to the best-known optimally tuned rate in the deterministic setting. Our approach is validated through numerical experiments, demonstrating competitive performance against learning-rate-dependent algorithms.
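To make the idea of a learning-rate-free Riemannian method concrete, the following is a minimal illustrative sketch (not the authors' exact algorithm) of a distance-over-gradients (DoG)-style step-size rule adapted to the unit sphere: the step size is the maximum distance traveled from the initial point divided by the root sum of squared Riemannian gradient norms, so no learning rate is supplied by the user. All function names, the initial-movement constant, and the example problem are hypothetical choices for illustration.

```python
import numpy as np

def sphere_project(x):
    # Retraction onto the unit sphere by normalization.
    return x / np.linalg.norm(x)

def riemannian_grad(x, euclid_grad):
    # Project the Euclidean gradient onto the tangent space at x.
    return euclid_grad - np.dot(euclid_grad, x) * x

def dog_step_sphere(f_grad, x0, steps=200, eps=1e-3):
    # DoG-style learning-rate-free loop on the sphere (illustrative sketch).
    x = sphere_project(np.asarray(x0, dtype=float))
    start = x.copy()
    max_dist = eps        # initial-movement guess; grows as the iterate travels
    grad_sq_sum = 0.0     # running sum of squared Riemannian gradient norms
    for _ in range(steps):
        g = riemannian_grad(x, f_grad(x))
        grad_sq_sum += np.dot(g, g)
        eta = max_dist / np.sqrt(grad_sq_sum + 1e-12)  # no hand-tuned rate
        x = sphere_project(x - eta * g)                # step + retraction
        # Track geodesic distance from the starting point.
        dist = np.arccos(np.clip(np.dot(start, x), -1.0, 1.0))
        max_dist = max(max_dist, dist)
    return x

# Hypothetical example: minimize f(x) = x^T A x on the sphere, whose
# minimizer is the eigenvector of the smallest eigenvalue of A.
A = np.diag([3.0, 1.0, 0.5])
x_star = dog_step_sphere(lambda x: 2.0 * A @ x, x0=[1.0, 1.0, 1.0])
```

In this sketch the only free constant is the small initial-movement guess `eps`, to which DoG-style rules are only logarithmically sensitive; the step size then adapts automatically as the iterate moves away from its starting point.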