A geometric framework for momentum-based optimizers for low-rank training
Low-rank pre-training and fine-tuning have recently emerged as promising techniques for reducing the computational and storage costs of large neural networks. Training low-rank parameterizations typically relies on conventional optimizers such as heavy ball momentum methods or Adam. In this work, we identify and analyze potential difficulties that these training methods encounter when used to train low-rank parameterizations of weights. In particular, we show that classical momentum methods can struggle to converge to a local optimum due to the geometry of the underlying optimization landscape. To address this, we introduce novel training strategies derived from dynamical low-rank approximation, which explicitly account for the underlying geometric structure. Our approach combines tools from dynamical low-rank approximation and momentum-based optimization to design optimizers that respect the intrinsic geometry of the parameter space. We validate our methods through numerical experiments, demonstrating faster convergence and stronger validation metrics at a given parameter budget.
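The abstract alludes to momentum updates that respect the geometry of the fixed-rank manifold. As a rough illustration only (not the paper's algorithm), the sketch below runs a heavy-ball step on a toy least-squares problem in which both the gradient and the momentum are projected onto the tangent space of the rank-r manifold at the current iterate, followed by a truncated-SVD retraction. All function names, the toy objective, and the step sizes are assumptions made for this example.

```python
import numpy as np

def tangent_project(U, V, G):
    # Project G onto the tangent space of the rank-r manifold at W = U S V^T,
    # i.e. P(G) = U U^T G + G V V^T - U U^T G V V^T (assumed standard projector).
    UtG = U.T @ G
    return U @ UtG + (G @ V) @ V.T - U @ (UtG @ V) @ V.T

def retract(W, r):
    # Map back to the rank-r manifold via a truncated SVD (one common retraction).
    P, s, Qt = np.linalg.svd(W, full_matrices=False)
    return P[:, :r], np.diag(s[:r]), Qt[:r, :].T

# Toy problem: fit a rank-r target matrix with a Riemannian-style heavy-ball step.
rng = np.random.default_rng(0)
m, n, r = 50, 40, 5
target = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

U, S, V = retract(rng.standard_normal((m, n)), r)
M = np.zeros((m, n))                      # momentum, kept in the tangent space
lr, beta = 0.1, 0.9

for step in range(200):
    W = U @ S @ V.T
    G = W - target                        # gradient of 0.5 * ||W - target||_F^2
    # Re-project the old momentum onto the current tangent space, then update.
    M = beta * tangent_project(U, V, M) + tangent_project(U, V, G)
    U, S, V = retract(W - lr * M, r)      # take the step, then retract to rank r

print("final loss:", 0.5 * np.linalg.norm(U @ S @ V.T - target) ** 2)
```

The key contrast with a plain heavy-ball method is that the momentum is never allowed to accumulate components normal to the manifold, which is one way to avoid the geometric convergence difficulties the abstract describes.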
Steffen Schotthöfer, Timon Klein, Jonas Kusch
Computing technology, computer technology
Steffen Schotthöfer, Timon Klein, Jonas Kusch. A geometric framework for momentum-based optimizers for low-rank training [EB/OL]. (2025-06-20) [2025-07-16]. https://arxiv.org/abs/2506.17475.