
Hamiltonian Descent Algorithms for Optimization: Accelerated Rates via Randomized Integration Time

Source: arXiv
Abstract

We study the Hamiltonian flow for optimization (HF-opt), which simulates the Hamiltonian dynamics for some integration time and resets the velocity to $0$ to decrease the objective function; this is the optimization analogue of the Hamiltonian Monte Carlo algorithm for sampling. For short integration time, HF-opt has the same convergence rates as gradient descent for minimizing strongly and weakly convex functions. We show that by randomizing the integration time in HF-opt, the resulting randomized Hamiltonian flow (RHF) achieves accelerated convergence rates in continuous time, similar to the rates for the accelerated gradient flow. We study a discrete-time implementation of RHF as the randomized Hamiltonian gradient descent (RHGD) algorithm. We prove that RHGD achieves the same accelerated convergence rates as Nesterov's accelerated gradient descent (AGD) for minimizing smooth strongly and weakly convex functions. We provide numerical experiments to demonstrate that RHGD is competitive with classical accelerated methods such as AGD across all settings and outperforms them in certain regimes.
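To make the algorithmic idea concrete, below is a minimal Python sketch of an RHGD-style iteration as described in the abstract: simulate the Hamiltonian dynamics with leapfrog steps for a randomly drawn integration time, then reset the velocity to zero. The uniform draw of the step count, the leapfrog integrator, and all function and parameter names here are illustrative assumptions, not the paper's exact scheme.

import numpy as np

def rhgd(grad, x0, step_size, max_steps, num_epochs, seed=0):
    # Hypothetical sketch of randomized Hamiltonian gradient descent (RHGD).
    # Assumption: the number of leapfrog steps per epoch is drawn uniformly;
    # the paper specifies its own randomization of the integration time.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(num_epochs):
        v = np.zeros_like(x)  # velocity reset to 0, as in HF-opt
        num_steps = rng.integers(1, max_steps + 1)  # randomized integration time (assumed uniform)
        for _ in range(num_steps):  # leapfrog simulation of the Hamiltonian flow
            v = v - 0.5 * step_size * grad(x)
            x = x + step_size * v
            v = v - 0.5 * step_size * grad(x)
    return x

# Example: minimize the smooth strongly convex quadratic f(x) = ||x||^2 / 2.
x_star = rhgd(grad=lambda x: x, x0=np.ones(5), step_size=0.1,
              max_steps=20, num_epochs=50)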

Qiang Fu, Andre Wibisono

Computing Technology, Computer Technology

Qiang Fu, Andre Wibisono. Hamiltonian Descent Algorithms for Optimization: Accelerated Rates via Randomized Integration Time [EB/OL]. (2025-05-18) [2025-07-01]. https://arxiv.org/abs/2505.12553.
