Improving the Adaptive Moment Estimation (ADAM) stochastic optimizer through an Implicit-Explicit (IMEX) time-stepping approach

Source: arXiv

Abstract

The Adam optimizer, often used in Machine Learning for neural network training, corresponds to an underlying ordinary differential equation (ODE) in the limit of very small learning rates. This work shows that the classical Adam algorithm is a first-order implicit-explicit (IMEX) Euler discretization of the underlying ODE. Employing the time discretization point of view, we propose new extensions of the Adam scheme obtained by using higher-order IMEX methods to solve the ODE. Based on this approach, we derive a new optimization algorithm for neural network training that performs better than classical Adam on several regression and classification problems.
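
For context, below is a minimal sketch of the classical Adam update that the abstract identifies as a first-order IMEX Euler discretization of an underlying ODE. The function name adam_step and the quadratic test objective are illustrative assumptions only; the paper's higher-order IMEX variants are not reproduced here.

```python
import numpy as np

def adam_step(theta, m, v, grad, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One classical Adam update (Kingma & Ba); t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad        # exponential moving average of the gradient
    v = beta2 * v + (1 - beta2) * grad**2     # exponential moving average of the squared gradient
    m_hat = m / (1 - beta1**t)                # bias correction for the zero initialization
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Example: minimize f(theta) = ||theta||^2 / 2, whose gradient is theta itself.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 201):
    grad = theta                              # gradient of the quadratic objective
    theta, m, v = adam_step(theta, m, v, grad, t, lr=0.05)
```

In the limit of a very small learning rate, iterations of this form trace the trajectory of the underlying ODE discussed in the abstract; the paper's contribution is to replace the first-order IMEX Euler discretization with higher-order IMEX methods.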

Andrey A. Popov, Abhinab Bhattacharjee, Adrian Sandu, Arash Sarshar

DOI: 10.1615/JMachLearnModelComput.2024053508

Subject: Computing technology, computer technology

Andrey A. Popov, Abhinab Bhattacharjee, Adrian Sandu, Arash Sarshar. Improving the Adaptive Moment Estimation (ADAM) stochastic optimizer through an Implicit-Explicit (IMEX) time-stepping approach [EB/OL]. (2024-03-20) [2025-07-16]. https://arxiv.org/abs/2403.13704.