
SASSHA: Sharpness-aware Adaptive Second-order Optimization with Stable Hessian Approximation

Source: arXiv
Abstract

Approximate second-order optimization methods often exhibit poorer generalization compared to first-order approaches. In this work, we look into this issue through the lens of the loss landscape and find that existing second-order methods tend to converge to sharper minima compared to SGD. In response, we propose Sassha, a novel second-order method designed to enhance generalization by explicitly reducing sharpness of the solution, while stabilizing the computation of approximate Hessians along the optimization trajectory. In fact, this sharpness minimization scheme is crafted also to accommodate lazy Hessian updates, so as to secure efficiency besides flatness. To validate its effectiveness, we conduct a wide range of standard deep learning experiments where Sassha demonstrates its outstanding generalization performance that is comparable to, and mostly better than, other methods. We provide a comprehensive set of analyses including convergence, robustness, stability, efficiency, and cost.
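
The abstract combines three ingredients: a sharpness-aware perturbation of the loss, a diagonal second-order preconditioner kept positive and numerically stable, and lazy (infrequent) Hessian refreshes. Below is a minimal sketch in Python of that general recipe; it is not the authors' implementation, and the toy quadratic objective, hyperparameters (lr, rho, lazy_k), and helper functions (grad, hessian_diag) are assumptions made for illustration only.

import numpy as np

# Toy quadratic objective f(w) = 0.5 * w^T A w - b^T w, chosen so the
# gradient and the diagonal of the Hessian are available in closed form.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
A = A.T @ A + 0.1 * np.eye(10)            # symmetric positive definite
b = rng.standard_normal(10)

def grad(w):
    return A @ w - b

def hessian_diag(w):
    return np.diag(A).copy()

# Hypothetical hyperparameters for this sketch.
lr, rho, eps, lazy_k, steps = 0.3, 0.05, 1e-6, 10, 300

w = np.zeros(10)
d = np.ones(10)                            # cached diagonal preconditioner

for t in range(steps):
    g = grad(w)

    # Sharpness-aware step: evaluate the gradient at a perturbed point
    # w + e, where e is a short ascent step along the gradient (as in SAM).
    e = rho * g / (np.linalg.norm(g) + 1e-12)
    g_sharp = grad(w + e)

    # Lazy Hessian update: refresh the diagonal curvature estimate only
    # every lazy_k iterations; absolute values plus eps keep the
    # preconditioner positive and numerically stable.
    if t % lazy_k == 0:
        d = np.abs(hessian_diag(w + e)) + eps

    # Diagonally preconditioned descent step on the perturbed gradient.
    w -= lr * g_sharp / d

print("final gradient norm:", np.linalg.norm(grad(w)))

On this toy problem the loop drives the gradient norm toward zero; the point of the sketch is only to show how the perturbed gradient, the stabilized diagonal curvature, and the lazy refresh interact in a single update.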

Dahun Shin, Dongyeop Lee, Jinseok Chung, Namhoon Lee

Subjects: Computing Technology, Computer Technology

Dahun Shin, Dongyeop Lee, Jinseok Chung, Namhoon Lee. SASSHA: Sharpness-aware Adaptive Second-order Optimization with Stable Hessian Approximation [EB/OL]. (2025-06-24) [2025-07-23]. https://arxiv.org/abs/2502.18153
