
Stochastic Non-convex Optimization with Strong High Probability Second-order Convergence


Source: arXiv
Abstract

In this paper, we study stochastic non-convex optimization with non-convex random functions. Recent studies of non-convex optimization revolve around establishing second-order convergence, i.e., convergence to a nearly second-order optimal stationary point. However, existing results on stochastic non-convex optimization are limited, especially with respect to high-probability second-order convergence. We propose a novel updating step (named NCG-S) that leverages a stochastic gradient and a noisy negative-curvature direction of a stochastic Hessian, where the stochastic gradient and Hessian are computed over a suitable mini-batch of random functions. Building on this step, we develop two algorithms and establish their high-probability second-order convergence. To the best of our knowledge, the proposed stochastic algorithms are the first with second-order convergence in high probability and a time complexity that is almost linear in the problem's dimensionality.
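The abstract does not give the algorithmic details of NCG-S, but the kind of step it describes can be sketched as follows: at each iteration, compute a mini-batch stochastic gradient and a (noisy) most-negative-curvature direction of a mini-batch Hessian, then move along negative curvature if it is sufficiently negative and along the negative gradient otherwise. The toy objective, step size, and curvature threshold below are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Hypothetical sketch of an NCG-S-style update. The abstract only states
# that the step combines a mini-batch stochastic gradient with a noisy
# negative-curvature direction of a mini-batch Hessian; the toy objective,
# step size eta, and curvature threshold eps are illustrative assumptions.

rng = np.random.default_rng(0)

def stochastic_grad(x, a):
    # Toy non-convex objective f_i(x) = 0.25*||x||^4 - 0.5*a_i*||x||^2;
    # its mini-batch gradient is (||x||^2 - mean(a_i)) * x.
    return (x @ x - a.mean()) * x

def stochastic_hessian(x, a):
    # Corresponding mini-batch Hessian: (||x||^2 - mean(a_i)) * I + 2 x x^T.
    return (x @ x - a.mean()) * np.eye(x.size) + 2.0 * np.outer(x, x)

def ncg_s_step(x, a, eta=0.1, eps=1e-3):
    """One update: follow negative curvature if the stochastic Hessian has
    an eigenvalue below -eps, otherwise take a stochastic gradient step."""
    g = stochastic_grad(x, a)
    lam, V = np.linalg.eigh(stochastic_hessian(x, a))
    if lam[0] < -eps:
        v = V[:, 0]              # noisy direction of most negative curvature
        if v @ g > 0:            # orient it as a descent direction
            v = -v
        return x + eta * v
    return x - eta * g           # fall back to a gradient step

d = 5
a = rng.uniform(0.9, 1.1, size=1000)   # per-example data (scalars)
x = np.zeros(d)                        # start at a strict saddle point of f
for _ in range(600):
    batch = a[rng.choice(a.size, size=64, replace=False)]
    x = ncg_s_step(x, batch)

# Gradient steps alone would stall at x = 0 (zero gradient, negative
# curvature); the curvature steps escape it, and ||x||^2 approaches mean(a).
print(np.linalg.norm(stochastic_grad(x, a)))
```

The point of the sketch is the two-branch structure: near a saddle the gradient is uninformative, so progress must come from the eigenvector of the (stochastic) Hessian with the most negative eigenvalue, which is exactly what distinguishes second-order from first-order stationarity guarantees.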

Mingrui Liu, Tianbao Yang

Subject: computing and computer technology

Mingrui Liu, Tianbao Yang. Stochastic Non-convex Optimization with Strong High Probability Second-order Convergence [EB/OL]. (2017-10-25) [2025-08-02]. https://arxiv.org/abs/1710.09447.
