Composite Optimization with Indicator Functions: Stationary Duality and a Semismooth Newton Method

Source: arXiv

English Abstract

Indicator functions taking values of zero or one are essential to numerous applications in machine learning and statistics. The corresponding primal optimization model has been studied in several recent works. However, its dual problem is a more challenging topic that has not been well addressed, one possible reason being that the Fenchel conjugate of any indicator function is finite only at the origin. This work explores the dual optimization of the sum of a strongly convex function and a composite term with indicator functions on positive intervals. For the first time, a dual problem is constructed by extending the classic conjugate subgradient property to the indicator function. This extension further helps us establish the equivalence between the primal and dual solutions. The dual problem turns out to be a sparse optimization problem with an $\ell_0$ regularizer and a nonnegativity constraint. The proximal operator of the sparse regularizer is used to identify a dual subspace on which gradient and/or semismooth Newton iterations are performed with low computational complexity. This gives rise to a dual Newton-type method with global convergence and a local superlinear (or quadratic) convergence rate under mild conditions. Finally, when applied to AUC maximization and sparse multi-label classification, our dual Newton method demonstrates satisfactory performance in terms of computational speed and accuracy.
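Only the abstract is available here, so the paper's exact dual formulation is not reproduced. As a rough illustration of the subspace-identification idea described above, the sketch below computes the proximal operator of a nonnegatively constrained $\ell_0$ regularizer, $y \mapsto \lambda\|y\|_0 + \delta_{\{y \ge 0\}}(y)$, componentwise by hard thresholding. The function names, the weight $\lambda$, and the precise form of the constraint are assumptions made for illustration and may differ from the dual problem studied in the paper.

```python
import numpy as np

def prox_nonneg_l0(x, lam):
    """Proximal operator of y -> lam * ||y||_0 + indicator(y >= 0),
    i.e. argmin_{y >= 0}  lam * ||y||_0 + 0.5 * ||y - x||^2, componentwise.

    For a component x_i > sqrt(2 * lam), keeping y_i = x_i costs lam, which is
    cheaper than the 0.5 * x_i**2 paid by zeroing it; otherwise y_i = 0.
    (Illustrative sketch; not the paper's exact dual regularizer.)
    """
    return np.where(x > np.sqrt(2.0 * lam), x, 0.0)

def dual_support(x, lam):
    """Indices of nonzero prox entries. In a dual Newton-type scheme, a support
    set of this kind restricts the gradient / semismooth Newton step to a
    low-dimensional subspace."""
    return np.flatnonzero(prox_nonneg_l0(x, lam))

# Tiny usage example: threshold is sqrt(2 * 0.5) = 1.0
x = np.array([-1.0, 0.2, 0.9, 2.5])
print(prox_nonneg_l0(x, lam=0.5))  # [0. 0. 0. 2.5]
print(dual_support(x, lam=0.5))    # [3]
```

The nonzero pattern returned by the prox is the kind of working support on which a reduced gradient or semismooth Newton step could then be computed at low cost.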

Penghe Zhang, Naihua Xiu, Houduo Qi

Computing technology, computer technology

Penghe Zhang, Naihua Xiu, Houduo Qi. Composite Optimization with Indicator Functions: Stationary Duality and a Semismooth Newton Method [EB/OL]. (2025-06-09) [2025-06-29]. https://arxiv.org/abs/2506.08374.
