Riemannian Inexact Gradient Descent for Quadratic Discrimination
We propose an inexact optimization algorithm on Riemannian manifolds, motivated by quadratic discrimination tasks in high-dimensional, low-sample-size (HDLSS) imaging settings. In such applications, gradient evaluations are often biased due to limited sample sizes. To address this, we introduce a novel Riemannian optimization algorithm that is robust to inexact gradient information and prove an $\mathcal O(1/K)$ convergence rate under standard assumptions. We also present a line-search variant that requires access to function values but not exact gradients, maintaining the same convergence rate and ensuring sufficient descent. The algorithm is tailored to the Grassmann manifold by leveraging its geometric structure, and its convergence rate is validated numerically. A simulation of heteroscedastic images shows that when bias is introduced into the problem, whether intentionally or through estimation of the covariance matrix, the detection performance of the algorithm's solution is comparable to that obtained when true gradients are used in the optimization. The optimal subspace learned by the algorithm encodes interpretable patterns and is qualitatively similar to known optimal solutions. By ensuring robust convergence and interpretability, our algorithm offers a compelling tool for manifold-based dimensionality reduction and discrimination in high-dimensional image data settings.
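The abstract does not provide implementation details, but the general recipe it refers to, gradient descent on the Grassmann manifold with an inexact (noisy or biased) gradient, can be sketched as follows. This is a minimal illustrative sketch under our own assumptions (tangent-space projection of the Euclidean gradient, a QR retraction, and additive Gaussian noise as a stand-in for gradient bias); the function `grassmann_rgd` and the toy objective are hypothetical and are not the authors' algorithm or line-search variant.

```python
import numpy as np

def grassmann_rgd(euclid_grad, X0, step=0.01, iters=500, grad_noise=0.0, seed=None):
    """Sketch of (inexact) Riemannian gradient descent on the Grassmann
    manifold Gr(n, p): project the Euclidean gradient onto the horizontal
    space at X and retract with a QR decomposition.  Illustrative only."""
    rng = np.random.default_rng(seed)
    X = X0
    for _ in range(iters):
        G = euclid_grad(X)
        if grad_noise > 0.0:
            # Simulate an inexact/biased gradient evaluation.
            G = G + grad_noise * rng.standard_normal(G.shape)
        rgrad = G - X @ (X.T @ G)               # horizontal-space projection
        Q, R = np.linalg.qr(X - step * rgrad)   # QR retraction back to the manifold
        X = Q * np.sign(np.sign(np.diag(R)) + 0.5)  # fix column signs for uniqueness
    return X

# Toy usage: recover the leading 2-dimensional eigenspace of a symmetric matrix
# by minimizing f(X) = -trace(X^T A X), whose Euclidean gradient is -2 A X.
n, p = 20, 2
A = np.random.default_rng(0).standard_normal((n, n))
A = A + A.T
X0 = np.linalg.qr(np.random.default_rng(1).standard_normal((n, p)))[0]
X_opt = grassmann_rgd(lambda X: -2 * A @ X, X0, step=0.01, iters=2000, grad_noise=0.05)
```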
Uday Talwar, Meredith K. Kupinski, Afrooz Jalilzadeh
Mathematical Computing Technology, Computer Technology
Uday Talwar, Meredith K. Kupinski, Afrooz Jalilzadeh. Riemannian Inexact Gradient Descent for Quadratic Discrimination [EB/OL]. (2025-07-07) [2025-07-23]. https://arxiv.org/abs/2507.04670.