
Faster Low-Rank Approximation and Kernel Ridge Regression via the Block-Nyström Method


Source: arXiv
Abstract

The Nyström method is a popular low-rank approximation technique for large matrices that arise in kernel methods and convex optimization. Yet, when the data exhibits heavy-tailed spectral decay, the effective dimension of the problem often becomes so large that even the Nyström method may be outside of our computational budget. To address this, we propose Block-Nyström, an algorithm that injects a block-diagonal structure into the Nyström method, thereby significantly reducing its computational cost while recovering strong approximation guarantees. We show that Block-Nyström can be used to construct improved preconditioners for second-order optimization, as well as to efficiently solve kernel ridge regression for statistical learning over Hilbert spaces. Our key technical insight is that, within the same computational budget, combining several smaller Nyström approximations leads to stronger tail estimates of the input spectrum than using one larger approximation. Along the way, we provide a novel recursive preconditioning scheme for efficiently inverting the Block-Nyström matrix, and provide new statistical learning bounds for a broad class of approximate kernel ridge regression solvers.
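To make the idea concrete, here is a minimal NumPy sketch contrasting the standard Nyström approximation with a block-diagonal variant in the spirit described above: the landmark set is split into several blocks, and the single large core pseudo-inverse is replaced by a block-diagonal one built from much smaller pseudo-inverses. This is an illustrative reading of the abstract, not the paper's exact algorithm; the function names and the uniform landmark split are assumptions for the sketch.

```python
import numpy as np

def nystrom(K, idx):
    """Standard Nystrom: K ~ C W^+ C^T, with C = K[:, idx], W = K[idx, idx].

    Inverting the full m x m core W costs O(m^3)."""
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T

def block_nystrom(K, idx, q):
    """Illustrative block variant (an assumption, not the paper's exact scheme):
    split the m landmarks into q blocks and replace the m x m core inverse
    with a block-diagonal inverse, so each block only needs an
    (m/q) x (m/q) pseudo-inverse, i.e. roughly O(m^3 / q^2) total."""
    blocks = np.array_split(np.asarray(idx), q)
    C = K[:, idx]
    m = len(idx)
    core = np.zeros((m, m))
    off = 0
    for b in blocks:
        Wb = K[np.ix_(b, b)]
        core[off:off + len(b), off:off + len(b)] = np.linalg.pinv(Wb)
        off += len(b)
    return C @ core @ C.T
```

With q = 1 the block variant reduces to standard Nyström; larger q trades some approximation quality per landmark for a substantially cheaper core inversion, which is what allows combining several smaller approximations within the same budget.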

Sachin Garg, Michał Dereziński

Subject: Computing Technology, Computer Technology

Sachin Garg, Michał Dereziński. Faster Low-Rank Approximation and Kernel Ridge Regression via the Block-Nyström Method [EB/OL]. (2025-07-19) [2025-08-02]. https://arxiv.org/abs/2506.17556.
