
Low Rank Approximation at Sublinear Cost

Source: arXiv
Abstract

Low Rank Approximation (LRA) of a matrix is a hot research subject, fundamental for Matrix and Tensor Computations and Big Data Mining and Analysis. Computations with low rank matrices can be performed at sublinear cost, that is, by using far fewer floating-point operations (flops) than an input matrix has entries; but can we compute an LRA itself at sublinear cost? This is routinely done in computational practice for a large class of inputs, even though any sublinear cost LRA algorithm fails miserably on worst-case matrices. To provide insight into this apparent contradiction, we first accelerate some popular near-optimal random sketching LRA algorithms so that they run at sublinear cost. Then we define two probabilistic structures on the space of input matrices and estimate that, under both models, the expected spectral and Frobenius error norms of the output LRA of the accelerated algorithms stay within a reasonable factor of their optima, so that these sublinear cost algorithms fail only on a very narrow input class. Our upper estimates for the output accuracy are still quite high, but with some additional semi-heuristic amendments the algorithms have consistently output accurate LRA of various synthetic and real-world matrices in our numerical tests.
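The abstract's central point, that working with a low rank matrix costs far fewer flops than the matrix has entries, can be illustrated with a CUR-type random sampling sketch. The code below is a minimal illustration under stated assumptions, not the paper's accelerated algorithms: the function name sampled_cur_lra and the choice of a pseudoinverse core factor are illustrative. It reads only O((m+n)k) of the m*n entries of A, and, as the abstract notes, such schemes succeed on typical inputs while no sublinear cost scheme can survive worst-case matrices.

import numpy as np

def sampled_cur_lra(A, r, oversample=2, rng=None):
    """Rank-r CUR-type approximation of A built from a random subset of
    k = oversample*r columns and rows, touching only O((m+n)*k) entries
    of A rather than all m*n, hence "sublinear cost" for k << min(m, n).
    Returns factors C, U, R with A ~ C @ U @ R."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    k = min(oversample * r, m, n)
    cols = rng.choice(n, size=k, replace=False)
    rows = rng.choice(m, size=k, replace=False)
    C = A[:, cols]             # m x k sampled columns
    R = A[rows, :]             # k x n sampled rows
    W = A[np.ix_(rows, cols)]  # k x k intersection ("generator") block
    U = np.linalg.pinv(W)      # cheap k x k pseudoinverse as the core factor
    return C, U, R

# Usage on a synthetic matrix of exact rank r:
m, n, r = 500, 400, 10
gen = np.random.default_rng(0)
A = gen.standard_normal((m, r)) @ gen.standard_normal((r, n))
C, U, R = sampled_cur_lra(A, r, rng=1)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
print(f"relative Frobenius error: {err:.2e}")

For a matrix of exact rank r whose sampled row and column blocks have full rank r (which holds almost surely for Gaussian factors), this recovers A exactly; on merely numerically low rank inputs the error depends on the sample, which is why the paper studies the expected error under probabilistic models of the input.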

Qi Luan, Victor Y. Pan, John Svadlenka, Liang Zhao

Subjects: Computing Technology, Computer Technology

Qi Luan, Victor Y. Pan, John Svadlenka, Liang Zhao. Low Rank Approximation at Sublinear Cost [EB/OL]. (2025-07-20) [2025-08-05]. https://arxiv.org/abs/1906.04327.
