Random feature approximation for general spectral methods
Random feature approximation is arguably one of the most widely used techniques for kernel methods in large-scale learning algorithms. In this work, we analyze the generalization properties of random feature methods, extending previous results for Tikhonov regularization to a broad class of spectral regularization techniques. This includes not only explicit methods but also implicit schemes such as gradient descent and accelerated algorithms like the Heavy-Ball and Nesterov method. Through this framework, we enable a theoretical analysis of neural networks and neural operators through the lens of the Neural Tangent Kernel (NTK) approach trained via gradient descent. For our estimators we obtain optimal learning rates over regularity classes (even for classes that are not included in the reproducing kernel Hilbert space), which are defined through appropriate source conditions. This improves or completes previous results obtained in related settings for specific kernel algorithms.
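As a generic illustration of the technique analyzed in the abstract (not code from the paper), random Fourier features replace an exact kernel with inner products of low-dimensional random feature maps; the sketch below approximates a Gaussian kernel, with the bandwidth `gamma` and feature count chosen arbitrarily for the example.

```python
import numpy as np

def random_fourier_features(X, num_features, gamma, rng):
    """Map data X of shape (n, d) to random Fourier features whose inner
    products approximate the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    n, d = X.shape
    # Frequencies drawn from the kernel's spectral measure (a Gaussian here)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    # Random phases make a single cosine an unbiased feature
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, num_features=2000, gamma=0.5, rng=rng)

# Feature inner products approximate the exact kernel Gram matrix
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))
print(np.max(np.abs(K_approx - K_exact)))  # Monte Carlo error, shrinks as features grow
```

Any spectral regularization method (Tikhonov, gradient descent, Heavy-Ball, Nesterov) can then be run on the feature matrix `Z` instead of the full kernel matrix, which is the large-scale setting the paper's generalization bounds address.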
Mike Nguyen, Nicole Mücke
Computing technology; computer technology
Mike Nguyen, Nicole Mücke. Random feature approximation for general spectral methods [EB/OL]. (2025-06-19) [2025-07-16]. https://arxiv.org/abs/2506.16283.