Gradient flow in the kernel learning problem
This is a sequel to our paper `On the kernel learning problem'. We identify a canonical choice of Riemannian gradient flow for finding the stationary points of the kernel learning problem. In the presence of Gaussian noise variables, this flow enjoys the remarkable property of admitting a continuous family of Lyapunov functionals, which we interpret as the automatic reduction of noise. PS. We include an extensive discussion in the postscript explaining the comparison with 2-layer neural networks. Readers looking for additional motivation are encouraged to read the postscript immediately following the introduction.
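As a brief reminder of the terminology (this sketch is not taken from the paper; the symbols $F$, $V$, and $\theta$ are illustrative placeholders), a gradient flow is the ODE obtained by following the negative gradient of an objective, and a Lyapunov functional is any quantity that is non-increasing along its trajectories:
\[
\dot\theta(t) = -\nabla F(\theta(t)),
\qquad
\frac{d}{dt}\, V(\theta(t)) \;=\; \big\langle \nabla V(\theta(t)),\, \dot\theta(t) \big\rangle \;\le\; 0 .
\]
In the Riemannian setting the gradient is taken with respect to a chosen metric on the parameter space; the paper's contribution concerns a canonical such choice for the kernel learning problem.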
Yang Li, Feng Ruan
Computing Technology, Computer Technology
Yang Li, Feng Ruan. Gradient flow in the kernel learning problem [EB/OL]. (2025-06-10) [2025-06-27]. https://arxiv.org/abs/2506.08550.