Learning single-index models via harmonic decomposition
We study the problem of learning single-index models, where the label $y \in \mathbb{R}$ depends on the input $\boldsymbol{x} \in \mathbb{R}^d$ only through an unknown one-dimensional projection $\langle \boldsymbol{w}_*,\boldsymbol{x}\rangle$. Prior work has shown that under Gaussian inputs, the statistical and computational complexity of recovering $\boldsymbol{w}_*$ is governed by the Hermite expansion of the link function. In this paper, we propose a new perspective: we argue that "spherical harmonics" -- rather than "Hermite polynomials" -- provide the natural basis for this problem, as they capture its intrinsic "rotational symmetry". Building on this insight, we characterize the complexity of learning single-index models under arbitrary spherically symmetric input distributions. We introduce two families of estimators -- based on tensor unfolding and online SGD -- that respectively achieve either optimal sample complexity or optimal runtime, and argue that estimators achieving both may not exist in general. When specialized to Gaussian inputs, our theory not only recovers and clarifies existing results but also reveals new phenomena that had previously been overlooked.
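To make the setup concrete, here is a minimal illustrative sketch (not the paper's estimators): it simulates a single-index model $y = f(\langle \boldsymbol{w}_*, \boldsymbol{x}\rangle) + \text{noise}$ with Gaussian inputs and recovers the direction with a simple second-moment (spectral) estimator, which succeeds when the link's second Hermite coefficient is nonzero. The link function and all parameters below are hypothetical choices made for illustration.

```python
# Illustrative sketch only: single-index data with Gaussian inputs and a
# simple second-moment (spectral) estimator of the hidden direction w_*.
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 20_000

w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)

f = lambda t: t**2 + 0.5 * t          # hypothetical link function
X = rng.standard_normal((n, d))       # Gaussian (hence spherically symmetric) inputs
y = f(X @ w_star) + 0.1 * rng.standard_normal(n)

# E[y (x x^T - I)] is proportional to w_* w_*^T when the second Hermite
# coefficient of f is nonzero, so its top eigenvector estimates w_* up to sign.
M = (X.T * y) @ X / n - y.mean() * np.eye(d)
eigvals, eigvecs = np.linalg.eigh(M)
w_hat = eigvecs[:, np.argmax(np.abs(eigvals))]

print("alignment |<w_hat, w_*>| =", abs(w_hat @ w_star))
```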
Nirmit Joshi, Hugo Koubbi, Theodor Misiakiewicz, Nathan Srebro
Computing technology, computer technology
Nirmit Joshi, Hugo Koubbi, Theodor Misiakiewicz, Nathan Srebro. Learning single-index models via harmonic decomposition [EB/OL]. (2025-06-11) [2025-07-01]. https://arxiv.org/abs/2506.09887.