Multiclass Loss Geometry Matters for Generalization of Gradient Descent in Separable Classification
We study the generalization performance of unregularized gradient methods for separable linear classification. While previous work mostly deals with the binary case, we focus on the multiclass setting with $k$ classes and establish novel population risk bounds for Gradient Descent with loss functions that decay to zero. In this setting, we show risk bounds revealing that convergence rates are crucially influenced by the geometry of the loss template, as formalized by Wang and Scott (2024), rather than by the loss function itself. In particular, we establish risk upper bounds that hold for any decay rate of the loss whose template is smooth with respect to the $p$-norm. In the case of exponentially decaying losses, our results indicate a contrast between the $p=\infty$ case, where the risk exhibits a logarithmic dependence on $k$, and the $p=2$ case, where the risk scales linearly with $k$. To establish this separation formally, we also prove a lower bound in the latter scenario, demonstrating that the polynomial dependence on $k$ is unavoidable. Central to our analysis is a novel bound on the Rademacher complexity of low-noise vector-valued linear predictors with a loss template smooth w.r.t.~general $p$-norms.
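As a point of reference, the display below sketches one common way a loss-template framework in the style of Wang and Scott (2024) is instantiated; the specific template and smoothness statement are illustrative assumptions, not claims taken from the paper itself.

% Illustrative sketch (assumed notation): a multiclass margin loss written as a fixed
% template $\psi$ applied to the relative margins of a linear predictor
% $W = (w_1,\dots,w_k) \in \mathbb{R}^{k \times d}$ on an example $(x,y)$.
\[
  \ell\bigl(W;(x,y)\bigr) \;=\; \psi(u),
  \qquad
  u_j \;=\; \langle w_j - w_y,\, x \rangle \quad \text{for } j \neq y.
\]
% Example template: the multinomial logistic (cross-entropy) loss corresponds to
% $\psi(u) = \log\bigl(1 + \sum_{j \neq y} e^{u_j}\bigr)$, which decays exponentially
% in the margins and is smooth with respect to the $\infty$-norm; templates that are
% smooth only with respect to the $2$-norm fall into the regime where the
% linear-in-$k$ rates described in the abstract apply.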
Matan Schliserman, Tomer Koren
Computing Technology, Computer Technology
Matan Schliserman, Tomer Koren. Multiclass Loss Geometry Matters for Generalization of Gradient Descent in Separable Classification [EB/OL]. (2025-05-28) [2025-06-06]. https://arxiv.org/abs/2505.22359.