Asymptotic Distribution of Low-Dimensional Patterns Induced by Non-Differentiable Regularizers under General Loss Functions
This article investigates the asymptotic distribution of penalized estimators with non-differentiable penalties designed to recover low-dimensional structures; that is, subspaces in which the true parameter lies. We study the asymptotic distribution of the scaled estimation error in the regime where the parameter dimension p is fixed and the number of observations n tends to infinity. Our focus is on the asymptotic probability of pattern recovery, a question not addressed by classical results for the LASSO. In our recent work, we derived such results for the LASSO and broader classes of penalties, including non-separable ones such as SLOPE, within the standard linear model. We now extend this analysis to general loss functions, including, for example, robust regression with Huber and quantile loss functions, as well as generalized linear models from the exponential family. The main contribution of the paper is the development of an asymptotic framework for pattern convergence of regularized M-estimators under general loss functions that satisfy a suitable stochastic differentiability condition. The proofs rely on tools from empirical process theory, including Donsker classes and VC-dimension techniques.
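As an informal illustration (the notation below is ours and is not taken verbatim from the paper), the estimators under study can be viewed as regularized M-estimators of the form

$$
\widehat{\beta}_n \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \; \frac{1}{n}\sum_{i=1}^{n} \rho\bigl(y_i, x_i^{\top}\beta\bigr) \;+\; \lambda_n\, \mathrm{pen}(\beta),
$$

where $\rho$ is a general, possibly non-smooth loss (e.g., Huber, quantile, or a GLM negative log-likelihood) and $\mathrm{pen}$ is a non-differentiable penalty such as the LASSO or SLOPE norm. With $p$ fixed and $n \to \infty$, the objects of interest are the limiting distribution of the scaled error $\sqrt{n}\,(\widehat{\beta}_n - \beta_0)$ and the limiting probability that $\widehat{\beta}_n$ recovers the low-dimensional pattern of the true parameter $\beta_0$.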
Ivan Hejny, Jonas Wallin, Małgorzata Bogdan
Subjects: Mathematical and computational techniques; computer science
Ivan Hejny, Jonas Wallin, Małgorzata Bogdan. Asymptotic Distribution of Low-Dimensional Patterns Induced by Non-Differentiable Regularizers under General Loss Functions [EB/OL]. (2025-06-14) [2025-07-16]. https://arxiv.org/abs/2506.12621.