Kullback-Leibler excess risk bounds for exponential weighted aggregation in Generalized linear models
Aggregation methods have emerged as a powerful and flexible framework in statistical learning, providing unified solutions across diverse problems such as regression, classification, and density estimation. In the context of generalized linear models (GLMs), where responses follow exponential family distributions, aggregation offers an attractive alternative to classical parametric modeling. This paper investigates the problem of sparse aggregation in GLMs, aiming to approximate the true parameter vector by a sparse linear combination of predictors. We prove that an exponential weighted aggregation scheme yields a sharp oracle inequality for the Kullback-Leibler risk with leading constant equal to one, while also attaining the minimax-optimal rate of aggregation. These results are further enhanced by establishing high-probability bounds on the excess risk.
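To make the aggregation scheme concrete, the following is a minimal illustrative sketch of exponential weighted aggregation (EWA) for a Bernoulli (logistic) GLM. All names, the temperature parameter `beta`, and the uniform prior are assumptions for illustration only; the paper's actual estimator, prior, and tuning are defined in the text, not here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neg_log_likelihood(theta, X, y):
    # Negative log-likelihood of a Bernoulli GLM (logistic regression).
    p = sigmoid(X @ theta)
    eps = 1e-12  # numerical guard against log(0)
    return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def ewa(candidates, X, y, beta=1.0, prior=None):
    """Exponential weighted aggregation: weight each candidate parameter
    vector by prior_j * exp(-beta * loss_j), then average.  This is a
    generic EWA sketch, not the paper's exact procedure."""
    losses = np.array([neg_log_likelihood(th, X, y) for th in candidates])
    if prior is None:
        prior = np.full(len(candidates), 1.0 / len(candidates))
    # Subtract the minimum loss before exponentiating for stability.
    log_w = np.log(prior) - beta * (losses - losses.min())
    w = np.exp(log_w)
    w /= w.sum()
    theta_hat = sum(wj * th for wj, th in zip(w, candidates))
    return theta_hat, w

# Toy usage: a sparse true parameter and three candidate estimators.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
theta_star = np.array([1.5, 0.0, -1.0])  # sparse ground truth (assumed)
y = rng.binomial(1, sigmoid(X @ theta_star))
candidates = [theta_star, np.zeros(3), rng.normal(size=3)]
theta_hat, w = ewa(candidates, X, y, beta=0.1)
```

Candidates with smaller empirical negative log-likelihood receive exponentially larger weight, so the aggregate concentrates on well-fitting (here, sparse) parameter vectors; the temperature `beta` controls how sharply the weights concentrate.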
The Tien Mai
Mathematics
The Tien Mai. Kullback-Leibler excess risk bounds for exponential weighted aggregation in Generalized linear models [EB/OL]. (2025-04-14) [2025-04-26]. https://arxiv.org/abs/2504.10171.