GLAMP: An Approximate Message Passing Framework for Transfer Learning with Applications to Lasso-based Estimators
Approximate Message Passing (AMP) algorithms enable precise characterization of certain classes of random objects in the high-dimensional limit, and have found widespread applications in fields such as statistics, deep learning, genetics, and communications. However, existing AMP frameworks cannot simultaneously handle matrix-valued iterates and non-separable denoising functions, which prevents them from precisely characterizing estimators that draw information from multiple data sources with distribution shifts. In this work, we introduce Generalized Long Approximate Message Passing (GLAMP), a novel extension of AMP that addresses this limitation, and we rigorously prove its state evolution. GLAMP significantly broadens the scope of AMP, enabling the analysis of transfer learning estimators that were previously out of reach. We demonstrate the utility of GLAMP by precisely characterizing the risk of three Lasso-based transfer learning estimators: the Stacked Lasso, the Model Averaging Estimator, and the Second Step Estimator. Extensive simulations confirm the remarkable finite-sample accuracy of our theory.
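To make the setting concrete, below is a minimal sketch of the classical AMP recursion that frameworks like GLAMP generalize: the Lasso with a separable soft-threshold denoiser, a single i.i.d. Gaussian design, and a vector-valued iterate. All names and parameters here (amp_lasso, theta, n_iter) are illustrative assumptions, not from the paper; GLAMP's matrix-valued iterates and non-separable denoisers go beyond this template.

```python
import numpy as np

def soft_threshold(v, theta):
    """Separable soft-threshold denoiser eta(v; theta) = sign(v) * max(|v| - theta, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_lasso(A, y, theta, n_iter=30):
    """Classical AMP for the Lasso (fixed threshold theta). State evolution
    predicts that v = x + A.T @ z behaves like the signal plus i.i.d.
    Gaussian noise whose variance follows a scalar recursion."""
    n, p = A.shape
    delta = n / p                      # undersampling ratio
    x = np.zeros(p)
    z = y.copy()
    for _ in range(n_iter):
        v = x + A.T @ z                # effective observation
        x = soft_threshold(v, theta)
        # Onsager correction: (1/delta) * average derivative of the denoiser,
        # eta'(v) = 1{|v| > theta}, which equals the fraction of nonzeros in x.
        onsager = (z / delta) * np.mean(np.abs(x) > 0)
        z = y - A @ x + onsager        # residual with memory term
    return x

# Usage on synthetic data (Gaussian design with variance 1/n):
rng = np.random.default_rng(0)
n, p = 500, 1000
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, p))
x_star = np.zeros(p)
x_star[:50] = rng.normal(size=50)
y = A @ x_star + 0.1 * rng.normal(size=n)
x_hat = amp_lasso(A, y, theta=0.3)
print(f"MSE: {np.mean((x_hat - x_star) ** 2):.4f}")
```

A transfer learning estimator such as the Stacked Lasso pools several such designs with shifted signal distributions, which is exactly the regime where the single-source, separable recursion above no longer applies and a GLAMP-type analysis is needed.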
Longlin Wang, Yanke Song, Kuanhao Jiang, Pragya Sur
Subject: Computing Technology; Computer Technology
Longlin Wang, Yanke Song, Kuanhao Jiang, Pragya Sur. GLAMP: An Approximate Message Passing Framework for Transfer Learning with Applications to Lasso-based Estimators [EB/OL]. (2025-05-28) [2025-06-12]. https://arxiv.org/abs/2505.22594