
The Importance of Being Lazy: Scaling Limits of Continual Learning

Source: arXiv
Abstract

Despite recent efforts, neural networks still struggle to learn in non-stationary environments, and our understanding of catastrophic forgetting (CF) is far from complete. In this work, we perform a systematic study on the impact of model scale and the degree of feature learning in continual learning. We reconcile existing contradictory observations on scale in the literature, by differentiating between lazy and rich training regimes through a variable parameterization of the architecture. We show that increasing model width is only beneficial when it reduces the amount of feature learning, yielding more laziness. Using the framework of dynamical mean field theory, we then study the infinite width dynamics of the model in the feature learning regime and characterize CF, extending prior theoretical results limited to the lazy regime. We study the intricate relationship between feature learning, task non-stationarity, and forgetting, finding that high feature learning is only beneficial with highly similar tasks. We identify a transition modulated by task similarity where the model exits an effectively lazy regime with low forgetting to enter a rich regime with significant forgetting. Finally, our findings reveal that neural networks achieve optimal performance at a critical level of feature learning, which depends on task non-stationarity and transfers across model scales. This work provides a unified perspective on the role of scale and feature learning in continual learning.
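
The abstract's distinction between lazy and rich training regimes via a "variable parameterization of the architecture" can be illustrated with the output-rescaling trick from the lazy-training literature (Chizat et al.): scale the model output by a factor alpha and the learning rate by 1/alpha^2, and feature learning is suppressed as alpha grows. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea, not the paper's actual parameterization (which is analyzed through dynamical mean field theory); the function name, toy data, width, and the weight-movement metric are illustrative assumptions.

import torch
import torch.nn as nn

def relative_feature_movement(alpha: float, width: int = 512, steps: int = 500) -> float:
    """Train f_alpha(x) = alpha * (h(x; w) - h(x; w0)) on a toy regression task
    and return the relative change of the hidden-layer weights, a crude proxy
    for the amount of feature learning (smaller change = lazier training)."""
    torch.manual_seed(0)
    X = torch.randn(64, 10)
    y = torch.sin(X[:, :1])  # toy target, purely illustrative

    net = nn.Sequential(nn.Linear(10, width), nn.Tanh(), nn.Linear(width, 1))
    w_init = net[0].weight.detach().clone()   # hidden-layer weights at initialization
    with torch.no_grad():
        f_init = net(X).clone()               # subtract the output at init (standard lazy-training trick)

    # Learning rate scaled by 1/alpha^2 so the function-space dynamics stay comparable across alphas.
    opt = torch.optim.SGD(net.parameters(), lr=0.01 / alpha**2)
    for _ in range(steps):
        opt.zero_grad()
        pred = alpha * (net(X) - f_init)
        loss = ((pred - y) ** 2).mean()
        loss.backward()
        opt.step()

    return ((net[0].weight - w_init).norm() / w_init.norm()).item()

if __name__ == "__main__":
    for alpha in (1.0, 4.0, 16.0, 64.0):
        # Larger alpha -> smaller hidden-weight movement -> effectively lazier dynamics.
        print(f"alpha={alpha:>5}: relative hidden-weight change = {relative_feature_movement(alpha):.4f}")

Running the sketch prints a monotonically shrinking weight-movement proxy as alpha grows, which is the qualitative lazy-versus-rich transition the abstract refers to; the paper studies how this degree of feature learning interacts with task similarity and forgetting.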

Jacopo Graldi, Alessandro Breccia, Giulia Lanzillotta, Thomas Hofmann, Lorenzo Noci

Computing Technology, Computer Technology

Jacopo Graldi, Alessandro Breccia, Giulia Lanzillotta, Thomas Hofmann, Lorenzo Noci. The Importance of Being Lazy: Scaling Limits of Continual Learning [EB/OL]. (2025-06-20) [2025-07-01]. https://arxiv.org/abs/2506.16884.
