SEAL: Searching Expandable Architectures for Incremental Learning
Incremental learning is a machine learning paradigm where a model learns from a sequential stream of tasks. This setting poses a key challenge: balancing plasticity (learning new tasks) and stability (preserving past knowledge). Neural Architecture Search (NAS), a branch of AutoML, automates the design of Deep Neural Network architectures and has shown success in static settings. However, existing NAS-based approaches to incremental learning often rely on expanding the model at every task, making them impractical in resource-constrained environments. In this work, we introduce SEAL, a NAS-based framework tailored for data-incremental learning, a scenario where disjoint data samples arrive sequentially and are not stored for future access. SEAL adapts the model structure dynamically, expanding it only when necessary as determined by a capacity estimation metric. Stability is preserved through cross-distillation training after each expansion step. The NAS component jointly searches for both the architecture and the optimal expansion policy. Experiments across multiple benchmarks demonstrate that SEAL effectively reduces forgetting and improves accuracy while maintaining a smaller model size than prior methods. These results highlight the promise of combining NAS and selective expansion for efficient, adaptive learning in incremental scenarios.
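The abstract describes a loop in which the model is expanded only when a capacity metric says it is needed, with a cross-distillation term preserving past knowledge after each expansion. The sketch below illustrates that idea in PyTorch; the capacity test, the layer-widening operator, and the loss weights are all illustrative assumptions (in SEAL, the architecture and expansion policy come from the NAS search, which is not reproduced here).

```python
# Minimal, hypothetical sketch of a capacity-triggered expansion loop with
# cross-distillation. All thresholds and operators are assumptions for
# illustration, not the paper's actual implementation.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_model(width: int = 32) -> nn.Sequential:
    return nn.Sequential(nn.Linear(16, width), nn.ReLU(), nn.Linear(width, 4))

def needs_capacity(model: nn.Module, loader) -> bool:
    """Placeholder capacity estimate: expand when the current model's loss on
    the incoming data chunk stays high, i.e. it seems unable to fit the data."""
    model.eval()
    losses = []
    with torch.no_grad():
        for x, y in loader:
            losses.append(F.cross_entropy(model(x), y).item())
    return sum(losses) / len(losses) > 1.0  # assumed threshold

def expand(model: nn.Sequential) -> nn.Sequential:
    """Toy expansion operator: widen the hidden layer, copying old weights so
    the expanded model starts as a function-preserving copy of the old one."""
    w = model[0].out_features
    new = make_model(width=w * 2)
    with torch.no_grad():
        new[0].weight[:w] = model[0].weight
        new[0].bias[:w] = model[0].bias
        new[2].weight[:, :w] = model[2].weight
        new[2].weight[:, w:] = 0.0  # new units initially contribute nothing
        new[2].bias.copy_(model[2].bias)
    return new

def train_chunk(model, loader, teacher=None, alpha=0.5, T=2.0, epochs=3):
    """One training pass; adds a distillation term against the frozen
    pre-expansion teacher to preserve past knowledge."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            logits = model(x)
            loss = F.cross_entropy(logits, y)
            if teacher is not None:
                with torch.no_grad():
                    t_logits = teacher(x)
                kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                              F.softmax(t_logits / T, dim=1),
                              reduction="batchmean") * T * T
                loss = (1 - alpha) * loss + alpha * kd
            opt.zero_grad(); loss.backward(); opt.step()

# Data-incremental stream: disjoint chunks arrive once and are not stored.
model = make_model()
for step in range(5):
    x, y = torch.randn(128, 16), torch.randint(0, 4, (128,))
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(x, y), batch_size=32)
    if needs_capacity(model, loader):
        teacher = copy.deepcopy(model).eval()  # pre-expansion snapshot
        model = expand(model)
        train_chunk(model, loader, teacher=teacher)
    else:
        train_chunk(model, loader)
```

In this sketch, expansion is skipped whenever the current architecture still fits the incoming chunk, which is the selective-expansion behavior the abstract credits for keeping the model small.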
Matteo Gambella, Vicente Javier Castro Solar, Manuel Roveri
Computing technology, computer technology
Matteo Gambella, Vicente Javier Castro Solar, Manuel Roveri. SEAL: Searching Expandable Architectures for Incremental Learning [EB/OL]. (2025-05-15) [2025-06-06]. https://arxiv.org/abs/2505.10457.