Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning
Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Despite the strong performance of Pre-Trained Models (PTMs) in CIL, a critical issue persists: learning new classes often overwrites old ones. Excessive modification of the network causes forgetting, while minimal adjustment leads to an inadequate fit for new classes. Consequently, an efficient way to update the model without harming former knowledge is needed. In this paper, we propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL. To enable model updating without conflict, we train a distinct lightweight adapter module for each new task, creating task-specific subspaces. These adapters span a high-dimensional feature space, enabling joint decision-making across multiple subspaces. As data evolves, the expanding subspaces render the old-class classifiers incompatible with new-stage feature spaces. Accordingly, we design a semantic-guided prototype complement strategy that synthesizes old classes' features in the new subspaces without using any old-class instance. Extensive experiments on seven benchmark datasets verify EASE's state-of-the-art performance. Code is available at: https://github.com/sun-hailong/CVPR24-Ease
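To make the two mechanisms described in the abstract concrete, below is a minimal, illustrative PyTorch-style sketch reconstructed only from the description above; it is not the authors' implementation. All names (Adapter, EaseModel, synthesize_old_prototypes), the bottleneck design, and the similarity-weighted mixture are assumptions for illustration.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    # Lightweight bottleneck adapter spanning one task-specific subspace
    # (hypothetical design; the paper's adapter details may differ).
    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual bottleneck: adapts frozen PTM features to the new task.
        return x + self.up(self.act(self.down(x)))

class EaseModel(nn.Module):
    # Frozen pre-trained backbone plus one adapter per task; per-task
    # features are concatenated into one high-dimensional representation.
    def __init__(self, backbone: nn.Module, dim: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad_(False)  # the PTM stays frozen
        self.adapters = nn.ModuleList()
        self.dim = dim

    def add_task(self) -> None:
        # A distinct lightweight adapter is created for each new task.
        self.adapters.append(Adapter(self.dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        base = self.backbone(x)
        # Concatenating the task-specific subspaces enables joint
        # decision-making across all tasks seen so far.
        return torch.cat([ad(base) for ad in self.adapters], dim=-1)

def synthesize_old_prototypes(new_protos_new_space: torch.Tensor,
                              sim_old_to_new: torch.Tensor) -> torch.Tensor:
    # Semantic-guided prototype complement (sketch): reconstruct old-class
    # prototypes in a new subspace as a similarity-weighted mixture of
    # new-class prototypes, using no old-class instances.
    #   new_protos_new_space: [num_new_classes, new_dim]
    #   sim_old_to_new: [num_old_classes, num_new_classes], class-wise
    #   semantic similarity measured in a shared (old) subspace.
    weights = torch.softmax(sim_old_to_new, dim=-1)
    return weights @ new_protos_new_space

# Usage (hypothetical): for each incremental task t, call model.add_task(),
# train only the newest adapter on task t's data, then complete the old-class
# prototypes in the new subspace via synthesize_old_prototypes.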
De-Chuan Zhan, Da-Wei Zhou, Hai-Long Sun, Han-Jia Ye
Computing technology; computer technology
De-Chuan Zhan, Da-Wei Zhou, Hai-Long Sun, Han-Jia Ye. Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning[EB/OL]. (2024-03-18)[2025-08-02]. https://arxiv.org/abs/2403.12030.