
Tripartite Weight-Space Ensemble for Few-Shot Class-Incremental Learning

Source: arXiv
Abstract

Few-shot class-incremental learning (FSCIL) enables the continual learning of new concepts with only a few training examples. In FSCIL, the model undergoes substantial updates, making it prone to forgetting previous concepts and overfitting to the limited new examples. The most recent trend is to disentangle the learning of the representation from the classification head of the model: a well-generalized feature extractor is learned on the base classes (many examples and many classes) and then kept fixed during incremental learning. Arguing that the fixed feature extractor restricts the model's adaptability to new classes, we introduce a novel FSCIL method that effectively addresses the catastrophic forgetting and overfitting issues. Our method seamlessly updates the entire model with only a few examples. At its core is a tripartite weight-space ensemble (Tri-WE), which interpolates the base, immediately previous, and current models in weight space, particularly their classification heads, and thereby collaboratively maintains knowledge from both the base and previous models. In addition, we recognize the difficulty of distilling generalized representations from the previous model with scarce data. Hence, we propose a regularization loss term based on amplified-data knowledge distillation: by simply intermixing the few-shot data, we produce richer data that enables the distillation of critical knowledge from the previous model. Consequently, we attain state-of-the-art results on the miniImageNet, CUB200, and CIFAR100 datasets.
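The two ingredients described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names (`tri_we`, `intermix`), the use of plain Python lists for weights, and the uniform interpolation coefficients are all assumptions for illustration; the abstract does not specify how the interpolation coefficients or mixing ratio are chosen.

```python
def tri_we(base, prev, curr, alphas=(1/3, 1/3, 1/3)):
    """Sketch of a tripartite weight-space ensemble: a convex
    combination of the base, previous, and current models'
    (classification-head) weights.

    base/prev/curr: dicts mapping parameter names to lists of floats.
    alphas: combination coefficients (uniform split is an assumption).
    """
    a, b, c = alphas
    assert abs(a + b + c - 1.0) < 1e-9, "coefficients must sum to 1"
    return {
        name: [a * wb + b * wp + c * wc
               for wb, wp, wc in zip(base[name], prev[name], curr[name])]
        for name in curr
    }


def intermix(x1, x2, lam=0.5):
    """Sketch of 'amplifying' scarce few-shot data by intermixing two
    examples (mixup-style convex combination of inputs); the mixed
    samples would then be fed to the previous model for distillation."""
    return [lam * a + (1.0 - lam) * b for a, b in zip(x1, x2)]


# Toy usage: interpolate three single-layer heads, and mix two inputs.
base = {"head": [0.0, 0.0]}
prev = {"head": [3.0, 3.0]}
curr = {"head": [6.0, 6.0]}
ensembled = tri_we(base, prev, curr)   # uniform average of the three heads
mixed = intermix([0.0, 2.0], [2.0, 0.0])
```

In practice the weights would be tensors (e.g. PyTorch `state_dict` entries) rather than lists, but the interpolation is the same element-wise convex combination.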

Juntae Lee, Munawar Hayat, Sungrack Yun

Computing Technology; Computer Technology

Juntae Lee, Munawar Hayat, Sungrack Yun. Tripartite Weight-Space Ensemble for Few-Shot Class-Incremental Learning [EB/OL]. (2025-06-04) [2025-07-23]. https://arxiv.org/abs/2506.15720.
