Boosting Domain Incremental Learning: Selecting the Optimal Parameters is All You Need

Source: arXiv
Abstract

Deep neural networks (DNNs) often underperform in real-world, dynamic settings where data distributions change over time. Domain Incremental Learning (DIL) offers a solution by enabling continual model adaptation, with Parameter-Isolation DIL (PIDIL) emerging as a promising paradigm for reducing knowledge conflicts. However, existing PIDIL methods struggle with parameter selection accuracy, especially as the number of domains and corresponding classes grows. To address this, we propose SOYO, a lightweight framework that improves domain selection in PIDIL. SOYO introduces a Gaussian Mixture Compressor (GMC) and a Domain Feature Resampler (DFR) to store and balance prior domain data efficiently, while a Multi-level Domain Feature Fusion Network (MDFN) enhances domain feature extraction. Our framework supports multiple Parameter-Efficient Fine-Tuning (PEFT) methods and is validated across tasks such as image classification, object detection, and speech enhancement. Experimental results on six benchmarks demonstrate SOYO's consistent superiority over existing baselines, showcasing its robustness and adaptability in complex, evolving environments. The code will be released at https://github.com/qwangcv/SOYO.
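
The abstract names the Gaussian Mixture Compressor (GMC) and Domain Feature Resampler (DFR) but does not give their implementation. The Python sketch below only illustrates the general idea under stated assumptions: each prior domain's backbone features are compressed into a small diagonal-covariance Gaussian mixture, and an equal number of pseudo-features is then resampled per domain to train a balanced domain selector. The class name DomainFeatureStore, the method names, the component count, and the feature dimensionality are all hypothetical, not the authors' released code.

# A minimal sketch of the GMC/DFR idea described in the abstract;
# all names and hyperparameters below are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

class DomainFeatureStore:
    def __init__(self, n_components=8, samples_per_domain=256):
        self.n_components = n_components            # GMM size per domain (assumed)
        self.samples_per_domain = samples_per_domain
        self.gmms = {}                              # domain_id -> fitted GMM

    def compress(self, domain_id, features):
        # GMC step: replace raw features of shape (N, D) with a small GMM.
        gmm = GaussianMixture(n_components=self.n_components,
                              covariance_type="diag", random_state=0)
        gmm.fit(features)
        self.gmms[domain_id] = gmm

    def resample(self):
        # DFR step: draw an equal number of pseudo-features per stored
        # domain, yielding a balanced training set for the domain selector.
        xs, ys = [], []
        for domain_id, gmm in self.gmms.items():
            samples, _ = gmm.sample(self.samples_per_domain)
            xs.append(samples)
            ys.append(np.full(self.samples_per_domain, domain_id))
        return np.concatenate(xs), np.concatenate(ys)

# Usage: compress each domain's features after training on it, then
# resample balanced pseudo-features before training the domain selector.
store = DomainFeatureStore()
store.compress(0, np.random.randn(1000, 768))   # domain 0 backbone features
store.compress(1, np.random.randn(300, 768))    # domain 1 (fewer samples)
X, y = store.resample()
print(X.shape, y.shape)                          # (512, 768) (512,)

One design point this sketch is meant to surface: a diagonal GMM stores only O(K·D) parameters per domain (means and variances of K components) rather than O(N·D) raw feature vectors, and resampling a fixed number of pseudo-features per domain counteracts imbalance across domains of different sizes, which matches the abstract's claim of storing and balancing prior domain data efficiently.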

Qiang Wang, Xiang Song, Yuhang He, Jizhou Han, Chenhao Ding, Xinyuan Gao, Yihong Gong

Subject: Computing Technology, Computer Technology

Qiang Wang, Xiang Song, Yuhang He, Jizhou Han, Chenhao Ding, Xinyuan Gao, Yihong Gong. Boosting Domain Incremental Learning: Selecting the Optimal Parameters is All You Need [EB/OL]. (2025-05-29) [2025-06-14]. https://arxiv.org/abs/2505.23744.
