LODAP: On-Device Incremental Learning Via Lightweight Operations and Data Pruning
Incremental learning, which learns new classes over time after a model's deployment, is becoming increasingly crucial, particularly for industrial edge systems, where it is difficult to communicate with a remote server to conduct computation-intensive learning. As edge devices are expected to learn more and more classes after deployment, efficient on-device learning is required. In this paper, we propose LODAP, a new on-device incremental learning framework for edge systems. The key component of LODAP is a new module, the Efficient Incremental Module (EIM), which is composed of normal convolutions and lightweight operations. During incremental learning, EIM exploits lightweight operations, called adapters, to effectively and efficiently learn features for new classes, improving the accuracy of incremental learning while reducing model complexity and training overhead. The efficiency of LODAP is further enhanced by a data pruning strategy that significantly reduces the training data and thereby lowers the training overhead. We conducted extensive experiments on the CIFAR-100 and Tiny-ImageNet datasets. Experimental results show that LODAP improves accuracy by up to 4.32% over existing methods while reducing model complexity by around 50%. In addition, evaluations on real edge systems demonstrate its applicability to on-device machine learning. The code is available at https://github.com/duanbiqing/LODAP.
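The abstract only describes the EIM at a high level (a normal convolution paired with lightweight adapter operations that are trained for new classes). The following PyTorch sketch is an illustrative reading of that idea, not the paper's actual implementation: the block name, the depthwise-plus-pointwise adapter design, and the choice to freeze the base convolution during incremental steps are all assumptions; the real EIM in LODAP may differ.

```python
import torch
import torch.nn as nn

class EIMBlockSketch(nn.Module):
    """Hypothetical sketch of an EIM-style block: a normal 3x3 convolution
    (frozen after base training) plus a lightweight adapter branch that is
    the only part updated when learning new classes."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.base_conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        # Lightweight adapter: depthwise 3x3 conv followed by a pointwise 1x1 conv
        # (an assumption; the abstract does not specify the adapter structure).
        self.adapter = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1, groups=in_ch),
            nn.Conv2d(in_ch, out_ch, kernel_size=1),
        )
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def freeze_base(self):
        """Freeze the normal convolution so only the adapter learns new-class features."""
        for p in self.base_conv.parameters():
            p.requires_grad = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Combine frozen base features with the adapter's new-class features.
        return self.act(self.bn(self.base_conv(x) + self.adapter(x)))


if __name__ == "__main__":
    block = EIMBlockSketch(64, 64)
    block.freeze_base()
    y = block(torch.randn(2, 64, 32, 32))
    print(y.shape)  # torch.Size([2, 64, 32, 32])
```

Under these assumptions, the incremental-learning step would update only the adapter (and classifier) parameters, which is what keeps the per-update training cost low on edge devices.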
Biqing Duan, Qing Wang, Di Liu, Wei Zhou, Zhenli He, Shengfa Miao
Computing Technology, Computer Technology
Biqing Duan, Qing Wang, Di Liu, Wei Zhou, Zhenli He, Shengfa Miao. LODAP: On-Device Incremental Learning Via Lightweight Operations and Data Pruning [EB/OL]. (2025-04-28) [2025-05-05]. https://arxiv.org/abs/2504.19638.