GPS: Distilling Compact Memories via Grid-based Patch Sampling for Efficient Online Class-Incremental Learning

Source: arXiv
Abstract

Online class-incremental learning aims to enable models to continuously adapt to new classes with limited access to past data, while mitigating catastrophic forgetting. Replay-based methods address this by maintaining a small memory buffer of previous samples, achieving competitive performance. For effective replay under constrained storage, recent approaches leverage distilled data to enhance the informativeness of memory. However, such approaches often involve significant computational overhead due to the use of bi-level optimization. Motivated by these limitations, we introduce Grid-based Patch Sampling (GPS), a lightweight and effective strategy for distilling informative memory samples without relying on a trainable model. GPS generates informative samples by sampling a subset of pixels from the original image, yielding compact low-resolution representations that preserve both semantic content and structural information. During replay, these representations are reassembled to support training and evaluation. Extensive experiments on multiple benchmarks demonstrate that GPS can be seamlessly integrated into existing replay frameworks, leading to 3%-4% improvements in average end accuracy under memory-constrained settings, with limited computational overhead.
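To make the sampling idea in the abstract concrete, below is a minimal NumPy sketch of grid-based pixel subsampling and the reassembly step used at replay time. The function names, the fixed stride, and the nearest-neighbour upsampling are illustrative assumptions based only on the abstract's description, not the authors' actual implementation.

```python
import numpy as np

def grid_sample(image: np.ndarray, stride: int) -> np.ndarray:
    """Keep every `stride`-th pixel along each spatial axis,
    yielding a compact low-resolution copy to store in the buffer."""
    return image[::stride, ::stride, :]

def reassemble(compact: np.ndarray, out_hw: tuple) -> np.ndarray:
    """Nearest-neighbour upsampling back to the original resolution
    so the stored sample can be fed to the model during replay."""
    h, w = out_hw
    rows = np.arange(h) * compact.shape[0] // h
    cols = np.arange(w) * compact.shape[1] // w
    return compact[rows[:, None], cols, :]

# Example: a 32x32 CIFAR-style image stored at 1/4 resolution per axis,
# i.e. a 16x reduction in stored pixel count.
img = np.random.rand(32, 32, 3).astype(np.float32)
mem = grid_sample(img, stride=4)           # (8, 8, 3) buffer entry
replayed = reassemble(mem, img.shape[:2])  # (32, 32, 3) for training
print(mem.shape, replayed.shape)
```

Because the sampling is a fixed indexing operation with no trainable model or bi-level optimization, the distillation cost is essentially free relative to optimization-based memory distillation, which is the efficiency argument the abstract makes.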

Mingchuan Ma, Yuhao Zhou, Jindi Lv, Yuxin Tian, Dan Si, Shujian Li, Qing Ye, Jiancheng Lv

Subject: Computing Technology; Computer Technology

Mingchuan Ma, Yuhao Zhou, Jindi Lv, Yuxin Tian, Dan Si, Shujian Li, Qing Ye, Jiancheng Lv. GPS: Distilling Compact Memories via Grid-based Patch Sampling for Efficient Online Class-Incremental Learning [EB/OL]. (2025-04-14) [2025-06-07]. https://arxiv.org/abs/2504.10409.
