National Preprint Platform

Orthogonal Projection Subspace to Aggregate Online Prior-knowledge for Continual Test-time Adaptation

Source: arXiv
Abstract

Continual Test-Time Adaptation (CTTA) is a task that requires a source pre-trained model to continually adapt to new scenarios with changing target distributions. Existing CTTA methods primarily focus on mitigating the challenges of catastrophic forgetting and error accumulation. Though methods based on forgetting adaptation with parameter-efficient fine-tuning have emerged, they still struggle to balance competitive performance with efficient model adaptation, particularly in complex tasks like semantic segmentation. In this paper, to tackle the above issues, we propose a novel pipeline, Orthogonal Projection Subspace to aggregate online Prior-knowledge, dubbed OoPk. Specifically, we first project a tuning subspace orthogonally, which allows the model to adapt to new domains while preserving the knowledge integrity of the pre-trained source model, alleviating catastrophic forgetting. Then, we elaborate an online prior-knowledge aggregation strategy that employs an aggressive yet efficient image masking scheme to mimic potential target dynamism, enhancing the student model's domain adaptability. This gradually ameliorates the teacher model's knowledge, ensuring high-quality pseudo labels and reducing error accumulation. Extensive experiments demonstrate that our method surpasses previous CTTA methods and achieves competitive performance across various continual TTA benchmarks in semantic segmentation tasks.
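The abstract gives no equations or code, so the following is only an illustrative sketch of the two ideas it names: (1) constraining a weight update to the subspace orthogonal to the pre-trained weights, and (2) aggressively masking image patches to simulate target-domain shift for the student model. The function names, the SVD-based projection, and all parameters are assumptions for illustration, not the authors' actual OoPk formulation.

```python
import numpy as np

def orthogonal_update(w0, delta, k=None):
    """Sketch: project the candidate update `delta` onto the orthogonal
    complement of the top-k left singular subspace of the pre-trained
    weight matrix `w0`, so the update disturbs the source-model
    directions as little as possible (hypothetical formulation)."""
    u, s, _ = np.linalg.svd(w0, full_matrices=False)
    if k is None:
        k = int(np.sum(s > 1e-6))  # effective rank of the source weights
    uk = u[:, :k]                  # basis of the "source knowledge" subspace
    return delta - uk @ (uk.T @ delta)

def random_patch_mask(img, patch=4, ratio=0.5, rng=None):
    """Sketch: zero out a random fraction `ratio` of (patch x patch)
    tiles, mimicking an aggressive image-masking augmentation."""
    rng = np.random.default_rng(rng)
    h, w = img.shape[:2]
    keep = rng.random((h // patch, w // patch)) >= ratio
    mask = np.kron(keep, np.ones((patch, patch)))
    return img * mask[..., None] if img.ndim == 3 else img * mask
```

With a low-rank `w0`, the projected update is orthogonal to the column space of `w0` (`w0.T @ orthogonal_update(w0, delta)` is numerically zero), while `random_patch_mask` at `ratio=0.5` blanks roughly half the tiles of the input image before it is fed to the student.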

Jinlong Li, Dong Zhao, Qi Zang, Zequn Jie, Lin Ma, Nicu Sebe

Subject: Computing Technology, Computer Technology

Jinlong Li, Dong Zhao, Qi Zang, Zequn Jie, Lin Ma, Nicu Sebe. Orthogonal Projection Subspace to Aggregate Online Prior-knowledge for Continual Test-time Adaptation [EB/OL]. (2025-06-23) [2025-07-21]. https://arxiv.org/abs/2506.19022.