
Beyond Adapter Retrieval: Latent Geometry-Preserving Composition via Sparse Task Projection

Source: arXiv

Abstract

Recent advances in parameter-efficient transfer learning have demonstrated the utility of composing LoRA adapters from libraries of pretrained modules. However, most existing approaches rely on simple retrieval heuristics or uniform averaging, which overlook the latent structure of task relationships in representation space. We propose a new framework for adapter reuse that moves beyond retrieval, formulating adapter composition as a geometry-aware sparse reconstruction problem. Specifically, we represent each task by a latent prototype vector derived from the base model's encoder and aim to approximate the target task prototype as a sparse linear combination of retrieved reference prototypes, under an $\ell_1$-regularized optimization objective. The resulting combination weights are then used to blend the corresponding LoRA adapters, yielding a composite adapter tailored to the target task. This formulation not only preserves the local geometric structure of the task representation manifold, but also promotes interpretability and efficient reuse by selecting a minimal set of relevant adapters. We demonstrate the effectiveness of our approach across multiple domains, including medical image segmentation, medical report generation, and image synthesis. Our results highlight the benefit of coupling retrieval with latent geometry-aware optimization for improved zero-shot generalization.
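The abstract describes a two-step procedure: solve an $\ell_1$-regularized least-squares problem over task prototypes, then blend the retrieved LoRA adapters with the resulting sparse weights. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes prototypes are fixed feature vectors from the base model's encoder, adapters are stored as dictionaries of weight deltas, and the sparse problem is solved with plain ISTA (proximal gradient); all function and parameter names are illustrative.

```python
import numpy as np

def sparse_projection_weights(target_proto, ref_protos, lam=0.1, steps=1000):
    """Solve min_w 0.5*||t - R^T w||^2 + lam*||w||_1 with proximal gradient (ISTA)."""
    R = np.stack(ref_protos)                 # (K, d): one row per reference-task prototype
    lr = 1.0 / (np.linalg.norm(R, 2) ** 2)   # step size from the Lipschitz constant of the gradient
    w = np.zeros(R.shape[0])
    for _ in range(steps):
        grad = R @ (R.T @ w - target_proto)  # gradient of the quadratic fitting term
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft-thresholding (l1 prox)
    return w

def compose_lora(adapters, weights, tol=1e-8):
    """Blend retrieved LoRA adapters using only the nonzero sparse weights."""
    composite = {}
    for name in adapters[0]:
        composite[name] = sum(
            (w * a[name] for w, a in zip(weights, adapters) if abs(w) > tol),
            np.zeros_like(adapters[0][name]),
        )
    return composite

# Illustrative usage with random stand-ins for encoder prototypes and adapter weights.
d, K = 128, 8
ref_protos = [np.random.randn(d) for _ in range(K)]
target_proto = 0.6 * ref_protos[0] + 0.4 * ref_protos[3]   # target lies near two references
weights = sparse_projection_weights(target_proto, ref_protos)
adapters = [{"layer1.lora_A": np.random.randn(4, d)} for _ in range(K)]
composite_adapter = compose_lora(adapters, weights)
```

Because the $\ell_1$ penalty drives most weights to exactly zero, only a small subset of adapters contributes to the composite, which is the basis of the interpretability and efficient-reuse claims in the abstract.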

Pengfei Jin, Peng Shu, Sifan Song, Sekeun Kim, Qing Xiao, Cheng Chen, Tianming Liu, Xiang Li, Quanzheng Li

Computing Technology, Computer Technology; Medical Research Methods

Pengfei Jin, Peng Shu, Sifan Song, Sekeun Kim, Qing Xiao, Cheng Chen, Tianming Liu, Xiang Li, Quanzheng Li. Beyond Adapter Retrieval: Latent Geometry-Preserving Composition via Sparse Task Projection [EB/OL]. (2025-08-06) [2025-08-23]. https://arxiv.org/abs/2410.09908.
