National Preprint Platform

Revisiting LoRA through the Lens of Parameter Redundancy: Spectral Encoding Helps


Source: arXiv
English Abstract

Low-Rank Adaptation (LoRA) has emerged as a prominent technique for fine-tuning large foundation models. Despite its successes, the substantial parameter redundancy, which limits the capacity and efficiency of LoRA, has been recognized as a bottleneck. In this work, we systematically investigate the impact of redundancy in fine-tuning LoRA and reveal that reducing density redundancy does not degrade expressiveness. Based on this insight, we introduce Spectral-encoding Low-Rank Adaptation (SeLoRA), which harnesses the robust expressiveness of spectral bases to re-parameterize LoRA from a sparse spectral subspace. Designed with simplicity, SeLoRA enables seamless integration with various LoRA variants for performance boosting, serving as a scalable plug-and-play framework. Extensive experiments substantiate that SeLoRA achieves greater efficiency with fewer parameters, delivering superior performance enhancements over strong baselines on various downstream tasks, including commonsense reasoning, math reasoning, and code generation.
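The abstract's core idea — expressing LoRA's low-rank factors through a sparse spectral subspace rather than dense trainable entries — can be sketched as follows. This is a minimal illustration only: the choice of an orthonormal DCT-II basis, the factor shapes, and the name `selora_update` are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def dct_basis(n, k):
    # First k columns of an orthonormal DCT-II basis (a common spectral
    # basis; the paper's actual basis choice is an assumption here).
    i = np.arange(n)[:, None]
    j = np.arange(k)[None, :]
    B = np.cos(np.pi * (i + 0.5) * j / n) * np.sqrt(2.0 / n)
    B[:, 0] /= np.sqrt(2.0)
    return B  # shape (n, k); columns are orthonormal

def lora_update(B_lora, A_lora):
    # Vanilla LoRA: dense low-rank factors, delta_W = B @ A,
    # with (d_out + d_in) * r trainable parameters.
    return B_lora @ A_lora

def selora_update(coeff_B, coeff_A, basis_out, basis_in):
    # Hypothetical spectral re-parameterization: each low-rank factor is
    # generated from a small set of spectral coefficients against a fixed
    # basis, shrinking trainable parameters to (k_out + k_in) * r.
    B_lora = basis_out @ coeff_B   # (d_out, k_out) @ (k_out, r)
    A_lora = coeff_A @ basis_in.T  # (r, k_in) @ (k_in, d_in)
    return B_lora @ A_lora

d_out, d_in, r, k = 64, 64, 8, 16
rng = np.random.default_rng(0)
U = dct_basis(d_out, k)            # fixed (not trained) output-side basis
V = dct_basis(d_in, k)             # fixed input-side basis
cB = rng.normal(size=(k, r)) * 0.01  # trainable spectral coefficients
cA = rng.normal(size=(r, k)) * 0.01
delta_W = selora_update(cB, cA, U, V)
print(delta_W.shape)  # (64, 64)
```

Because the bases are fixed, only the coefficient matrices are trained: here 2 * k * r = 256 parameters versus (d_out + d_in) * r = 1024 for a vanilla LoRA of the same rank, which is the sense in which a sparse spectral subspace reduces redundancy.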

Jiashun Cheng, Aochuan Chen, Nuo Chen, Ziqi Gao, Yuhan Li, Jia Li, Fugee Tsung

Subject: Computing Technology; Computer Technology

Jiashun Cheng, Aochuan Chen, Nuo Chen, Ziqi Gao, Yuhan Li, Jia Li, Fugee Tsung. Revisiting LoRA through the Lens of Parameter Redundancy: Spectral Encoding Helps [EB/OL]. (2025-06-20) [2025-07-21]. https://arxiv.org/abs/2506.16787.
