Joint Tensor-Train Parameterization for Efficient and Expressive Low-Rank Adaptation

Source: arXiv
English Abstract

Low-Rank Adaptation (LoRA) is widely recognized for its parameter-efficient fine-tuning of large-scale neural models. However, standard LoRA optimizes its two low-rank matrices independently, which inherently limits its expressivity and generalization capabilities. While classical tensor-train (TT) decomposition can be applied separately to each LoRA matrix, this work demonstrates that such a TT-based approach neither significantly improves parameter efficiency nor achieves substantial performance gains. This paper proposes TensorGuide, a novel tensor-train-guided adaptation framework that overcomes these limitations. TensorGuide generates two correlated low-rank LoRA matrices through a unified TT structure driven by controlled Gaussian noise. The resulting joint TT representation inherently provides structured, low-rank adaptations, significantly enhancing expressivity, generalization, and parameter efficiency without increasing the number of trainable parameters. Theoretically, we justify these improvements through neural tangent kernel analyses, demonstrating superior optimization dynamics and enhanced generalization. Extensive experiments on quantum dot classification and GPT-2 fine-tuning benchmarks show that TensorGuide-based LoRA consistently outperforms standard LoRA and TT-LoRA, achieving improved accuracy and scalability with fewer parameters.
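The abstract's central mechanism, a single shared tensor-train generator that emits both LoRA factors from a fixed Gaussian noise input, can be illustrated with a short sketch. The following minimal PyTorch code is illustrative only: the class, its parameter names, and the two small linear maps standing in for a contracted chain of TT cores are hypothetical stand-ins under stated assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class JointTTLoRASketch(nn.Module):
    """Illustrative sketch (not the paper's code): one shared generator,
    driven by fixed Gaussian noise, emits both LoRA factors A and B, so
    the two factors are correlated through common parameters."""

    def __init__(self, d_in: int, d_out: int, rank: int,
                 noise_dim: int = 16, hidden: int = 32):
        super().__init__()
        # Fixed (non-trainable) Gaussian noise driving the generator.
        self.register_buffer("z", torch.randn(noise_dim))
        # Two linear contractions stand in for the chain of TT cores;
        # the paper factors this map into explicit TT cores instead.
        self.core1 = nn.Linear(noise_dim, hidden, bias=False)
        self.core2 = nn.Linear(hidden, rank * (d_in + d_out), bias=False)
        self.rank, self.d_in, self.d_out = rank, d_in, d_out

    def delta_weight(self) -> torch.Tensor:
        # A single pass through the shared structure yields both factors.
        flat = self.core2(self.core1(self.z))
        a_end = self.rank * self.d_in
        A = flat[:a_end].view(self.rank, self.d_in)   # rank x d_in
        B = flat[a_end:].view(self.d_out, self.rank)  # d_out x rank
        return B @ A                                  # low-rank update

    def forward(self, x: torch.Tensor, frozen_w: torch.Tensor) -> torch.Tensor:
        # Standard LoRA application: y = x (W + BA)^T with W frozen.
        return x @ (frozen_w + self.delta_weight()).T
```

Because A and B are read out of one shared map, every gradient step updates them jointly; this coupling is the source of the correlation the abstract attributes to the joint TT representation, in contrast to standard LoRA, which trains the two factors independently.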

Jun Qi, Chen-Yu Liu, Sabato Marco Siniscalchi, Chao-Han Huck Yang, Min-Hsiu Hsieh

Subjects: Research methods in the natural sciences; Information science and information technology; Mathematics

Jun Qi, Chen-Yu Liu, Sabato Marco Siniscalchi, Chao-Han Huck Yang, Min-Hsiu Hsieh. Joint Tensor-Train Parameterization for Efficient and Expressive Low-Rank Adaptation [EB/OL]. (2025-06-19) [2025-07-09]. https://arxiv.org/abs/2506.16456
