RaSA: Rank-Sharing Low-Rank Adaptation

Bibliographic Details
Title: RaSA: Rank-Sharing Low-Rank Adaptation
Authors: He, Zhiwei; Tu, Zhaopeng; Wang, Xing; Chen, Xingyu; Wang, Zhijie; Xu, Jiahao; Liang, Tian; Jiao, Wenxiang; Zhang, Zhuosheng; Wang, Rui
Publication Year: 2025
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Abstract: Low-rank adaptation (LoRA) has been prominently employed for parameter-efficient fine-tuning of large language models (LLMs). However, the limited expressive capacity of LoRA, stemming from the low-rank constraint, has been recognized as a bottleneck, particularly in rigorous tasks like code generation and mathematical reasoning. To address this limitation, we introduce Rank-Sharing Low-Rank Adaptation (RaSA), an innovative extension that enhances the expressive capacity of LoRA by leveraging partial rank sharing across layers. By forming a shared rank pool and applying layer-specific weighting, RaSA effectively increases the number of ranks without augmenting parameter overhead. Our theoretically grounded and empirically validated approach demonstrates that RaSA not only maintains the core advantages of LoRA but also significantly boosts performance in challenging code and math tasks. Code, data and scripts are available at: https://github.com/zwhe99/RaSA.
Comment: ICLR 2025
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2503.12576
Accession Number: edsarx.2503.12576
Database: arXiv
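
The following is a minimal, illustrative sketch of the rank-sharing idea described in the abstract: each adapted layer keeps a small private LoRA update and, in addition, applies a layer-specific weighting to a pool of low-rank factors shared across layers, so the effective number of ranks per layer grows without adding per-layer parameters. This is not the authors' implementation (see https://github.com/zwhe99/RaSA for the official code); the class and parameter names (SharedRankPool, RankSharedLoRALinear, private_rank, shared_rank) and the exact weighting scheme are assumptions made for illustration only.

import torch
import torch.nn as nn

class SharedRankPool(nn.Module):
    # Low-rank factors shared by all adapted layers (the "rank pool").
    def __init__(self, in_features, out_features, shared_rank):
        super().__init__()
        self.A_shared = nn.Parameter(torch.randn(shared_rank, in_features) * 0.01)
        self.B_shared = nn.Parameter(torch.zeros(out_features, shared_rank))

class RankSharedLoRALinear(nn.Module):
    # Frozen base layer + layer-private LoRA update + layer-specific
    # weighting over the shared rank pool.
    def __init__(self, base, pool, private_rank, shared_rank, alpha=16.0):
        super().__init__()
        self.base = base
        self.pool = pool
        for p in self.base.parameters():
            p.requires_grad = False
        d_in, d_out = base.in_features, base.out_features
        self.A = nn.Parameter(torch.randn(private_rank, d_in) * 0.01)  # private down-projection
        self.B = nn.Parameter(torch.zeros(d_out, private_rank))        # private up-projection
        self.shared_weight = nn.Parameter(torch.zeros(shared_rank))    # layer-specific weights
        self.scaling = alpha / (private_rank + shared_rank)

    def forward(self, x):
        # Private low-rank update: x -> A -> B.
        private = (x @ self.A.t()) @ self.B.t()
        # Shared update, reweighted per layer along the rank dimension.
        shared = (x @ self.pool.A_shared.t()) * self.shared_weight
        shared = shared @ self.pool.B_shared.t()
        return self.base(x) + self.scaling * (private + shared)

# Usage: one pool, several layers sharing it.
pool = SharedRankPool(in_features=64, out_features=64, shared_rank=8)
layers = [RankSharedLoRALinear(nn.Linear(64, 64), pool, private_rank=4, shared_rank=8)
          for _ in range(4)]
y = layers[0](torch.randn(2, 64))  # -> shape (2, 64)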