SuperLoRA: Parameter-Efficient Unified Adaptation of Large Foundation Models


Xiangyu Chen (University of Kansas), Jing Liu (Mitsubishi Electric Research Labs), Ye Wang (Mitsubishi Electric Research Labs), Pu Perry Wang (Mitsubishi Electric Research Labs), Matthew Brand (Mitsubishi Electric Research Labs), Guanghui Wang (Toronto Metropolitan University), Toshiaki Koike-Akino (Mitsubishi Electric Research Labs)
The 35th British Machine Vision Conference

Abstract

Low-rank adaptation (LoRA) and its variants are widely employed in fine-tuning large models, including large language models for natural language processing and diffusion models for computer vision. This paper proposes SuperLoRA, a generalized framework that unifies and extends different LoRA variants, each of which can be recovered under a particular hyper-parameter setting. By introducing new options with grouping, folding, shuffling, projection, and tensor decomposition, SuperLoRA offers high flexibility and demonstrates superior performance on transfer learning tasks, with up to a 10-fold gain in parameter efficiency.
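To make the grouping and folding ideas concrete, the following is a minimal sketch (not the authors' implementation) of a LoRA-style adapter that concatenates several flattened weight tensors, folds them into one matrix, and learns a single low-rank update for the whole group. The class name GroupedLoRA and the fold_cols parameter are hypothetical illustration choices; only the standard LoRA form delta_W = (alpha / r) * B A is taken from the literature.

import torch
import torch.nn as nn


class GroupedLoRA(nn.Module):
    """Sketch of LoRA-style adaptation with grouping and folding.

    Standard LoRA learns delta_W = (alpha / r) * B @ A for a single
    weight matrix. Here, several flattened weight tensors are
    concatenated, folded into one 2-D matrix, and adapted with a
    single low-rank product, so one adapter serves a whole group of
    layers. Illustrative only.
    """

    def __init__(self, shapes, rank=4, alpha=1.0, fold_cols=128):
        super().__init__()
        self.shapes = list(shapes)
        self.sizes = [h * w for h, w in self.shapes]
        total = sum(self.sizes)
        assert total % fold_cols == 0, "group must fold into a matrix"
        rows = total // fold_cols
        self.scale = alpha / rank
        # Common LoRA init: A small random, B zero, so the update
        # starts at exactly zero.
        self.A = nn.Parameter(torch.randn(rank, fold_cols) * 0.01)
        self.B = nn.Parameter(torch.zeros(rows, rank))

    def deltas(self):
        """Return one low-rank correction per weight tensor in the group."""
        flat = (self.scale * self.B @ self.A).reshape(-1)
        out, offset = [], 0
        for (h, w), n in zip(self.shapes, self.sizes):
            out.append(flat[offset:offset + n].reshape(h, w))
            offset += n
        return out


# Usage: one adapter produces corrections for two layers at once,
# using rank * (rows + fold_cols) parameters instead of a separate
# low-rank pair per layer.
adapter = GroupedLoRA(shapes=[(64, 64), (64, 128)], rank=4, fold_cols=128)
d1, d2 = adapter.deltas()
print(d1.shape, d2.shape)  # torch.Size([64, 64]) torch.Size([64, 128])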

Citation

@inproceedings{Chen_2024_BMVC,
author    = {Xiangyu Chen and Jing Liu and Ye Wang and Pu Perry Wang and Matthew Brand and Guanghui Wang and Toshiaki Koike-Akino},
title     = {SuperLoRA: Parameter-Efficient Unified Adaptation of Large Foundation Models},
booktitle = {35th British Machine Vision Conference 2024, {BMVC} 2024, Glasgow, UK, November 25-28, 2024},
publisher = {BMVA},
year      = {2024},
url       = {https://papers.bmvc2024.org/0566.pdf}
}

