---
license: mit
---

We release all of the checkpoints used in LoRA-Flow, which has been accepted to the ACL 2024 main conference.

## Summary

In this repo, we release the LoRAs and fusion gates of the 7B models trained in our paper, in HuggingFace format.

## Method

The figure below illustrates our proposed method: we use layer-wise fusion gates to enable dynamic LoRA fusion, projecting the input hidden states of each layer into fusion weights.

![Method overview](1.jpg)
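To make the idea concrete, here is a minimal sketch of a layer-wise fusion gate in NumPy. All names (`FusionGate`, `fuse_lora_outputs`) and the exact gate parameterization are illustrative assumptions, not the authors' released code: a single linear projection maps a layer's input hidden state to one weight per LoRA, and those weights combine the LoRA outputs.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

class FusionGate:
    """Hypothetical layer-wise fusion gate (illustrative, not the
    released implementation): projects the layer's input hidden
    state to per-LoRA fusion weights."""

    def __init__(self, hidden_dim, num_loras, seed=0):
        rng = np.random.default_rng(seed)
        # Small random projection from hidden state to LoRA weights.
        self.W = rng.normal(scale=0.02, size=(hidden_dim, num_loras))

    def __call__(self, h):
        # h: hidden state of shape (hidden_dim,)
        # returns: fusion weights of shape (num_loras,), summing to 1
        return softmax(h @ self.W)

def fuse_lora_outputs(h, lora_outputs, gate):
    """Combine the outputs of several LoRA modules at one layer,
    weighted dynamically by the gate's output for this input."""
    weights = gate(h)
    return sum(w * out for w, out in zip(weights, lora_outputs))
```

Because the gate is conditioned on the hidden state, different tokens and layers can receive different mixtures of the trained LoRAs, rather than a single fixed combination.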

## Citation

If you find our repo helpful, please cite our paper: LoRA-Flow: Dynamic LoRA Fusion for Large Language Models in Generative Tasks.