---
license: mit
---
|
We release all of the checkpoints used in [LoRA-Flow](https://aclanthology.org/2024.acl-long.695.pdf), which has been accepted to the ACL 2024 main conference.
|
# Summary
|
In this repo, we release the LoRA modules and fusion gates of the 7B models trained in our paper, in HuggingFace format.
|
# Method
|
The figure below illustrates our proposed method: we use layer-wise fusion gates to enable dynamic LoRA fusion, projecting each layer's input hidden states into fusion weights.
|
![1.jpg](https://cdn-uploads.huggingface.co/production/uploads/64d99f6cd7e30889c6c477b4/ifiu1FTHilrmUkD4FKkgV.jpeg)
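A minimal PyTorch sketch of the idea described above: a per-layer gate projects the layer's input hidden states into normalized fusion weights, which then combine the outputs of several LoRA modules. Class names, tensor shapes, and the softmax normalization are illustrative assumptions, not the released implementation.

```python
import torch
import torch.nn as nn


class LoRAFusionGate(nn.Module):
    """Sketch of a layer-wise fusion gate: maps input hidden states
    to one fusion weight per LoRA module (assumed design, not the
    released code)."""

    def __init__(self, hidden_size: int, num_loras: int):
        super().__init__()
        # Single linear projection: hidden state -> one logit per LoRA.
        self.proj = nn.Linear(hidden_size, num_loras)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden_size)
        # Returns fusion weights of shape (batch, seq_len, num_loras),
        # normalized per token (softmax is our assumption here).
        return torch.softmax(self.proj(hidden_states), dim=-1)


def fuse_lora_outputs(weights: torch.Tensor,
                      lora_outputs: torch.Tensor) -> torch.Tensor:
    # weights:      (batch, seq_len, num_loras)
    # lora_outputs: (num_loras, batch, seq_len, hidden_size)
    # Per-token weighted sum of each LoRA module's output.
    return torch.einsum("bsk,kbsh->bsh", weights, lora_outputs)
```

Because the gate is conditioned on the hidden states at every layer, the fusion weights can vary across layers and across tokens, which is what makes the fusion dynamic.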
|
# Citation

If you find our repo helpful, please cite the following paper:

[LoRA-Flow: Dynamic LoRA Fusion for Large Language Models in Generative Tasks](https://aclanthology.org/2024.acl-long.695)