Update requirements.txt
requirements.txt · CHANGED · +5 -2
@@ -3,7 +3,8 @@ pip==24.1.2
 torch==2.2.0
 triton
 
-
+--find-links https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9/
+flash-attn==2.5.9.post1+cu122torch2.2cxx11abiTRUE
 
 transformers
 accelerate
@@ -12,4 +13,6 @@ einops
 xformers
 
 numpy
-packaging
+packaging
+
+git+https://github.com/unslothai/unsloth.git#egg=unsloth[cu121-ampere-torch220]
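The `flash-attn` pin in the diff above uses a PEP 440 local version label (the part after `+`) to encode the build variant: CUDA 12.2, torch 2.2, and the C++11 ABI flag. A minimal sketch, using the `packaging` library that this requirements file already lists, of how such a pin parses (the version string itself is taken from the diff; nothing else here is from the source):

```python
from packaging.version import Version

# The exact pin added in this commit.
v = Version("2.5.9.post1+cu122torch2.2cxx11abiTRUE")

print(v.release)  # (2, 5, 9) -- the base release
print(v.post)     # 1 -- the .post1 suffix
print(v.local)    # local label; PEP 440 normalizes it to lowercase
```

Because the local label pins one specific prebuilt wheel, the `--find-links` URL and the `flash-attn` line must agree with the installed CUDA/torch/ABI combination, or pip will fall back to building from source.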