Commit d96edf4
1 Parent(s): 29cace5
Update requirements.txt
requirements.txt CHANGED (+2 -0)
@@ -20,6 +20,8 @@ safetensors==0.4.3
 
 moviepy
 
+git+https://github.com/jbilcke-hf/varnish.git@main
+
 # Install flash attention v2 for acceleration (requires CUDA 11.8 or above)
 # python -m pip install git+https://github.com/Dao-AILab/[email protected]
 #flash-attn==2.5.9.post1
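The added line declares the varnish package as a pip VCS requirement pinned to its main branch, in the same style as the commented-out flash-attn install hint already in the file. Assuming a standard pip setup, the new dependency is pulled in when the requirements file is installed, for example:

# python -m pip install -r requirements.txt
# or install just the added dependency on its own (equivalent to the new line):
# python -m pip install "git+https://github.com/jbilcke-hf/varnish.git@main"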