orionweller committed on
Commit fe8e9b2 · verified · 1 parent: 6906df2

Update requirements.txt

Files changed (1): requirements.txt +2 -2
requirements.txt CHANGED
@@ -1,6 +1,6 @@
 spaces
 transformers==4.49.0
 numpy==1.24.3
-flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 autoawq==0.2.6
-torch==2.3.1
+torch==2.2.0
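The change keeps the prebuilt flash-attn wheel consistent with the pinned torch version: the `torch2.2` tag embedded in the wheel filename must match the `torch==2.2.0` pin, since flash-attn binary wheels are compiled against a specific torch minor version. A minimal sketch of that consistency check (the helper names here are hypothetical, not part of any tool in the diff):

```python
import re


def wheel_torch_tag(url: str) -> str:
    # Extract the torch minor-version tag (e.g. "2.2") from a
    # flash-attn wheel filename such as "...+cu12torch2.2cxx11abiFALSE...".
    match = re.search(r"torch(\d+\.\d+)", url)
    if match is None:
        raise ValueError("no torch tag found in wheel URL")
    return match.group(1)


def pin_matches_wheel(pin: str, url: str) -> bool:
    # A pin like "torch==2.2.0" matches a wheel tagged "torch2.2"
    # when the pinned version falls in that minor series.
    version = pin.split("==")[1]
    tag = wheel_torch_tag(url)
    return version == tag or version.startswith(tag + ".")


wheel_url = (
    "https://github.com/Dao-AILab/flash-attention/releases/download/"
    "v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE"
    "-cp310-cp310-linux_x86_64.whl"
)
```

With the old pin `torch==2.3.1` this check fails against the `torch2.2` wheel, which is the mismatch the commit resolves.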