oweller2 committed
Commit 2b8a625 · Parent(s): 656cbe2

update reqs
Files changed (1)
  requirements.txt +3 -3
requirements.txt CHANGED
@@ -1,6 +1,6 @@
 spaces
 transformers==4.49.0
 numpy==1.24.3
-flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
-autoawq==0.2.1
-torch==2.2.0
+flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+autoawq==0.2.7
+torch==2.4.0
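The point of this commit is that flash-attention release wheels encode the torch version they were built against in their local version tag (e.g. `+cu12torch2.4cxx11abiFALSE`), so the wheel tag must be kept in sync with the `torch==` pin. A minimal sketch of a sanity check one could run before updating such a requirements file — `wheel_torch_version` is a hypothetical helper, not part of any library:

```python
import re

def wheel_torch_version(wheel_url: str) -> str:
    """Extract the torch version tag a flash-attn wheel was built against.

    Release wheels embed it in the local version segment,
    e.g. "...+cu12torch2.4cxx11abiFALSE-...".
    """
    match = re.search(r"torch(\d+\.\d+)", wheel_url)
    if match is None:
        raise ValueError("no torch tag found in wheel filename")
    return match.group(1)

# The wheel URL pinned in this commit.
url = ("https://github.com/Dao-AILab/flash-attention/releases/download/"
       "v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE"
       "-cp310-cp310-linux_x86_64.whl")

wheel_tag = wheel_torch_version(url)
print(wheel_tag)  # → 2.4

# Check the tag against the torch pin from requirements.txt.
torch_pin = "torch==2.4.0"
assert torch_pin.startswith(f"torch=={wheel_tag}"), "wheel/torch mismatch"
```

Mismatched tags (the pre-commit state: a `torch2.2` wheel next to `torch==2.2.0` was consistent, but bumping only one of the two breaks the import) typically surface as undefined-symbol errors when `flash_attn` is imported.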