bergr7f committed
Commit 2875a07 · 1 Parent(s): 37aa22e

Try fix for flash attn

Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -1,2 +1,2 @@
 flow-judge[hf]==0.1.0
-flash_attn>=2.6.3
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
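The change pins pip to a prebuilt flash-attn wheel instead of the `flash_attn>=2.6.3` version spec, so the install no longer needs to compile the CUDA kernels from source. The wheel filename follows the PEP 427 naming convention, and its tags encode what the build targets; a small sketch decoding them (the parsing code here is illustrative, not part of the commit):

```python
# Decode the pinned wheel's filename tags (PEP 427: name-version-python-abi-platform.whl).
wheel = ("flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE"
         "-cp310-cp310-linux_x86_64.whl")

name, version, python_tag, abi_tag, platform_tag = wheel[:-len(".whl")].split("-")

# The environment installing this requirement must match these tags,
# i.e. CPython 3.10 on x86_64 Linux, with the CUDA/torch build baked
# into the local version segment of `version`.
print(python_tag, abi_tag, platform_tag)
```

Because the wheel is tagged `cp310`/`linux_x86_64`, pip will refuse it on any other Python minor version or platform, which is the trade-off of pinning a direct wheel URL rather than a version range.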