chong.zhang committed
Commit a0e6dfa
1 Parent(s): 461c621
Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -27,4 +27,4 @@ WeTextProcessing==1.0.3
 transformers
 accelerate
 huggingface-hub==0.25.2
-flash_attn-2.6.3+cu118torch2.0cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.0cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
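The change relies on pip's support for direct URL requirements: the previous entry was a bare wheel filename, which only resolves if that file happens to be present locally, while the new entry points at the prebuilt flash-attention 2.6.3 release wheel (tagged for CUDA 11.8, torch 2.0, and CPython 3.10 on linux_x86_64, per the filename). Assuming an environment matching those tags, installing from the updated file is the usual:

    pip install -r requirements.txt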