# Transformers installed from source (GitHub main branch)
git+https://github.com/huggingface/transformers
torch
pytube
spaces
# Optional prebuilt flash-attention wheels (CUDA 11.8, cp310 Linux builds); kept commented out
#https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
#https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl