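# Pinned dependencies; the `spaces` package suggests this is intended to run
# as a Hugging Face Space. The flash-attn entry points at a prebuilt wheel
# (torch 2.2 + CUDA 12 + CPython 3.10, linux x86_64), so it must stay in sync
# with torch==2.2.0 below and with the runtime's Python/CUDA versions.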
spaces
transformers==4.49.0
numpy==1.24.3
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
autoawq==0.2.1
torch==2.2.0