# QwenBaseModel / requirements.txt
torch
torchvision
transformers
Pillow
gradio
# Install flash-attn by building it from source via its GitHub repository
# (the HazyResearch URL now redirects to the Dao-AILab/flash-attention repo)
git+https://github.com/HazyResearch/flash-attention.git
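# Note (assumption about your build environment): flash-attn compiles CUDA
# extensions at install time and its build expects torch to already be
# importable, so this line can fail when pip resolves everything in one pass.
# A common workaround is to install the other requirements first, then run:
#   pip install flash-attn --no-build-isolation
# as recommended in the flash-attention README.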