---
license: mit
---

# SageAttention 2++ Pre-compiled Wheel

🚀 Ultra-fast attention mechanism with 2-3x speedup over FlashAttention2

Pre-compiled Python wheel for high-performance GPU inference, built for Python 3.11 (cp311) and optimized for RTX 4090 and CUDA 12.8+.

## 🚀 Quick Installation

### Method 1: Direct Pip Install (Recommended)

```bash
wget https://huggingface.co/ModelsLab/Sage_2_plus_plus_build/resolve/main/sageattention-2.2.0-cp311-cp311-linux_x86_64.whl
pip install sageattention-2.2.0-cp311-cp311-linux_x86_64.whl
```
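After installing, a quick sanity check confirms the wheel is importable in your environment. This is a minimal sketch that assumes only that the wheel installs a package named `sageattention`; it does not exercise any GPU kernels.

```python
# Post-install sanity check: confirm the sageattention package is importable.
# Assumes only the package name from the wheel filename above.
try:
    import sageattention  # installed by the wheel
    msg = "sageattention available"
except ImportError:
    msg = "sageattention not installed"
print(msg)
```

If the import fails, double-check that you are running Python 3.11 on Linux x86_64, since the wheel's `cp311-cp311-linux_x86_64` tags restrict it to that interpreter and platform.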