This repository provides an int8-quantized, ahead-of-time (AoT) compiled binary of Flux.1-Dev.
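For orientation, here is a minimal sketch of the two steps the name refers to: int8 weight-only quantization with torchao, followed by AoT compilation through `torch.export` and AOT Inductor. The tiny stand-in module, shapes, and the `model.pt2` file name below are illustrative assumptions, not the exact recipe; the gist linked below documents the actual procedure.

```python
import torch
from torchao.quantization import quantize_, int8_weight_only
from torchao.utils import unwrap_tensor_subclass


class TinyBlock(torch.nn.Module):
    """Illustrative stand-in for the much larger Flux transformer."""

    def __init__(self):
        super().__init__()
        self.proj = torch.nn.Linear(64, 64)

    def forward(self, x):
        return torch.nn.functional.gelu(self.proj(x))


device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyBlock().eval().to(device)

# 1) Swap the linear weights for int8 weight-only quantized versions in place.
quantize_(model, int8_weight_only())

# Older torch/torchao stacks need the quantized tensor subclasses unwrapped
# before torch.export can trace the module.
unwrap_tensor_subclass(model)

# 2) Export the quantized module and AoT-compile it into a .pt2 package
# (PyTorch 2.6+ API; the package path is an arbitrary choice here).
example_input = (torch.randn(1, 64, device=device),)
exported = torch.export.export(model, example_input)
torch._inductor.aoti_compile_and_package(exported, package_path="model.pt2")
```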

Follow this gist for details on how the binary was obtained and how to run inference with it.
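And a hedged sketch of the consumption side: download the compiled artifact from this repository and load it with AOT Inductor's package loader (PyTorch 2.6+). The file name and the way the compiled callable is wired into the pipeline are assumptions; the gist is the authoritative reference.

```python
import torch
from diffusers import FluxPipeline
from huggingface_hub import hf_hub_download

# Download the compiled artifact from this repository
# (the file name "model.pt2" is an assumption).
package_path = hf_hub_download(
    repo_id="sayakpaul/flux.1-dev-int8-aot-compiled", filename="model.pt2"
)
compiled_transformer = torch._inductor.aoti_load_package(package_path)

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# Route the denoiser through the compiled module. Depending on how the
# transformer was exported, a thin wrapper mapping the pipeline's keyword
# arguments onto the compiled callable's inputs may be required.
pipe.transformer.forward = compiled_transformer

image = pipe(
    "a photo of a dog wearing sunglasses",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("flux_int8_aot.png")
```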
