The original model is here.
Notice
This is an experimental conversion done in Spaces with a homebrew script. The serverless Inference API does not currently support torch.float8_e4m3fn, so this model will not run there. There are still many bugs in the FLUX.1 conversion part of Diffusers, and I have not been able to confirm whether the conversion works properly. Please consider this a test run only.
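For readers curious what a conversion like this might involve, below is a minimal sketch of casting the FLUX.1 transformer weights to float8_e4m3fn for storage. This is not the actual homebrew script; the repository id, output filename, and the choice of converting only the transformer subfolder are assumptions for illustration.

```python
# Hypothetical sketch: cast FLUX.1 transformer weights to float8_e4m3fn for storage.
# Model id, output path, and scope (transformer only) are assumptions, not the real script.
import torch
from diffusers import FluxTransformer2DModel
from safetensors.torch import save_file

# Load the original transformer in a higher-precision dtype first.
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",   # assumed source repository
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
)

# Cast every floating-point tensor to float8_e4m3fn; leave other tensors untouched.
state_dict = {
    name: t.to(torch.float8_e4m3fn) if t.is_floating_point() else t
    for name, t in transformer.state_dict().items()
}

# Save the reduced-precision weights as a safetensors file.
save_file(state_dict, "flux1-dev-transformer-fp8_e4m3fn.safetensors")
```

Note that weights stored this way generally need to be upcast (e.g. to bfloat16) at load time before inference, since most kernels do not operate on float8_e4m3fn directly.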