The original model is here. This model was created by calculater.

Notice

This is an experimental conversion done in Spaces using a homebrew script. The serverless Inference API does not currently support torch.float8_e4m3fn, so it does not work there. There are still many bugs in the FLUX.1 conversion part of Diffusers, and I have not been able to confirm whether the conversion works properly. Please treat this as a test run only.
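For local testing, the sketch below shows one way to try loading this repository with Diffusers. It assumes the repo follows the standard FLUX.1 pipeline layout and that the FP8 weights can be upcast to bfloat16 at load time; whether the conversion actually loads correctly is exactly what the notice above says is unverified. The prompt and output filename are placeholders.

```python
# Minimal local-inference sketch (unverified, see notice above).
import torch
from diffusers import FluxPipeline

repo_id = "John6666/copycat-flux-test-fp8-v11-fp8-flux"

# Load in bfloat16: most runtimes cannot compute in torch.float8_e4m3fn,
# so the FP8 weights are cast to bfloat16 when the pipeline is loaded.
pipe = FluxPipeline.from_pretrained(repo_id, torch_dtype=torch.bfloat16)
pipe.enable_model_cpu_offload()  # helps on GPUs with limited VRAM

image = pipe(
    "a cat wearing a space helmet, studio lighting",  # placeholder prompt
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("copycat_flux_test.png")
```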
