This is a smaller checkpoint of flux1-dev that works better for ComfyUI users with limited VRAM (under 24 GB).
The two text encoders used by Flux are already included in this single safetensors file.
Use it with the Load Checkpoint node in ComfyUI.
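Because everything ships in one safetensors file, you can verify that the text encoders and VAE are actually bundled by reading the file's JSON header with nothing but the Python standard library. The sketch below is illustrative: the helper name `checkpoint_prefixes`, the dummy file, and the exact key names are assumptions, not the real flux1-dev key layout.

```python
import json
import struct

# A safetensors file starts with an 8-byte little-endian header length,
# followed by a JSON header mapping tensor names to dtype/shape/offsets.
# This hypothetical helper lists the top-level key prefixes, which is
# enough to see whether text encoders and a VAE are bundled alongside
# the diffusion model.
def checkpoint_prefixes(path):
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    return sorted({k.split(".")[0] for k in header if k != "__metadata__"})

# Build a tiny stand-in checkpoint whose key prefixes mimic an
# all-in-one Flux checkpoint (key names here are illustrative only).
keys = [
    "model.diffusion_model.img_in.weight",
    "text_encoders.clip_l.transformer.wte.weight",
    "text_encoders.t5xxl.encoder.block.0.weight",
    "vae.decoder.conv_in.weight",
]
header = {
    k: {"dtype": "F32", "shape": [1], "data_offsets": [i * 4, i * 4 + 4]}
    for i, k in enumerate(keys)
}
blob = json.dumps(header).encode()
with open("dummy.safetensors", "wb") as f:
    f.write(struct.pack("<Q", len(blob)))
    f.write(blob)
    f.write(b"\x00" * (len(keys) * 4))

print(checkpoint_prefixes("dummy.safetensors"))
```

Running the same helper on a real all-in-one checkpoint should show the diffusion model, text encoder, and VAE weights living in one file, which is why no separate CLIP/T5 loader nodes are needed in ComfyUI.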