Is there any way to run this model entirely in VRAM?
First, I want to say:
"THIS IS THE BEST OPEN-SOURCE MODEL I HAVE EVER SEEN IN IMAGE GENERATION".
With just four steps, it creates highly detailed, near-perfect images.
Thank you so much, Black Forest Labs, for releasing this model.
I have an RTX 4090 (24 GB) graphics card and 64 GB of physical RAM. I need the following information about VRAM and RAM usage:
Is it normal for this model to consume 34.3 GB of physical RAM out of 64 GB when no image is being generated?
It is taking a long time to generate even a 4-step image.
Are there any settings to lower physical RAM usage and run it entirely on the graphics card for faster results? I am running ComfyUI through run_nvidia_gpu.bat.
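One thing worth trying: ComfyUI's launcher accepts memory-management flags such as `--highvram` and `--gpu-only`, which tell it to keep model weights on the GPU instead of offloading them to system RAM. A sketch of how the launch line in `run_nvidia_gpu.bat` could be edited (the flag names are ComfyUI CLI options, but whether they fit in 24 GB with this model is an assumption — verify against your ComfyUI version):

```bat
REM run_nvidia_gpu.bat -- add a VRAM flag to the existing launch line
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --highvram

REM Or, to pin everything (weights and intermediates) on the GPU:
REM .\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --gpu-only
```

Note that if the model plus text encoders do not fit in VRAM, `--gpu-only` can cause out-of-memory errors, so `--highvram` is the safer first experiment.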
These screenshots were taken while no image was generating:
These screenshots were taken while an image was generating:
It takes a lot of time even for a 4-step image. These are the timing results for 4 steps and 20 steps:
I mean if the model is 12 billion parameters, it's gonna take a lot of space in RAM/VRAM even just to load the model...
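To put a rough number on that, here is a back-of-envelope calculation of the weight memory for a 12-billion-parameter model at common precisions (weights only; the text encoders, VAE, and activations add more on top — the parameter count is the approximate figure quoted for this model):

```python
# Back-of-envelope memory footprint for a 12-billion-parameter model.
# Only the transformer weights are counted; text encoders, VAE, and
# activations during sampling consume additional memory on top of this.

def weights_gb(n_params: float, bytes_per_param: float) -> float:
    """Return the weight memory in GiB for a given precision."""
    return n_params * bytes_per_param / 1024**3

N = 12e9  # approximate parameter count

for name, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("fp8", 1)]:
    print(f"{name}: ~{weights_gb(N, nbytes):.1f} GiB")
# fp32:      ~44.7 GiB
# fp16/bf16: ~22.4 GiB
# fp8:       ~11.2 GiB
```

At fp16 the weights alone are ~22 GiB, which nearly fills a 24 GB card before anything else is loaded — that is why ComfyUI spills the rest into system RAM.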