Error when using .safetensors file
Hello
When I try to load SSD-1B from the .safetensors file in Python using the following code, I get this error.
Traceback:
Traceback (most recent call last):
File "/home/cybertimon/Repositories/AiModels/DiffusionModels/test.py", line 4, in <module>
pipeline = StableDiffusionXLPipeline.from_single_file("XL_SSD-1B.safetensors")
File "/home/cybertimon/.local/lib/python3.10/site-packages/diffusers/loaders.py", line 1924, in from_single_file
pipe = download_from_original_stable_diffusion_ckpt(
File "/home/cybertimon/.local/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 1360, in download_from_original_stable_diffusion_ckpt
converted_unet_checkpoint = convert_ldm_unet_checkpoint(
File "/home/cybertimon/.local/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 505, in convert_ldm_unet_checkpoint
attentions = middle_blocks[1]
KeyError: 1
Code:
from diffusers import StableDiffusionXLPipeline
import torch

# Load the checkpoint directly from the local .safetensors file
pipeline = StableDiffusionXLPipeline.from_single_file("XL_SSD-1B.safetensors")
pipeline.enable_xformers_memory_efficient_attention()

prompt = "An astronaut riding a green horse"  # Your prompt here
neg_prompt = "ugly, blurry, poor quality"  # Negative prompt here
image = pipeline(prompt=prompt, negative_prompt=neg_prompt).images[0]
image.save("output.png")
Did you try the command below?
pip install git+https://github.com/huggingface/diffusers
Yes, I just executed that and it upgraded diffusers. I even reinstalled it with --force-reinstall. Same error. Does from_single_file work for you?
It seems the conversion script is trying to load attentions from the middle block into your model, but SSD-1B does not have attentions in its middle block.
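Roughly speaking (this is a hypothetical sketch with made-up keys, not the actual diffusers conversion code), the KeyError can arise when the converter groups middle-block keys by layer index and then assumes index 1 holds the attention weights:

# Hypothetical state dict with an SSD-1B-style middle block: resnet keys only,
# no "middle_block.1" attention keys.
unet_state_dict = {
    "middle_block.0.in_layers.0.weight": None,
}

# Group middle-block keys by their layer index.
num_middle_blocks = len({key.split(".")[1] for key in unet_state_dict if "middle_block" in key})
middle_blocks = {
    layer_id: [key for key in unet_state_dict if f"middle_block.{layer_id}" in key]
    for layer_id in range(num_middle_blocks)
}

resnet_0 = middle_blocks[0]    # works
attentions = middle_blocks[1]  # KeyError: 1, same failure mode as in the traceback above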
I don't really know. I just want to use the model with the sample code from the README; I only modified one line to accept the .safetensors file, as shown above. What should I do now? Can't I use the model as a .safetensors file?
Just use the from_pretrained() function; from_single_file is not supported yet.
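For reference, a minimal sketch of loading the model with from_pretrained() instead; the repo id "segmind/SSD-1B", the fp16 variant, and the CUDA device are assumptions on my part, not taken from this thread:

from diffusers import StableDiffusionXLPipeline
import torch

# Download/load SSD-1B from the Hugging Face Hub instead of a local .safetensors file.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "segmind/SSD-1B",          # assumed repo id
    torch_dtype=torch.float16,
    use_safetensors=True,
    variant="fp16",            # assumed fp16 weights are published
)
pipe.to("cuda")

prompt = "An astronaut riding a green horse"
neg_prompt = "ugly, blurry, poor quality"
image = pipe(prompt=prompt, negative_prompt=neg_prompt).images[0]
image.save("output.png")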
Closing this discussion; feel free to reopen if the error still persists.