MJ-Bench/DDPO-alignment-claude3-opus
by MJ-Bench-Team

Tags: Text-to-Image · stable-diffusion · stable-diffusion-diffusers · DDPO
arXiv: 2407.04842
yichaodu committed on Jul 8, 2024 (verified)
Commit c92a363 · 1 parent: 6ff5c46

Upload optimizer.bin with huggingface_hub