MJ-Bench/DDPO-alignment-gpt-4o
Tags: Text-to-Image, stable-diffusion, stable-diffusion-diffusers, DDPO
Paper: arXiv:2407.04842
Files and versions
DDPO-alignment-gpt-4o: 2 contributors, 2 commits
Latest commit: 78db471 (verified, 8 months ago) by yichaodu: "Upload README.md with huggingface_hub"
.gitattributes    1.52 kB    initial commit (8 months ago)
README.md         1.08 kB    Upload README.md with huggingface_hub (8 months ago)
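
The commit above was made with the huggingface_hub library, so the same library can fetch the repository's files programmatically. Below is a minimal sketch of downloading this repo's README.md; it assumes the repository stays public and that you only need the listed files (the model weights themselves are not in this file listing).

# Minimal sketch: fetch this repository's README.md with huggingface_hub,
# the library referenced in the "Upload README.md with huggingface_hub" commit.
# Assumes the repo remains public; the call returns a local cached file path.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="MJ-Bench/DDPO-alignment-gpt-4o",
    filename="README.md",
)

with open(local_path, encoding="utf-8") as f:
    print(f.read())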