SalimBou5 / dpo_model
PEFT · Safetensors · arxiv:1910.09700
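Since the repo is tagged PEFT and Safetensors, the weights here are most likely an adapter rather than a full model. Below is a minimal sketch of loading it with the `peft` library; the base model identifier is a placeholder assumption, as the repo does not state which base model the adapter targets.

```python
# Hedged sketch: load the PEFT adapter from this repo on top of a base model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Assumption: the actual base model is not listed in this repo; replace the placeholder.
base_id = "your-base-model-id"

base_model = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach the adapter weights from the Hub repo shown above.
model = PeftModel.from_pretrained(base_model, "SalimBou5/dpo_model")
model.eval()
```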
dpo_model / .gitattributes (branch: main)

Commit History
Upload folder using huggingface_hub
4a73bef · verified · SalimBou5 · committed on Jun 3
initial commit
2f0f508 · verified · SalimBou5 · committed on Jun 3
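The "Upload folder using huggingface_hub" commit suggests the files were pushed programmatically. A minimal sketch of such an upload is shown below, assuming a local folder and a logged-in Hub token; the local path and commit message are illustrative.

```python
# Hedged sketch: push a local folder to the Hub with huggingface_hub.
from huggingface_hub import HfApi

api = HfApi()  # uses the token saved by `huggingface-cli login` by default
api.upload_folder(
    folder_path="./dpo_model",        # assumed local folder with the adapter files
    repo_id="SalimBou5/dpo_model",    # target model repo on the Hub
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```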