davidkim205/komt-mistral-7b-v1-dpo
Tags: Text Generation · PEFT · Safetensors · PyTorch · English · Korean · facebook · meta · llama · llama-2 · llama-2-chat
arXiv: 2308.06502 · 2308.06259
Community discussions
Where can I download the dpo model? (#2, opened about 1 year ago by ihave1)