Hugging Face
We-Want-GPU's Collections
SFT LLM
LLM Dataset
DPO LLM
DPO LLM · updated Dec 31, 2023
We-Want-GPU/Yi-Ko-6B-DPO-v2 · Text Generation · Updated Dec 27, 2023 · 2.51k