Ji-Xiang's Collections
DPO datasets
argilla/distilabel-capybara-dpo-7k-binarized (7.56k rows, 2.01k downloads, 180 likes)
llamafactory/DPO-En-Zh-20k (20k rows, 217 downloads, 91 likes)
argilla/distilabel-intel-orca-dpo-pairs (12.9k rows, 553 downloads, 170 likes)
argilla/ultrafeedback-binarized-preferences-cleaned (60.9k rows, 5k downloads, 130 likes)
argilla/distilabel-math-preference-dpo (2.42k rows, 132 downloads, 80 likes)
M4-ai/prm_dpo_pairs_cleaned (7.99k rows, 74 downloads, 11 likes)
jondurbin/truthy-dpo-v0.1 (1.02k rows, 252 downloads, 132 likes)
YeungNLP/ultrafeedback_binarized (63.1k rows, 21 downloads, 1 like)
shibing624/DPO-En-Zh-20k-Preference (20k rows, 53 downloads, 16 likes)
mlabonne/orpo-dpo-mix-40k (44.2k rows, 941 downloads, 272 likes)
jondurbin/gutenberg-dpo-v0.1 (918 rows, 1.07k downloads, 128 likes)
CyberNative/Code_Vulnerability_Security_DPO (4.66k rows, 199 downloads, 74 likes)
mlabonne/orpo-dpo-mix-40k-flat (44.2k rows, 91 downloads, 11 likes)
selimc/orpo-dpo-mix-TR-20k (19.9k rows, 75 downloads, 4 likes)
efederici/alpaca-vs-alpaca-orpo-dpo (49.2k rows, 172 downloads, 7 likes)
allenai/llama-3.1-tulu-3-8b-preference-mixture (714 downloads, 10 likes)
allenai/llama-3.1-tulu-3-70b-preference-mixture (334k rows, 355 downloads, 16 likes)
HuggingFaceH4/ultrafeedback_binarized (187k rows, 8.49k downloads, 266 likes)
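The datasets above all store preference pairs for DPO-style training. The exact column names vary from card to card (several use chat-style message lists), but the most common layout is a prompt with a chosen and a rejected completion. A minimal sketch of that assumed schema, using made-up example text:

```python
# Sketch of the common DPO preference-pair layout (prompt/chosen/rejected).
# The field names are an assumption for illustration; check each dataset
# card above before training, since some use other schemas.

def to_dpo_record(prompt: str, chosen: str, rejected: str) -> dict:
    """Package one preference pair in the common DPO layout."""
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}

pair = to_dpo_record(
    prompt="What is the capital of France?",
    chosen="The capital of France is Paris.",
    rejected="France does not have a capital.",
)
print(sorted(pair))  # prints the field names in alphabetical order
```

In practice you would pull one of the listed datasets with the `datasets` library and map its columns into this shape (or whatever shape your trainer expects).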