yann23 committed
Commit 544b8ef · verified · 1 Parent(s): a46b753

Upload iteration_7/dataset_dpo.parquet with huggingface_hub
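The commit message indicates the file was pushed with the huggingface_hub library. A minimal sketch of such an upload via HfApi.upload_file is shown below; the repo_id is a placeholder, since the target repository name is not visible in this commit, and the parquet file is assumed to exist locally at the same relative path.

# Minimal sketch of uploading the parquet file with huggingface_hub.
# The repo_id "yann23/example-dpo-dataset" is a placeholder; the real
# repository name is not shown in this commit.
from huggingface_hub import HfApi

api = HfApi()  # picks up the token from `huggingface-cli login` by default
api.upload_file(
    path_or_fileobj="iteration_7/dataset_dpo.parquet",  # local file to push
    path_in_repo="iteration_7/dataset_dpo.parquet",     # destination path in the repo
    repo_id="yann23/example-dpo-dataset",               # placeholder repo id
    repo_type="dataset",
    commit_message="Upload iteration_7/dataset_dpo.parquet with huggingface_hub",
)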

Files changed (1):
  1. iteration_7/dataset_dpo.parquet +3 -0
iteration_7/dataset_dpo.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aabe532e4d41ce094abb95175649e86adb077a43e216b2743c118c4f036fb6d8
+size 95347
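The three added lines are a Git LFS pointer, not the parquet data itself: the 95,347-byte blob is stored by LFS and referenced by its SHA-256 object id. Below is a minimal sketch of parsing such a pointer into its fields, assuming the newline-separated "key value" layout defined by the git-lfs spec linked on the pointer's first line.

# Minimal sketch: parse a Git LFS pointer file into a dict of its fields.
# Assumes the "key value" line format of https://git-lfs.github.com/spec/v1.
def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:aabe532e4d41ce094abb95175649e86adb077a43e216b2743c118c4f036fb6d8
size 95347"""

info = parse_lfs_pointer(pointer)
print(info["oid"])   # sha256:aabe532e...
print(info["size"])  # 95347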