Upload folder using huggingface_hub
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo-time.pdf +0 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo-time.png +3 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo-time.svg +0 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.csv +2 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.md +3 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.pdf +0 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.png +3 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.svg +0 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo_runtimes.csv +2 -0
- images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo_runtimes.md +3 -0
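The commit message above is the default one produced by huggingface_hub's upload_folder helper. Below is a minimal sketch of how a benchmark folder like this is typically pushed to the Hub; the target repo_id and repo_type are illustrative assumptions, not taken from this commit.

```python
# Sketch only: push a local benchmark output folder to a Hub repo with
# huggingface_hub. The repo_id and repo_type below are hypothetical.
from huggingface_hub import HfApi

api = HfApi()  # authenticates via `huggingface-cli login` or the HF_TOKEN env var
api.upload_folder(
    folder_path="images/benchmark/v0.7.4-90-g0022af6_pr-1176",   # local artifacts to upload
    path_in_repo="images/benchmark/v0.7.4-90-g0022af6_pr-1176",  # keep the same layout in the repo
    repo_id="user/benchmark-artifacts",                          # hypothetical target repo
    repo_type="dataset",                                         # assumption: artifacts live in a dataset repo
    commit_message="Upload folder using huggingface_hub",        # matches the default message seen here
)
```

The PNG and PDF files in this folder are stored via Git LFS, which the Hub's default .gitattributes handles automatically for binary formats.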
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo-time.pdf
ADDED
Binary file (385 kB)
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo-time.png
ADDED
Binary file stored with Git LFS
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo-time.svg
ADDED
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.csv
ADDED
@@ -0,0 +1,2 @@
+,"huggingface/trl/gpt2 ({'tag': ['v0.7.4-90-g0022af6', 'pr-1176']})"
+dpo_anthropic_hh,nan ± nan
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.md
ADDED
@@ -0,0 +1,3 @@
+|                  | huggingface/trl/gpt2 ({'tag': ['v0.7.4-90-g0022af6', 'pr-1176']}) |
+|:-----------------|:--------------------------------------------------------------------|
+| dpo_anthropic_hh | nan ± nan                                                            |
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.pdf
ADDED
Binary file (389 kB)
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.png
ADDED
Binary file stored with Git LFS
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo.svg
ADDED
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo_runtimes.csv
ADDED
@@ -0,0 +1,2 @@
+,"huggingface/trl/gpt2 ({'tag': ['v0.7.4-90-g0022af6', 'pr-1176']})"
+dpo_anthropic_hh,1.0212458267476823
images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo_runtimes.md
ADDED
@@ -0,0 +1,3 @@
+|                  | huggingface/trl/gpt2 ({'tag': ['v0.7.4-90-g0022af6', 'pr-1176']}) |
+|:-----------------|--------------------------------------------------------------------:|
+| dpo_anthropic_hh |                                                              1.02125 |
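For reference, the CSV artifacts above can be read back directly once the repo is cloned or downloaded; a small sketch with pandas, assuming paths relative to the repo root:

```python
# Sketch: load the runtime table added in this commit. The path matches the
# layout shown above; the unnamed first column holds the benchmark name.
import pandas as pd

runtimes = pd.read_csv(
    "images/benchmark/v0.7.4-90-g0022af6_pr-1176/dpo_runtimes.csv",
    index_col=0,
)
print(runtimes.loc["dpo_anthropic_hh"])  # single tracked run, value 1.0212458267476823
```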