---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- Anthropic/hh-rlhf
---

[Pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) finetuned with DPO, using the original DPO code, on the helpful subset of the [Anthropic hh-rlhf dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf) for 1 epoch.

Intermediate checkpoints are also uploaded.

Fully reproducible finetuning code is available on [GitHub](https://github.com/lomahony/trlx-pythia/tree/main).

[wandb log](https://wandb.ai/lauraomahony999/pythia-dpo/runs/0mhjakjz)

See [Pythia-1b](https://huggingface.co/EleutherAI/pythia-1b) for model details [(paper)](https://arxiv.org/abs/2101.00027).

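A minimal usage sketch. The hh-rlhf dataset formats dialogues with `\n\nHuman:` / `\n\nAssistant:` turn markers, so prompting the finetuned model in that format is a reasonable assumption; the helper function below is illustrative, not part of the released code.

```python
# Sketch: building a prompt in the Anthropic hh-rlhf dialogue format.
# The turn markers below match the dataset; the helper itself is hypothetical.

def format_hh_prompt(question: str) -> str:
    """Wrap a single user turn in the hh-rlhf "Human/Assistant" format."""
    return f"\n\nHuman: {question}\n\nAssistant:"

prompt = format_hh_prompt("How do I bake bread?")
print(prompt)

# To generate with the finetuned model (downloads ~1B-parameter weights,
# so it is left commented here):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("lomahony/pythia-1b-helpful-dpo")
#   model = AutoModelForCausalLM.from_pretrained("lomahony/pythia-1b-helpful-dpo")
#   out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=64)
#   print(tok.decode(out[0]))
```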
Zero-shot evaluation results:

hf (pretrained=lomahony/pythia-1b-helpful-dpo), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: 16
| Tasks |Version|Filter|n-shot| Metric | Value | |Stderr|
|--------------|------:|------|-----:|---------------|------:|---|------|
|arc_challenge | 1|none | 0|acc | 0.2602|± |0.0128|
| | |none | 0|acc_norm | 0.2867|± |0.0132|
|arc_easy | 1|none | 0|acc | 0.5859|± |0.0101|
| | |none | 0|acc_norm | 0.5008|± |0.0103|
|boolq | 2|none | 0|acc | 0.6205|± |0.0085|
|hellaswag | 1|none | 0|acc | 0.3895|± |0.0049|
| | |none | 0|acc_norm | 0.4872|± |0.0050|
|lambada_openai| 1|none | 0|perplexity | 6.9417|± |0.2019|
| | |none | 0|acc | 0.5550|± |0.0069|
|openbookqa | 1|none | 0|acc | 0.2140|± |0.0184|
| | |none | 0|acc_norm | 0.3220|± |0.0209|
|piqa | 1|none | 0|acc | 0.7193|± |0.0105|
| | |none | 0|acc_norm | 0.7008|± |0.0107|
|sciq | 1|none | 0|acc | 0.8450|± |0.0115|
| | |none | 0|acc_norm | 0.7600|± |0.0135|
|wikitext | 2|none | 0|word_perplexity|17.2316|± |N/A |
| | |none | 0|byte_perplexity| 1.7029|± |N/A |
| | |none | 0|bits_per_byte | 0.7680|± |N/A |
|winogrande | 1|none | 0|acc | 0.5367|± |0.0140|

5-shot evaluation results:

hf (pretrained=lomahony/pythia-1b-helpful-dpo), gen_kwargs: (None), limit: None, num_fewshot: 5, batch_size: 16
| Tasks |Version|Filter|n-shot| Metric | Value | |Stderr|
|--------------|------:|------|-----:|---------------|------:|---|------|
|arc_challenge | 1|none | 5|acc | 0.2662|± |0.0129|
| | |none | 5|acc_norm | 0.3003|± |0.0134|
|arc_easy | 1|none | 5|acc | 0.6103|± |0.0100|
| | |none | 5|acc_norm | 0.5892|± |0.0101|
|boolq | 2|none | 5|acc | 0.6284|± |0.0085|
|hellaswag | 1|none | 5|acc | 0.3841|± |0.0049|
| | |none | 5|acc_norm | 0.4845|± |0.0050|
|lambada_openai| 1|none | 5|perplexity | 9.6301|± |0.2809|
| | |none | 5|acc | 0.4865|± |0.0070|
|openbookqa | 1|none | 5|acc | 0.2020|± |0.0180|
| | |none | 5|acc_norm | 0.3300|± |0.0210|
|piqa | 1|none | 5|acc | 0.7122|± |0.0106|
| | |none | 5|acc_norm | 0.7046|± |0.0106|
|sciq | 1|none | 5|acc | 0.9030|± |0.0094|
| | |none | 5|acc_norm | 0.8980|± |0.0096|
|wikitext | 2|none | 5|word_perplexity|17.2316|± |N/A |
| | |none | 5|byte_perplexity| 1.7029|± |N/A |
| | |none | 5|bits_per_byte | 0.7680|± |N/A |
|winogrande | 1|none | 5|acc | 0.5296|± |0.0140|
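
The `hf (pretrained=...)` header lines above match the output format of EleutherAI's lm-evaluation-harness, so the tables can presumably be reproduced with an invocation like the sketch below. The exact task list and flags are assumptions inferred from the tables, not taken from the released code.

```shell
# Hedged sketch: reproducing the zero-shot table with lm-evaluation-harness.
# Task names and flags are inferred from the results above.
lm_eval --model hf \
  --model_args pretrained=lomahony/pythia-1b-helpful-dpo \
  --tasks arc_challenge,arc_easy,boolq,hellaswag,lambada_openai,openbookqa,piqa,sciq,wikitext,winogrande \
  --num_fewshot 0 \
  --batch_size 16
```

For the 5-shot table, change `--num_fewshot 0` to `--num_fewshot 5`.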