---
library_name: transformers
datasets:
- mlabonne/orpo-dpo-mix-40k
metrics:
- accuracy
base_model:
- EleutherAI/gpt-neo-125m
---

# Model Card for GPT-Neo-125M Fine-Tuned on mlabonne/orpo-dpo-mix-40k

A fine-tuned version of [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m), trained on the [mlabonne/orpo-dpo-mix-40k](https://huggingface.co/datasets/mlabonne/orpo-dpo-mix-40k) dataset.
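The fine-tuned weights can be loaded with the standard `transformers` auto classes. A minimal usage sketch; the repository ID below is a placeholder for this model's actual Hub ID:

```python
# Minimal usage sketch with the transformers library.
# "your-username/gpt-neo-125m-orpo-dpo-mix-40k" is a placeholder repository ID;
# substitute the actual Hub ID of this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gpt-neo-125m-orpo-dpo-mix-40k"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```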


## Evaluation

Zero-shot evaluation was run with the EleutherAI lm-evaluation-harness, passing `batch_size=auto:4`; the harness determined a largest batch size of 64.
| Task      | Version | Filter | n-shot | Metric   |  Value | Stderr  |
|-----------|--------:|--------|-------:|----------|-------:|--------:|
| hellaswag |       1 | none   |      0 | acc      | 0.2868 | ±0.0045 |
| hellaswag |       1 | none   |      0 | acc_norm | 0.3050 | ±0.0046 |
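The results above could be reproduced with the harness's Python API along the following lines (a sketch; the repository ID is a placeholder, and the exact harness version used for this card is not stated):

```python
# Hedged reproduction sketch using lm-evaluation-harness (pip install lm-eval).
# "your-username/gpt-neo-125m-orpo-dpo-mix-40k" is a placeholder repository ID.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                      # evaluate a Hugging Face transformers model
    model_args="pretrained=your-username/gpt-neo-125m-orpo-dpo-mix-40k",
    tasks=["hellaswag"],             # task reported in the table above
    num_fewshot=0,                   # zero-shot, matching the n-shot column
    batch_size="auto:4",             # matches the batch_size=auto:4 setting noted above
)
print(results["results"]["hellaswag"])  # acc and acc_norm with their stderr values
```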