---
base_model:
- meta-llama/Llama-3.3-70B-Instruct
---
BF16 baseline:
```
arc_challenge
{'alias': 'arc_challenge', 'acc,none': 0.5332764505119454, 'acc_stderr,none': 0.014578995859605814, 'acc_norm,none': 0.5324232081911263, 'acc_norm_stderr,none': 0.014580637569995418}
arc_easy
{'alias': 'arc_easy', 'acc,none': 0.7558922558922558, 'acc_stderr,none': 0.008814322157999389, 'acc_norm,none': 0.6435185185185185, 'acc_norm_stderr,none': 0.009828046544504433}
hellaswag
{'alias': 'hellaswag', 'acc,none': 0.5852419836685919, 'acc_stderr,none': 0.004916733258140271, 'acc_norm,none': 0.6487751443935471, 'acc_norm_stderr,none': 0.00476377498183474}
piqa
{'alias': 'piqa', 'acc,none': 0.8003264417845484, 'acc_stderr,none': 0.009326942154519176, 'acc_norm,none': 0.7899891186071817, 'acc_norm_stderr,none': 0.00950335330581858}
```
This model:
```
arc_challenge
{'alias': 'arc_challenge', 'acc,none': 0.5332764505119454, 'acc_stderr,none': 0.014578995859605806, 'acc_norm,none': 0.5281569965870307, 'acc_norm_stderr,none': 0.014588204105102203}
arc_easy
{'alias': 'arc_easy', 'acc,none': 0.7571548821548821, 'acc_stderr,none': 0.008798836444222032, 'acc_norm,none': 0.6439393939393939, 'acc_norm_stderr,none': 0.00982545460841631}
hellaswag
{'alias': 'hellaswag', 'acc,none': 0.5844453296156145, 'acc_stderr,none': 0.004918102168717931, 'acc_norm,none': 0.650368452499502, 'acc_norm_stderr,none': 0.0047587901724366775}
piqa
{'alias': 'piqa', 'acc,none': 0.8041349292709467, 'acc_stderr,none': 0.009259518041395779, 'acc_norm,none': 0.794885745375408, 'acc_norm_stderr,none': 0.009420971671017919}
```
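
The scores above can be compared directly: for every task, the gap between this model and the BF16 baseline is smaller than the combined standard error, so the two are statistically indistinguishable on these benchmarks. A minimal sketch of that check, using the `acc,none` values copied from the blocks above (the dict names `bf16` and `this_model` are just local variables for this example):

```python
import math

# (accuracy, stderr) pairs taken from the eval outputs above.
bf16 = {
    "arc_challenge": (0.5332764505119454, 0.014578995859605814),
    "arc_easy":      (0.7558922558922558, 0.008814322157999389),
    "hellaswag":     (0.5852419836685919, 0.004916733258140271),
    "piqa":          (0.8003264417845484, 0.009326942154519176),
}
this_model = {
    "arc_challenge": (0.5332764505119454, 0.014578995859605806),
    "arc_easy":      (0.7571548821548821, 0.008798836444222032),
    "hellaswag":     (0.5844453296156145, 0.004918102168717931),
    "piqa":          (0.8041349292709467, 0.009259518041395779),
}

for task, (acc_a, se_a) in bf16.items():
    acc_b, se_b = this_model[task]
    diff = abs(acc_a - acc_b)
    # Standard error of the difference between two independent estimates.
    se_diff = math.sqrt(se_a**2 + se_b**2)
    print(f"{task}: diff={diff:.4f}, combined stderr={se_diff:.4f}, "
          f"within 1 sigma: {diff <= se_diff}")
```

On these numbers every task prints `within 1 sigma: True`, i.e. no measurable accuracy loss relative to BF16 at the reported precision.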