# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged); a sketch for loading them with `datasets` follows.
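
To pull those per-task details programmatically, a minimal sketch using the `datasets` library is shown below. It assumes the details repo exposes one config per evaluation task (config names vary by repo, so they are listed first rather than hard-coded):

```python
from datasets import get_dataset_config_names, load_dataset

details_repo = (
    "open-llm-leaderboard/"
    "details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged"
)

# List the available configs (assumed to be one per task/run) before loading.
configs = get_dataset_config_names(details_repo)
print(configs)

# Load the first config as an example; pick the task you care about instead.
details = load_dataset(details_repo, configs[0])
print(details)
```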
| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 42.74 |
| ARC (25-shot) | 54.35 |
| HellaSwag (10-shot) | 78.06 |
| MMLU (5-shot) | 45.35 |
| TruthfulQA (0-shot) | 37.11 |
| Winogrande (5-shot)   | 73.40                     |
| GSM8K (5-shot) | 4.62 |
| DROP (3-shot) | 6.28 |
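
The "Avg." row appears to be the unweighted mean of the seven benchmark scores; a quick check in Python (assuming simple averaging, which matches the reported value):

```python
# Per-benchmark scores copied from the table above.
scores = {
    "ARC (25-shot)": 54.35,
    "HellaSwag (10-shot)": 78.06,
    "MMLU (5-shot)": 45.35,
    "TruthfulQA (0-shot)": 37.11,
    "Winogrande (5-shot)": 73.40,
    "GSM8K (5-shot)": 4.62,
    "DROP (3-shot)": 6.28,
}

# Unweighted mean of the seven scores (assumption about the aggregation rule).
avg = sum(scores.values()) / len(scores)
print(f"Avg. = {avg:.2f}")  # -> 42.74
```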