Update README.md
README.md CHANGED
@@ -24,7 +24,8 @@ ExpertRamonda-7Bx2_MoE is a Mixure of Experts (MoE) made with the following mode
 ### Open LLM Leaderboard
 | Model | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
 |------------------------|--------:|-----:|----------:|-----:|-----------:|-----------:|------:|
-| mayacinka/ExpertRamonda-7Bx2_MoE | / | 86.87 | 87.51| / | 61.63 |
+| mayacinka/ExpertRamonda-7Bx2_MoE | / | 86.87 | 87.51| / | 61.63 | 81.85 | 72.71|
+

 ### MMLU
 | Groups |Version|Filter|n-shot|Metric|Value | |Stderr|
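The `Groups |Version|Filter|n-shot|Metric|Value | |Stderr` header in the MMLU section is the table layout emitted by EleutherAI's lm-evaluation-harness. Below is a minimal sketch of how scores in that format could be reproduced for this model; the harness version, few-shot count, dtype, and batch size are assumptions, since the diff does not state the settings used for the card.

```python
# Minimal sketch: producing an MMLU table in the same format as the README,
# using EleutherAI's lm-evaluation-harness (pip install lm-eval).
# NOTE: num_fewshot, dtype, and batch_size are assumptions, not the settings
# documented for this model card.
from lm_eval.evaluator import simple_evaluate
from lm_eval.utils import make_table

results = simple_evaluate(
    model="hf",
    model_args="pretrained=mayacinka/ExpertRamonda-7Bx2_MoE,dtype=bfloat16",
    tasks=["mmlu"],      # group task; sub-task scores are aggregated per group
    num_fewshot=5,       # assumed few-shot count
    batch_size=8,        # assumed; tune to available VRAM
)

# make_table renders the Groups/Version/Filter/n-shot/Metric/Value/Stderr
# layout seen in the MMLU section of the README.
print(make_table(results))
```

This is only for spot-checking an MMLU-style table locally; the leaderboard row in the diff (ARC, HellaSwag, TruthfulQA, Winogrande, GSM8K) comes from the Open LLM Leaderboard's own evaluation run.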