mayacinka committed
Commit ec52bba · verified · 1 Parent(s): 1587166

Update README.md

Files changed (1)
  1. README.md +16 -0
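
The MMLU breakdown added in the diff below is in the per-group format produced by EleutherAI's lm-evaluation-harness. As a hedged, illustrative sketch only (the commit itself does not say how the scores were generated), results in that shape could be obtained roughly as follows; the model id comes from the benchmark table, while the harness settings and batch size are assumptions:

```python
# Illustrative sketch only: producing an MMLU per-group breakdown with
# EleutherAI's lm-evaluation-harness (pip install lm-eval). The model id is
# taken from the benchmark table; every other setting here is an assumption.
from lm_eval import simple_evaluate

results = simple_evaluate(
    model="hf",                                                # Hugging Face transformers backend
    model_args="pretrained=mayacinka/ExpertRamonda-7Bx2_MoE",  # model under test
    tasks=["mmlu"],                                            # reports group aggregates: humanities, stem, ...
    num_fewshot=0,                                             # matches the table's n-shot = 0
    batch_size=8,                                              # assumed; tune to available VRAM
)

# Per-task and per-group accuracies are reported under results["results"],
# with metric keys such as "acc,none" in recent harness versions.
for name, metrics in results["results"].items():
    print(name, metrics.get("acc,none"))
```

The 0-shot setting mirrors the table's n-shot column, and group rows such as `- humanities` correspond to the harness's MMLU sub-group aggregates.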
README.md CHANGED
@@ -19,6 +19,22 @@ ExpertRamonda-7Bx2_MoE is a Mixture of Experts (MoE) made with the following mode
  * [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B)
  * [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6)

+ # 🏆 Benchmarks
+
+ ### Open LLM Leaderboard
+ | Model | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
+ |------------------------|--------:|-----:|----------:|-----:|-----------:|-----------:|------:|
+ | mayacinka/ExpertRamonda-7Bx2_MoE | / | 86.87 | 87.51 | / | 61.63 | / | / |
+
+ ### MMLU
+ | Groups            | Version | Filter | n-shot | Metric | Value  |   | Stderr |
+ |-------------------|---------|--------|--------|--------|-------:|---|-------:|
+ | mmlu              | N/A     | none   | 0      | acc    | 0.6163 | ± | 0.0039 |
+ | - humanities      | N/A     | none   | None   | acc    | 0.5719 | ± | 0.0067 |
+ | - other           | N/A     | none   | None   | acc    | 0.6936 | ± | 0.0079 |
+ | - social_sciences | N/A     | none   | None   | acc    | 0.7121 | ± | 0.0080 |
+ | - stem            | N/A     | none   | None   | acc    | 0.5128 | ± | 0.0085 |
+
  ## 🧩 Configuration

  ```yaml