soonchang committed on
Commit fef0e98 · verified · 1 Parent(s): e765ce2

Update README

Files changed (1): README.md added (+65 lines)
---
task_categories:
- question-answering
language:
- ms
tags:
- knowledge
pretty_name: MalayMMLU
size_categories:
- 10K<n<100K
---
# MalayMMLU

Released on September 27, 2024

<h4 align="center">
<p align="center" style="display: flex; flex-direction: row; justify-content: center; align-items: center">
📄 <a href="https://openreview.net/pdf?id=VAXwQqkp5e" target="_blank" style="margin-right: 15px; margin-left: 10px">Paper</a> •
🤗 <a href="https://github.com/UMxYTL-AI-Labs/MalayMMLU" target="_blank" style="margin-left: 10px">GitHub</a>
</p>
</h4>

# Introduction

MalayMMLU is the first multitask language understanding (MLU) benchmark for the Malay language. The benchmark comprises 24,213 questions spanning both primary (Year 1-6) and secondary (Form 1-5) education levels in Malaysia, covering 5 broad categories that are further divided into 22 subjects.
<p align="center">
<img src="imgs/MalayMMLU.png" width="250" >
</p>

| **Category** | **Subjects** |
|--------------|--------------|
| **STEM** | Computer Science (Secondary), Biology (Secondary), Chemistry (Secondary), Computer Literacy (Secondary), Mathematics (Primary, Secondary), Additional Mathematics (Secondary), Design and Technology (Primary, Secondary), Core Science (Primary, Secondary), Information and Communication Technology (Primary), Automotive Technology (Secondary) |
| **Language** | Malay Language (Primary, Secondary) |
| **Social Science** | Geography (Secondary), Local Studies (Primary), History (Primary, Secondary) |
| **Others** | Life Skills (Primary, Secondary), Principles of Accounting (Secondary), Economics (Secondary), Business (Secondary), Agriculture (Secondary) |
| **Humanities** | Quran and Sunnah (Secondary), Islam (Primary, Secondary), Sports Science Knowledge (Secondary) |

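To explore the data programmatically, the benchmark can be loaded with the 🤗 `datasets` library. The snippet below is a minimal sketch: the repository id, split name, and column names (`category`, `subject`) are assumptions made for illustration — check this dataset page's viewer or the GitHub repository for the actual schema.

```python
from collections import Counter

from datasets import load_dataset

# Assumed repository id and split name; substitute the ones shown on this
# dataset page if they differ.
ds = load_dataset("UMxYTL-AI-Labs/MalayMMLU", split="test")

print(ds)     # row count and column names
print(ds[0])  # inspect a single question record

# "category" and "subject" are assumed column names corresponding to the
# groupings shown in the table above.
print(Counter(ds["category"]))
print(len(set(ds["subject"])))
```
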
# Results

### Zero-shot results of LLMs on MalayMMLU (first-token accuracy)

| **Model** | **Language** | **Humanities** | **STEM** | **Social Science** | **Others** | **Average** |
|-----------|--------------|----------------|----------|--------------------|------------|-------------|
| Random | 38.01 | 42.09 | 36.31 | 36.01 | 38.07 | 38.02 |
| GPT-4 | **82.90** | **83.91** | **78.80** | **77.29** | **77.33** | **80.11** |
| GPT-3.5 | 69.62 | 71.01 | 67.17 | 66.70 | 63.73 | 67.78 |
| [LLaMA-3 (8B)](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) | 63.93 | 66.21 | 62.26 | 62.97 | 61.38 | 63.46 |
| [LLaMA-2 (13B)](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) | 45.58 | 50.72 | 44.13 | 44.55 | 40.87 | 45.26 |
| [LLaMA-2 (7B)](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) | 47.47 | 52.74 | 48.71 | 50.72 | 48.19 | 49.61 |
| [Mistral-v0.3 (7B)](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3) | 56.97 | 59.29 | 57.14 | 58.28 | 56.56 | 57.71 |
| [Mistral-v0.2 (7B)](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) | 56.23 | 59.86 | 57.10 | 56.65 | 55.22 | 56.92 |
| [Sailor (7B)](https://huggingface.co/sail/Sailor-7B-Chat) | 74.54 | 68.62 | 62.79 | 64.69 | 63.61 | 67.58 |
| [SeaLLM-v2.5 (7B)](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5) | 69.75 | 67.94 | 65.29 | 62.66 | 63.61 | 65.89 |
| [Phi-3 (14B)](https://huggingface.co/microsoft/Phi-3-medium-4k-instruct) | 60.07 | 58.89 | 60.91 | 58.73 | 55.24 | 58.72 |
| [Phi-3 (3.8B)](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) | 52.24 | 55.52 | 54.81 | 53.70 | 51.74 | 53.43 |
| [GLM-4 (9B)](https://huggingface.co/THUDM/glm-4-9b-chat) | 58.51 | 60.48 | 56.32 | 55.04 | 53.97 | 56.87 |
| [Qwen-1.5 (7B)](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) | 60.13 | 59.14 | 58.62 | 54.26 | 54.67 | 57.18 |
| [Qwen-1.5 (4B)](https://huggingface.co/Qwen/Qwen1.5-4B-Chat) | 48.39 | 52.01 | 51.37 | 50.00 | 49.10 | 49.93 |
| [Qwen-1.5 (1.8B)](https://huggingface.co/Qwen/Qwen1.5-1.8B-Chat) | 42.70 | 43.37 | 43.68 | 43.12 | 44.42 | 43.34 |
| [Gemma (7B)](https://huggingface.co/google/gemma-7b-it) | 45.53 | 50.92 | 46.13 | 47.33 | 46.27 | 47.21 |
| [Gemma (2B)](https://huggingface.co/google/gemma-2b-it) | 46.50 | 51.15 | 49.20 | 48.06 | 48.79 | 48.46 |
| [Baichuan-2 (7B)](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat) | 40.41 | 47.35 | 44.37 | 46.33 | 43.54 | 44.30 |
| [Komodo (7B)](https://huggingface.co/Yellow-AI-NLP/komodo-7b-base) | 43.62 | 45.53 | 39.34 | 39.75 | 39.48 | 41.72 |
| [MaLLaM-v2 (5B)](https://huggingface.co/mesolitica/mallam-5b-20k-instructions-v2) | 42.56 | 46.42 | 42.16 | 40.81 | 38.81 | 42.07 |
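
As the heading indicates, scores are first-token accuracy: instead of generating a full answer, the model's prediction is read from the logits of the first token it would emit, restricted to the answer-option letters. The sketch below illustrates that idea only; the model choice, prompt format, option letters, and tokenizer handling are assumptions, and the official evaluation scripts in the GitHub repository above are authoritative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model and prompt format, not the official MalayMMLU harness.
MODEL = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(
    MODEL, torch_dtype=torch.float16, device_map="auto"
)

def first_token_answer(prompt: str, options=("A", "B", "C", "D")) -> str:
    """Pick the option letter with the highest next-token logit (no generation)."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        next_logits = model(**inputs).logits[0, -1]  # logits over the vocabulary
    # One token id per option letter; a real harness would also handle
    # leading-space variants and tokenizer-specific quirks.
    option_ids = [tokenizer.encode(opt, add_special_tokens=False)[0] for opt in options]
    return options[int(torch.argmax(next_logits[option_ids]))]

# Hypothetical Malay-style prompt skeleton for illustration.
prompt = "Soalan: ...\nA. ...\nB. ...\nC. ...\nD. ...\nJawapan:"
print(first_token_answer(prompt))
```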