Model Name,Overall Accuracy,Correct Predictions,Total Questions,Timestamp,Team Name
Gemma-2-9b-it,69.1,12150,16186,2024-12-23 00:00:00,Gemma-2-9b-it
Qwen2.5-7B-Instruct,68.4,6496,9497,,
Ministral-8B-Instruct-2410,63.6,6040,9497,,
Qwen2.5-3B-Instruct,60.6,5755,9497,,
Llama-3.1-8B-Instruct,57.1,5423,9497,,
Phi-3.5-mini-instruct,54.8,5204,9497,,
Llama-3.2-3B-Instruct,42.0,3989,9497,,
Qwen2.5-1.5B-Instruct,43.0,4084,9497,,
Granite-3.1-8b-instruct,57.7,5480,9497,,
Internlm2_5-7b-chat,58.6,5566,9497,,
Yi-1.5-9B-Chat,67.7,6430,9497,,
Gemma-2-2b-it,31.2,2964,9497,,
Llama-3.2-1B-Instruct,31.1,2954,9497,,
Olmo-2-1124-7B-Instruct,42.9,4075,9497,,
Falcon3-7B-Instruct,46.8,4445,9497,,
Falcon3-10B-Instruct,49.1,4664,9497,,
Yi-1.5-6B-Chat,54.7,5195,9497,,
wr,28.9,4677,16186,2025-03-24 20:56:39,rrrr