| Column | Dtype | Lengths / Range / Classes |
|---|---|---|
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 6 values |
| T | stringclasses | 6 values |
| Weight type | stringclasses | 2 values |
| Architecture | stringclasses | 52 values |
| Model | stringlengths | 355–689 |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 1.03–52 |
| Hub License | stringclasses | 26 values |
| Hub ❤️ | int64 | 0–5.9k |
| #Params (B) | int64 | -1–140 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.27–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 424 values |
| Submission Date | stringclasses | 169 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |
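Rows with this schema can be worked with as plain records once loaded. A minimal sketch in plain Python, using two records transcribed from the data below (only a few of the 36 fields are shown; field names follow the schema above):

```python
# Two rows transcribed from the leaderboard data below,
# keyed by the schema's column names (subset of fields).
rows = [
    {
        "fullname": "lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3",
        "Average ⬆️": 16.271668,
        "#Params (B)": 1,
        "MoE": False,
    },
    {
        "fullname": "lemon07r/Gemma-2-Ataraxy-v4d-9B",
        "Average ⬆️": 33.172396,
        "#Params (B)": 10,
        "MoE": False,
    },
]

# Rank by the leaderboard's average score, best first.
ranked = sorted(rows, key=lambda r: r["Average ⬆️"], reverse=True)
best = ranked[0]["fullname"]
```

Note that some records omit empty fields (e.g. a missing Hub License), so a parser of the raw dump cannot assume a fixed field count per record.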
lalainy_ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1
7865b6f386969b831e9c1754914463154fecbda2
3.598134
apache-2.0
0
0
true
false
false
false
0.525273
0.143708
14.370758
0.303195
2.929449
0
0
0.234899
0
0.364604
2.942187
0.112118
1.34641
false
false
2024-11-09
2024-11-12
0
lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1
lalainy_ECE-PRYMMAL-YL-1B-SLERP-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-1B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3
eef4293be744aef0524f00a7657e915a6601a459
16.271668
apache-2.0
0
1
true
false
false
false
0.593107
0.325009
32.500875
0.422455
18.228655
0.086858
8.685801
0.294463
5.928412
0.421281
10.826823
0.293135
21.459441
false
false
2024-11-12
2024-11-12
0
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3
lalainy_ECE-PRYMMAL-YL-1B-SLERP-V4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-1B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4
dfa5e42b6f4f83cacc3b9e7d0ff05fec7f941835
16.148858
apache-2.0
0
1
true
false
false
false
0.611871
0.332353
33.23526
0.417074
17.411752
0.083082
8.308157
0.286074
4.809843
0.430615
12.09349
0.289312
21.034648
false
false
2024-11-12
2024-11-12
0
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4
lalainy_ECE-PRYMMAL-YL-6B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-6B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1
56789ff5fcc863460fce652ebe6ed6bb5a4bd30c
19.860808
apache-2.0
0
6
true
false
false
false
0.50158
0.326407
32.640727
0.462937
24.515259
0.116314
11.63142
0.288591
5.145414
0.486396
20.632813
0.321393
24.599217
false
false
2024-11-08
2024-11-08
0
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1
lalainy_ECE-PRYMMAL-YL-6B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-6B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2
18d282d0206ae8f878a9cfa80ce4eaf042056569
19.835413
apache-2.0
0
6
true
false
false
false
0.498131
0.324884
32.488353
0.462937
24.515259
0.116314
11.63142
0.288591
5.145414
0.486396
20.632813
0.321393
24.599217
false
false
2024-11-09
2024-11-09
0
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2
langgptai_Qwen-las-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/langgptai/Qwen-las-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">langgptai/Qwen-las-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/langgptai__Qwen-las-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
langgptai/Qwen-las-v0.1
a7a4d4945d28bac955554c9abd2f74a71ebbf22f
11.394004
other
0
7
true
false
false
true
1.798096
0.330104
33.010412
0.389255
14.69864
0.022659
2.265861
0.246644
0
0.370094
3.661719
0.232547
14.727394
false
false
2024-05-26
2024-06-27
1
Qwen/Qwen1.5-4B-Chat
langgptai_qwen1.5-7b-chat-sa-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/langgptai/qwen1.5-7b-chat-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">langgptai/qwen1.5-7b-chat-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/langgptai__qwen1.5-7b-chat-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
langgptai/qwen1.5-7b-chat-sa-v0.1
5f4f5e69ac7f1d508f8369e977de208b4803444b
16.580171
other
0
15
true
false
false
true
1.464321
0.426774
42.677429
0.432527
20.302342
0.030211
3.021148
0.312081
8.277405
0.355146
3.059896
0.299285
22.142804
false
false
2024-05-30
2024-06-27
1
Qwen/Qwen1.5-7B-Chat
leafspark_Llama-3.1-8B-MultiReflection-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/leafspark/Llama-3.1-8B-MultiReflection-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">leafspark/Llama-3.1-8B-MultiReflection-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/leafspark__Llama-3.1-8B-MultiReflection-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
leafspark/Llama-3.1-8B-MultiReflection-Instruct
b748441154efdbd7690d773b0194197bfc136ed0
26.311881
llama3.1
5
8
true
false
false
true
0.848447
0.712538
71.253829
0.500909
28.448045
0.136707
13.670695
0.292785
5.704698
0.368198
8.52474
0.372424
30.269282
false
false
2024-09-15
2024-09-15
1
leafspark/Llama-3.1-8B-MultiReflection-Instruct (Merge)
lemon07r_Gemma-2-Ataraxy-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-9B
fb22193268c7a6c3b4598255999ce2de3af8c256
22.465284
gemma
66
10
true
false
false
false
2.910688
0.300877
30.087723
0.59313
42.031991
0.010574
1.057402
0.334732
11.297539
0.442427
14.470052
0.422623
35.847001
true
false
2024-08-14
2024-08-27
1
lemon07r/Gemma-2-Ataraxy-9B (Merge)
lemon07r_Gemma-2-Ataraxy-Advanced-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-Advanced-9B
960654f5780f0b458367a6b591ad8440892c2aad
25.109694
2
10
false
false
false
false
3.227277
0.551596
55.159643
0.588907
41.161438
0.003776
0.377644
0.33557
11.409396
0.376073
6.509115
0.424368
36.040928
false
false
2024-09-30
2024-09-30
1
lemon07r/Gemma-2-Ataraxy-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-Remix-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-Remix-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-Remix-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-Remix-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-Remix-9B
f917a9be9f86d58fe122d58ba84cf4b08e4a975e
29.261671
3
10
false
false
false
false
2.157441
0.708342
70.834164
0.589202
41.592313
0.015861
1.586103
0.338926
11.856823
0.437188
13.715104
0.42387
35.98552
false
false
2024-09-30
2024-09-30
1
lemon07r/Gemma-2-Ataraxy-Remix-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v2-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v2-9B
77aca48ac25eb2cbe8c0751a4ef77e5face34d80
19.17383
12
10
false
false
false
false
2.996242
0.213624
21.362429
0.576584
39.796854
0.009063
0.906344
0.342282
12.304251
0.348385
4.88151
0.422124
35.791593
false
false
2024-09-28
2024-09-28
1
lemon07r/Gemma-2-Ataraxy-v2-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v2a-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v2a-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v2a-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v2a-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v2a-9B
899fb093d80569fc919f53217e3acf031dde89a5
15.019361
1
10
false
false
false
false
2.981396
0.159469
15.94691
0.518249
31.198528
0
0
0.339765
11.96868
0.316479
3.059896
0.351479
27.942154
false
false
2024-09-29
2024-09-29
1
lemon07r/Gemma-2-Ataraxy-v2a-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v2f-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v2f-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v2f-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v2f-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v2f-9B
44da9d6a9bc7be5a9af24fb0951047849d5f717d
18.765944
1
10
false
false
false
false
3.398596
0.379114
37.911408
0.519285
31.421336
0
0
0.338926
11.856823
0.323146
3.593229
0.350316
27.812869
false
false
2024-09-30
2024-09-30
1
lemon07r/Gemma-2-Ataraxy-v2f-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3-Advanced-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B
318afe2b44a150780e44483a0f90a499e81f946f
28.333877
3
10
false
false
false
false
2.787996
0.660182
66.018165
0.593515
42.210472
0.001511
0.151057
0.336409
11.521253
0.444969
14.58776
0.419631
35.514554
false
false
2024-10-09
2024-10-09
1
lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3b-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3b-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3b-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3b-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3b-9B
de8bbacddabf22dad89658d3b3d358b3eccbd59c
29.016329
1
9
false
false
false
false
2.303728
0.680914
68.091442
0.59077
41.623985
0.024924
2.492447
0.333054
11.073826
0.448875
15.209375
0.420462
35.6069
false
false
2024-10-08
2024-10-08
1
lemon07r/Gemma-2-Ataraxy-v3b-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3i-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3i-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3i-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3i-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3i-9B
8bd1ce81b6f42ebeebd9957b605c7313eedbe0a8
21.293828
3
9
false
false
false
false
3.47551
0.420305
42.030479
0.562575
38.238825
0.001511
0.151057
0.32802
10.402685
0.318062
1.757812
0.416639
35.182107
false
false
2024-10-06
2024-10-06
1
lemon07r/Gemma-2-Ataraxy-v3i-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3j-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3j-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3j-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3j-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3j-9B
7ad4a1bf604f37bd82f3470dbc24870896d7287d
21.180096
1
9
false
false
false
false
3.413822
0.416933
41.693263
0.563229
38.166569
0.000755
0.075529
0.32802
10.402685
0.318031
1.920573
0.413398
34.821956
false
false
2024-10-09
2024-10-09
1
lemon07r/Gemma-2-Ataraxy-v3j-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4-Advanced-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B
bc9edb78753fc60a22268cd91e93e43dd9fbc648
30.935503
5
10
false
false
false
false
2.329564
0.701547
70.154745
0.602363
43.181897
0.067221
6.722054
0.338926
11.856823
0.458052
16.289844
0.436669
37.407654
false
false
2024-10-13
2024-10-14
1
lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4a-Advanced-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4a-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B
78dca140ec1b704233c932706fc9640404433cc5
30.164008
3
10
false
false
false
false
2.289618
0.713512
71.351237
0.598839
42.737517
0.024169
2.416918
0.34396
12.527964
0.448906
15.179948
0.430934
36.770464
false
false
2024-10-14
2024-10-14
1
lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4b-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4b-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4b-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4b-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4b-9B
70dc6ddfaede76ff01584922fca53ba90837cd52
31.290605
2
10
false
false
false
false
2.425017
0.687834
68.783384
0.603916
43.442739
0.102719
10.271903
0.340604
12.080537
0.455479
15.868229
0.435672
37.296838
false
false
2024-10-16
2024-10-22
1
lemon07r/Gemma-2-Ataraxy-v4b-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4c-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4c-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4c-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4c-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4c-9B
26f2619a432266a5f73c135804b1aa34f00ec689
32.878262
4
10
false
false
false
false
2.765238
0.694528
69.45283
0.608432
44.125367
0.194864
19.486405
0.333893
11.185682
0.452781
15.297656
0.439495
37.721631
false
false
2024-10-16
2024-10-16
1
lemon07r/Gemma-2-Ataraxy-v4c-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4d-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4d-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4d-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4d-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4d-9B
24f9ad78e42c92df5277b3aea4deb4083a8625d9
33.172396
gemma
6
10
true
false
false
false
2.685277
0.725003
72.500299
0.605416
43.595239
0.169184
16.918429
0.347315
12.975391
0.454146
15.868229
0.434591
37.176788
true
false
2024-10-25
2024-10-25
1
lemon07r/Gemma-2-Ataraxy-v4d-9B (Merge)
lemon07r_Llama-3-RedMagic4-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Llama-3-RedMagic4-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Llama-3-RedMagic4-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Llama-3-RedMagic4-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Llama-3-RedMagic4-8B
65ee08a0434f1903a8971640fc3cca6c8ae8590e
19.493931
llama3
0
8
true
false
false
true
0.798596
0.486401
48.640053
0.425605
19.475747
0.093656
9.365559
0.290268
5.369128
0.376635
4.379427
0.367603
29.733673
true
false
2024-06-19
2024-06-26
1
lemon07r/Llama-3-RedMagic4-8B (Merge)
lemon07r_llama-3-NeuralMahou-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/llama-3-NeuralMahou-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/llama-3-NeuralMahou-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__llama-3-NeuralMahou-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/llama-3-NeuralMahou-8b
59a0937df85f9d6d65d15dbb4a7c06b6ad8a0305
19.87125
llama3
1
8
true
false
false
true
0.846228
0.490097
49.009739
0.418411
18.692069
0.103474
10.347432
0.288591
5.145414
0.387271
6.142188
0.369016
29.890662
true
false
2024-05-30
2024-06-26
1
lemon07r/llama-3-NeuralMahou-8b (Merge)
lesubra_ECE-EIFFEL-3B_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-EIFFEL-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-EIFFEL-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-EIFFEL-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-EIFFEL-3B
aa56433ac824d245ac82d5e55ce8e589df0711ec
22.013487
apache-2.0
0
3
true
false
false
false
1.539546
0.346941
34.694056
0.510158
31.286439
0.092145
9.214502
0.331376
10.850112
0.436229
14.695313
0.382064
31.340499
true
false
2024-10-01
2024-10-01
0
lesubra/ECE-EIFFEL-3B
lesubra_ECE-EIFFEL-3Bv2_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-EIFFEL-3Bv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-EIFFEL-3Bv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-EIFFEL-3Bv2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-EIFFEL-3Bv2
b059d1a0d49f09d6df34d93f133d24f6641bc535
22.108866
apache-2.0
0
3
true
false
false
false
0.860345
0.301303
30.130277
0.542401
36.353133
0.056647
5.664653
0.33557
11.409396
0.444292
15.769792
0.399934
33.325946
true
false
2024-10-03
2024-10-03
0
lesubra/ECE-EIFFEL-3Bv2
lesubra_ECE-EIFFEL-3Bv3_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-EIFFEL-3Bv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-EIFFEL-3Bv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-EIFFEL-3Bv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-EIFFEL-3Bv3
2cd31e58d38b96626a8a83192b5d2eec6669f5e2
25.337582
apache-2.0
0
3
true
false
false
false
0.717147
0.378614
37.86143
0.546945
36.464083
0.1571
15.70997
0.329698
10.626398
0.46751
18.305469
0.397523
33.058141
true
false
2024-10-07
2024-10-07
0
lesubra/ECE-EIFFEL-3Bv3
lesubra_ECE-PRYMMAL-3B-SLERP-V1_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP-V1
e46f1de93f10b1a57f9175653fd29dda355a61e6
22.002429
apache-2.0
0
3
true
false
false
false
0.713594
0.293284
29.328404
0.534059
35.053068
0.098187
9.818731
0.317114
8.948546
0.45951
16.638802
0.390043
32.227024
true
false
2024-10-28
2024-10-28
0
lesubra/ECE-PRYMMAL-3B-SLERP-V1
lesubra_ECE-PRYMMAL-3B-SLERP-V2_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP-V2
ba617ea0b1ed5497f62bf49635c30bcfb0547133
22.002429
apache-2.0
0
3
true
false
false
false
0.730325
0.293284
29.328404
0.534059
35.053068
0.098187
9.818731
0.317114
8.948546
0.45951
16.638802
0.390043
32.227024
true
false
2024-10-28
2024-10-28
0
lesubra/ECE-PRYMMAL-3B-SLERP-V2
lesubra_ECE-PRYMMAL-3B-SLERP_2-V1_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP_2-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP_2-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP_2-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP_2-V1
354e5c732dd2fde016da1e33a018d2d2787f7805
24.634133
apache-2.0
0
3
true
false
false
false
0.639428
0.364901
36.490069
0.541145
35.710681
0.148036
14.803625
0.321309
9.50783
0.466146
18.068229
0.399019
33.224365
true
false
2024-11-06
2024-11-06
0
lesubra/ECE-PRYMMAL-3B-SLERP_2-V1
lesubra_ECE-PRYMMAL-3B-SLERP_2-V2_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP_2-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP_2-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP_2-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP_2-V2
d5074a951206f946a6be331a74bd4fa381d348eb
24.659529
apache-2.0
0
3
true
false
false
false
0.535517
0.366424
36.642442
0.541145
35.710681
0.148036
14.803625
0.321309
9.50783
0.466146
18.068229
0.399019
33.224365
true
false
2024-11-06
2024-11-06
0
lesubra/ECE-PRYMMAL-3B-SLERP_2-V2
lesubra_merge-test_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/merge-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/merge-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__merge-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/merge-test
39895c64dd646443719873a2ab2b19d3afe4f86c
25.735642
apache-2.0
0
3
true
false
false
true
0.968526
0.538257
53.825738
0.524043
33.353311
0.100453
10.045317
0.322148
9.619687
0.441906
15.638281
0.387384
31.931516
true
false
2024-09-27
2024-09-27
0
lesubra/merge-test
lightblue_suzume-llama-3-8B-multilingual_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lightblue/suzume-llama-3-8B-multilingual</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lightblue__suzume-llama-3-8B-multilingual-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lightblue/suzume-llama-3-8B-multilingual
0cb15aa9ec685eef494f9a15f65aefcfe3c04c66
23.885601
other
105
8
true
false
false
true
0.84099
0.6678
66.780033
0.494995
28.895092
0.088369
8.836858
0.283557
4.474273
0.397687
7.844271
0.338348
26.483082
false
false
2024-04-23
2024-07-30
1
meta-llama/Meta-Llama-3-8B-Instruct
lightblue_suzume-llama-3-8B-multilingual-orpo-borda-full_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lightblue__suzume-llama-3-8B-multilingual-orpo-borda-full-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full
ac04e23fb8861c188f8ecddfecc4250b40aee04d
19.571597
cc-by-nc-4.0
2
8
true
false
false
true
0.802567
0.581746
58.174643
0.471422
25.075475
0.032477
3.247734
0.259228
1.230425
0.322188
4.040104
0.330951
25.6612
false
false
2024-04-25
2024-07-29
2
meta-llama/Meta-Llama-3-8B-Instruct
lightblue_suzume-llama-3-8B-multilingual-orpo-borda-half_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lightblue__suzume-llama-3-8B-multilingual-orpo-borda-half-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half
b82150a9840ba5ba93918c745adc70afc6ad2ce1
21.409092
cc-by-nc-4.0
14
8
true
false
false
true
0.886337
0.624911
62.491079
0.470746
26.348598
0.084592
8.459215
0.244966
0
0.351583
2.114583
0.36137
29.041076
false
false
2024-04-25
2024-06-29
2
meta-llama/Meta-Llama-3-8B-Instruct
lightblue_suzume-llama-3-8B-multilingual-orpo-borda-top25_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lightblue__suzume-llama-3-8B-multilingual-orpo-borda-top25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25
5a2f17238cc83932e00613d285f8bf6b8f4a0c3a
23.533711
cc-by-nc-4.0
3
8
true
false
false
true
0.834868
0.663654
66.365355
0.486464
27.665285
0.095166
9.516616
0.272651
3.020134
0.356604
4.808854
0.368434
29.82602
false
false
2024-04-26
2024-06-29
2
meta-llama/Meta-Llama-3-8B-Instruct
lightblue_suzume-llama-3-8B-multilingual-orpo-borda-top75_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lightblue__suzume-llama-3-8B-multilingual-orpo-borda-top75-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75
555f4a0092f239557e1aa34f9d489e8156b907bb
23.596767
cc-by-nc-4.0
2
8
true
false
false
true
0.946504
0.668725
66.872454
0.483332
28.056256
0.075529
7.55287
0.272651
3.020134
0.381688
5.310938
0.376912
30.767952
false
false
2024-04-26
2024-06-29
2
meta-llama/Meta-Llama-3-8B-Instruct
llmat_Mistral-v0.3-7B-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/llmat/Mistral-v0.3-7B-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmat/Mistral-v0.3-7B-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llmat__Mistral-v0.3-7B-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmat/Mistral-v0.3-7B-ORPO
868d8a51e8deb6fd948eabe5bc296c53bcf41073
12.084587
apache-2.0
1
7
true
false
false
true
1.259016
0.377041
37.70407
0.397766
14.863159
0.005287
0.528701
0.266779
2.237136
0.355521
2.973438
0.227809
14.20102
false
false
2024-08-04
2024-09-02
2
mistralai/Mistral-7B-v0.3
llmat_Mistral-v0.3-7B-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/llmat/Mistral-v0.3-7B-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmat/Mistral-v0.3-7B-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llmat__Mistral-v0.3-7B-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmat/Mistral-v0.3-7B-ORPO
868d8a51e8deb6fd948eabe5bc296c53bcf41073
12.024322
apache-2.0
1
7
true
false
false
true
0.626818
0.363976
36.397647
0.400466
15.591491
0.001511
0.151057
0.269295
2.572707
0.352854
2.973438
0.230136
14.459589
false
false
2024-08-04
2024-08-06
2
mistralai/Mistral-7B-v0.3
llnYou_ECE-PRYMMAL-YL-1B-SLERP-V5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llnYou__ECE-PRYMMAL-YL-1B-SLERP-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5
6facb36cea2f670e32d6571846f00aa4cf5aaa86
15.509045
apache-2.0
0
1
true
false
false
false
0.640178
0.331253
33.12533
0.423295
18.879659
0.09139
9.138973
0.286074
4.809843
0.386802
5.65026
0.293052
21.450207
false
false
2024-11-12
2024-11-12
0
llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5
llnYou_ECE-PRYMMAL-YL-1B-SLERP-V6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llnYou__ECE-PRYMMAL-YL-1B-SLERP-V6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6
f15fb39e40475348e7d349c3ec2f346ffca39377
9.357509
apache-2.0
0
1
true
false
false
false
0.555415
0.138762
13.876182
0.394403
14.538923
0
0
0.290268
5.369128
0.392792
7.365625
0.234957
14.995198
false
false
2024-11-13
2024-11-13
0
llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6
llnYou_ECE-PRYMMAL-YL-3B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llnYou__ECE-PRYMMAL-YL-3B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1
4918220543f4923137d20204a5ea396f65f6b956
11.475737
apache-2.0
0
2
true
false
false
false
0.579388
0.234633
23.4633
0.401842
15.797462
0
0
0.293624
5.816555
0.336448
3.222656
0.28499
20.554447
false
false
2024-11-12
2024-11-13
0
llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1
llnYou_ECE-PRYMMAL-YL-3B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llnYou__ECE-PRYMMAL-YL-3B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2
c3d4fbef1a10ef2746c47c0379b4247c784758e5
11.59947
apache-2.0
0
2
true
false
false
false
0.543809
0.230936
23.093614
0.398977
15.202244
0
0
0.276846
3.579418
0.358771
6.613021
0.289977
21.108525
false
false
2024-11-12
2024-11-13
0
llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2
llnYou_ECE-PRYMMAL-YL-3B-SLERP-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/llnYou__ECE-PRYMMAL-YL-3B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3
90648507743059de96334fdc97309b6f2af3d01d
22.910742
apache-2.0
0
3
true
false
false
false
0.543325
0.358081
35.8081
0.547312
36.625756
0.098943
9.89426
0.30453
7.270694
0.436135
14.05026
0.404338
33.815381
false
false
2024-11-13
2024-11-13
0
llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3
lmsys_vicuna-13b-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-13b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-13b-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lmsys__vicuna-13b-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-13b-v1.3
6566e9cb1787585d1147dcf4f9bc48f29e1328d2
10.284476
195
13
true
false
false
true
1.094233
0.334351
33.435063
0.33844
7.489789
0.005287
0.528701
0.267617
2.348993
0.372729
4.091146
0.224318
13.813165
false
true
2023-06-18
2024-06-28
0
lmsys/vicuna-13b-v1.3
lmsys_vicuna-7b-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lmsys__vicuna-7b-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.3
236eeeab96f0dc2e463f2bebb7bb49809279c6d6
8.311811
129
7
true
false
false
true
0.563378
0.290862
29.086158
0.329841
6.461379
0
0
0.24245
0
0.379333
5.016667
0.18376
9.306664
false
true
2023-06-18
2024-06-28
0
lmsys/vicuna-7b-v1.3
lmsys_vicuna-7b-v1.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lmsys__vicuna-7b-v1.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.5
3321f76e3f527bd14065daf69dad9344000a201d
10.784447
llama2
310
7
true
false
false
false
0.602718
0.235157
23.515716
0.394704
15.152509
0.007553
0.755287
0.258389
1.118568
0.423115
11.422656
0.214678
12.741947
false
true
2023-07-29
2024-06-12
0
lmsys/vicuna-7b-v1.5
lodrick-the-lafted_llama-3.1-8b-instruct-ortho-v7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lodrick-the-lafted__llama-3.1-8b-instruct-ortho-v7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7
6b7673cd78398c3a8c92f8e759aaae6409e96978
11.661762
wtfpl
0
8
true
false
false
false
0.931574
0.351462
35.14619
0.390691
14.437863
0.018127
1.812689
0.272651
3.020134
0.361594
4.732552
0.19739
10.821144
false
false
2024-07-25
2024-07-30
0
lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7
lordjia_Llama-3-Cantonese-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lordjia/Llama-3-Cantonese-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lordjia/Llama-3-Cantonese-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lordjia__Llama-3-Cantonese-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lordjia/Llama-3-Cantonese-8B-Instruct
ea98e9b1ab3ea0d66e5270816e43d7a70aaaa151
24.259121
llama3
4
8
true
false
false
true
0.767703
0.666926
66.692598
0.481415
26.791039
0.088369
8.836858
0.293624
5.816555
0.404604
9.475521
0.351479
27.942154
false
false
2024-07-16
2024-08-03
0
lordjia/Llama-3-Cantonese-8B-Instruct
lordjia_Qwen2-Cantonese-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lordjia/Qwen2-Cantonese-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lordjia/Qwen2-Cantonese-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lordjia__Qwen2-Cantonese-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lordjia/Qwen2-Cantonese-7B-Instruct
eb8b0faee749d167fd70e74f5e579094c4cfe7fb
23.640515
apache-2.0
2
7
true
false
false
true
1.016007
0.543528
54.352784
0.521531
32.453217
0.095921
9.592145
0.295302
6.040268
0.400385
7.814844
0.384309
31.589835
false
false
2024-07-13
2024-08-03
0
lordjia/Qwen2-Cantonese-7B-Instruct
lt-asset_nova-1.3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
NovaForCausalLM
<a target="_blank" href="https://huggingface.co/lt-asset/nova-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lt-asset/nova-1.3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lt-asset__nova-1.3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lt-asset/nova-1.3b
766eb459b5aa1e084b5474bb86ade09f9bed8fca
3.803298
bsd-3-clause-clear
4
1
true
false
false
false
0.247747
0.121426
12.14256
0.317001
4.43762
0.009063
0.906344
0.249161
0
0.369781
3.75599
0.114195
1.577275
false
false
2024-01-20
2024-11-16
0
lt-asset/nova-1.3b
macadeliccc_Samantha-Qwen-2-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/macadeliccc/Samantha-Qwen-2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">macadeliccc/Samantha-Qwen-2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/macadeliccc__Samantha-Qwen-2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
macadeliccc/Samantha-Qwen-2-7B
59058972fa9b56d132d04589eb17cbba277c2826
24.976969
apache-2.0
2
7
true
false
false
true
1.339807
0.437715
43.771526
0.508234
31.411894
0.206193
20.619335
0.272651
3.020134
0.479948
20.160156
0.377909
30.878768
false
false
2024-06-15
2024-08-05
1
Qwen/Qwen2-7B
macadeliccc_magistrate-3.2-3b-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/macadeliccc/magistrate-3.2-3b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">macadeliccc/magistrate-3.2-3b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/macadeliccc__magistrate-3.2-3b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
macadeliccc/magistrate-3.2-3b-base
2a40ac9ca1904fca2c1e69573e27f0ff8039b738
5.970569
llama3.2
1
3
true
false
false
false
0.730343
0.11593
11.593018
0.33427
6.910281
0.006798
0.679758
0.260906
1.454139
0.397594
7.532552
0.168883
7.653664
false
false
2024-09-28
2024-10-01
1
meta-llama/Llama-3.2-3B
macadeliccc_magistrate-3.2-3b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/macadeliccc/magistrate-3.2-3b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">macadeliccc/magistrate-3.2-3b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/macadeliccc__magistrate-3.2-3b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
macadeliccc/magistrate-3.2-3b-it
122961278c97195dd59d67b244907359013e4de5
7.037724
llama3.2
0
3
true
false
false
true
0.702951
0.229187
22.918744
0.325651
5.323155
0.016616
1.661631
0.247483
0
0.376323
5.740365
0.159242
6.582447
false
false
2024-10-01
2024-10-01
2
meta-llama/Llama-3.2-3B
maldv_badger-kappa-llama-3-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/maldv/badger-kappa-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-kappa-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-kappa-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
maldv/badger-kappa-llama-3-8b
aa6863eb816ca6ad29453b8aaf846962c4328998
21.166688
llama3
2
8
true
false
false
true
0.959125
0.469464
46.946435
0.508493
30.153239
0.086103
8.610272
0.302852
7.04698
0.37651
4.297135
0.369515
29.94607
false
false
2024-06-02
2024-06-27
0
maldv/badger-kappa-llama-3-8b
maldv_badger-lambda-llama-3-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/maldv/badger-lambda-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-lambda-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-lambda-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
maldv/badger-lambda-llama-3-8b
8ef157d0d3c12212ca5e70d354869aed90e03f22
20.893498
cc-by-nc-4.0
10
8
true
false
false
true
1.111022
0.486076
48.607583
0.496349
28.10305
0.09139
9.138973
0.281879
4.250559
0.375365
4.520573
0.376662
30.740248
false
false
2024-06-10
2024-06-26
0
maldv/badger-lambda-llama-3-8b
maldv_badger-mu-llama-3-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/maldv/badger-mu-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-mu-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-mu-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
maldv/badger-mu-llama-3-8b
952a269bb1e6c18ee772c6d088e74d305df4425d
19.79347
cc-by-nc-4.0
3
8
true
false
false
true
0.904635
0.491946
49.194581
0.514288
30.513965
0.024169
2.416918
0.259228
1.230425
0.355458
5.698958
0.367354
29.705969
false
false
2024-06-27
2024-06-27
0
maldv/badger-mu-llama-3-8b
maldv_badger-writer-llama-3-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/maldv/badger-writer-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-writer-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-writer-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
maldv/badger-writer-llama-3-8b
1d8134d01af87e994571ae16ccd7b31cce42418f
21.046535
cc-by-nc-4.0
9
8
true
false
false
true
1.235811
0.530314
53.031401
0.486389
26.878361
0.072508
7.250755
0.28943
5.257271
0.358094
3.195052
0.375997
30.666371
true
false
2024-06-17
2024-06-26
1
maldv/badger-writer-llama-3-8b (Merge)
matouLeLoup_ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3
60c5853d376d4b62b19dd4c4741224d0246ec5b4
7.287062
apache-2.0
1
0
true
false
false
false
0.851772
0.187322
18.732186
0.323912
7.918512
0.030211
3.021148
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
false
2024-11-01
2024-11-01
0
matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3
matouLeLoup_ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis
3fd229bcc3b4d2502ed7f3bdd48ccb5c97e83212
7.287062
0
0
false
false
false
false
0.85094
0.187322
18.732186
0.323912
7.918512
0.030211
3.021148
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
false
2024-10-31
2024-10-31
0
matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis
matouLeLoup_ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis
455945ed4318bbeae008a253f877f56a68291b8b
7.287062
1
0
false
false
false
false
0.860742
0.187322
18.732186
0.323912
7.918512
0.030211
3.021148
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
false
2024-10-31
2024-10-31
0
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis
matouLeLoup_ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis
dd86c3d7f77748a0ba18d911ceb93358a69ce160
7.257345
1
0
false
false
false
false
0.903718
0.188246
18.824608
0.323279
8.079577
0.02719
2.719033
0.263423
1.789709
0.368479
4.126563
0.172041
8.00458
false
false
2024-10-25
2024-10-31
0
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis
matouLeLoup_ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis
7a9d848188a674302d64a865786d4508be19571a
5.812746
0
0
false
false
false
true
1.098951
0.165215
16.521496
0.302373
3.083352
0.009063
0.906344
0.256711
0.894855
0.427302
12.179427
0.111619
1.291002
false
false
2024-11-12
2024-11-12
0
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis
mattshumer_Reflection-Llama-3.1-70B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mattshumer/Reflection-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mattshumer/Reflection-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mattshumer__Reflection-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mattshumer/Reflection-Llama-3.1-70B
458962ed801fac4eadd01a91a2029a3a82f4cd84
26.561462
llama3.1
1,709
70
true
false
false
true
28.650846
0.65626
65.625984
0.599884
42.389445
0
0
0.260906
1.454139
0.412667
10.016667
0.458943
39.882535
false
false
2024-09-05
2024-09-09
2
meta-llama/Meta-Llama-3.1-70B
mattshumer_ref_70_e3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mattshumer/ref_70_e3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mattshumer/ref_70_e3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mattshumer__ref_70_e3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mattshumer/ref_70_e3
5d2d9dbb9e0bf61879255f63f1b787296fe524cc
30.737996
llama3.1
57
70
true
false
false
true
39.905309
0.629432
62.943213
0.650084
49.274467
0
0
0.33557
11.409396
0.43276
12.995052
0.530253
47.805851
false
false
2024-09-08
2024-09-08
2
meta-llama/Meta-Llama-3.1-70B
maywell_Qwen2-7B-Multilingual-RP_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/maywell/Qwen2-7B-Multilingual-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maywell/Qwen2-7B-Multilingual-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maywell__Qwen2-7B-Multilingual-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
maywell/Qwen2-7B-Multilingual-RP
487e8f0498419e4d1188f661dbb63bd629be4638
23.463467
apache-2.0
43
7
true
false
false
true
0.959413
0.434718
43.471766
0.506206
30.543561
0.225076
22.507553
0.29698
6.263982
0.369563
6.228646
0.385888
31.765293
false
false
2024-06-24
2024-09-05
0
maywell/Qwen2-7B-Multilingual-RP
meditsolutions_Llama-3.1-MedIT-SUN-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.1-MedIT-SUN-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.1-MedIT-SUN-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.1-MedIT-SUN-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.1-MedIT-SUN-8B
0c11abbaa40e76b538b8c0f9c50e965078999087
30.043102
llama3.1
0
8
true
false
false
true
0.71305
0.783729
78.372939
0.518692
32.001651
0.200151
20.015106
0.308725
7.829978
0.405625
9.636458
0.391622
32.402482
false
false
2024-11-06
2024-11-06
1
meditsolutions/Llama-3.1-MedIT-SUN-8B (Merge)
meditsolutions_Llama-3.2-SUN-1B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaMedITForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-1B-Instruct
538477c528ecd80f9537b0d4ea730b7b9b529115
15.108889
llama3.2
4
1
true
false
false
true
0.351744
0.641297
64.129731
0.34739
9.183739
0.046073
4.607251
0.24245
0
0.351365
4.053906
0.178108
8.678709
false
false
2024-11-27
2024-11-27
1
meditsolutions/Llama-3.2-SUN-1B-Instruct (Merge)
meditsolutions_Llama-3.2-SUN-1B-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-1B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-1B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-1B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-1B-chat
a67791cfc31d09c3e96bd8c62a386f6107378087
13.377015
llama3.2
1
1
true
false
false
true
0.727851
0.548174
54.81744
0.351446
8.690238
0.048338
4.833837
0.261745
1.565996
0.324917
1.047917
0.18376
9.306664
false
false
2024-11-03
2024-11-07
1
meditsolutions/Llama-3.2-SUN-1B-chat (Merge)
meditsolutions_Llama-3.2-SUN-2.4B-checkpoint-26000_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.4B-checkpoint-26000-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000
1300885555ca8bbed20a57cf0ec9f7ae014200c3
7.954663
llama3.2
1
2
true
false
false
true
0.81317
0.281394
28.139448
0.301775
2.895305
0.006798
0.679758
0.277685
3.691275
0.410333
8.491667
0.134475
3.830526
false
false
2024-09-27
2024-10-04
1
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000 (Merge)
meditsolutions_Llama-3.2-SUN-2.4B-checkpoint-34800_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.4B-checkpoint-34800-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800
ef65f05f577a69a1992349c8d33c96cd099844f7
8.042045
llama3.2
1
2
true
false
false
true
0.815533
0.250095
25.00953
0.316112
5.46618
0.001511
0.151057
0.286074
4.809843
0.40224
8.846615
0.135721
3.969046
false
false
2024-09-27
2024-10-05
1
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800 (Merge)
meditsolutions_Llama-3.2-SUN-2.4B-v1.0.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.4B-v1.0.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0
b8a31c62ab4acbd4c645fd882d899c4ec7280677
12.97725
llama3.2
1
2
true
false
false
true
5.271012
0.563687
56.368657
0.339083
7.211668
0.042296
4.229607
0.25755
1.006711
0.320948
3.01849
0.154255
6.028369
false
false
2024-09-27
2024-10-20
1
meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0 (Merge)
meditsolutions_Llama-3.2-SUN-2.5B-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.5B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.5B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.5B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.5B-chat
2bd68a18c0f7984f430acbc2efad76344177aba0
13.647831
llama3.2
1
2
true
false
false
true
2.194443
0.560414
56.041415
0.357473
9.409093
0.050604
5.060423
0.259228
1.230425
0.315521
1.106771
0.18135
9.038859
false
false
2024-09-27
2024-10-26
1
meditsolutions/Llama-3.2-SUN-2.5B-chat (Merge)
meditsolutions_MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune
a0ffd0cd00cab2245c1f0edcef4d1d8ead4c6d6e
14.377117
apache-2.0
0
7
true
false
false
true
1.0591
0.3655
36.550021
0.403485
16.138426
0.017372
1.73716
0.302852
7.04698
0.425344
11.567969
0.218999
13.222148
false
false
2024-11-12
2024-11-13
1
meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune (Merge)
meditsolutions_MSH-v1-Bielik-v2.3-Instruct-MedIT-merge_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__MSH-v1-Bielik-v2.3-Instruct-MedIT-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge
2db5e8871fb3be7e658e3bc6e2885d26b891b8b8
27.380011
apache-2.0
1
11
true
false
false
true
0.842741
0.581422
58.142174
0.567172
38.023435
0.137462
13.746224
0.345638
12.751678
0.438458
13.840625
0.349983
27.775931
false
false
2024-10-29
2024-11-06
1
meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge (Merge)
meditsolutions_MedIT-Mesh-3B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/MedIT-Mesh-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/MedIT-Mesh-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__MedIT-Mesh-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/MedIT-Mesh-3B-Instruct
469d1a58f7747c3d456b3308b5a7042df4ab49e3
27.562941
mit
0
3
true
false
false
true
0.530486
0.581422
58.142174
0.557552
37.547054
0.157855
15.785498
0.323826
9.8434
0.40476
10.595052
0.40118
33.464465
false
false
2024-11-01
2024-11-01
1
meditsolutions/MedIT-Mesh-3B-Instruct (Merge)
meditsolutions_SmolLM2-MedIT-Upscale-2B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/SmolLM2-MedIT-Upscale-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/SmolLM2-MedIT-Upscale-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__SmolLM2-MedIT-Upscale-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/SmolLM2-MedIT-Upscale-2B
5696c9ea7cbdee0f8ad1845f5a2dc7309f376143
15.167247
apache-2.0
4
2
true
false
false
true
0.336142
0.642921
64.292078
0.355112
10.514326
0.010574
1.057402
0.264262
1.901566
0.331365
2.453906
0.197058
10.784205
false
false
2024-12-02
2024-12-02
1
meditsolutions/SmolLM2-MedIT-Upscale-2B (Merge)
meetkai_functionary-small-v3.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meetkai/functionary-small-v3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meetkai/functionary-small-v3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meetkai__functionary-small-v3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meetkai/functionary-small-v3.1
8e43bc1d2e259b91799e704c410a95b8ca458121
21.641241
mit
16
8
true
false
false
true
0.70437
0.627458
62.745848
0.498178
28.616315
0.010574
1.057402
0.288591
5.145414
0.383365
6.18724
0.334857
26.095228
false
false
2024-07-26
2024-11-10
0
meetkai/functionary-small-v3.1
meraGPT_mera-mix-4x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/meraGPT/mera-mix-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meraGPT/mera-mix-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meraGPT__mera-mix-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meraGPT/mera-mix-4x7B
09d965c5ef9b66ce419986027e03a915cb869e43
17.854959
apache-2.0
18
24
true
false
false
true
1.664401
0.483178
48.317797
0.401899
17.486439
0.053625
5.362538
0.30453
7.270694
0.405656
9.273698
0.274767
19.418587
false
false
2024-04-13
2024-06-27
0
meraGPT/mera-mix-4x7B
meta-llama_Llama-2-13b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-13b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-13b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-13b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-13b-chat-hf
a2cb7a712bb6e5e736ca7f8cd98167f81a0b5bd8
11.016342
llama2
1,033
13
true
false
false
true
0.87457
0.398473
39.847272
0.334274
7.15538
0.006798
0.679758
0.231544
0
0.400729
8.157813
0.19232
10.257831
false
true
2023-07-13
2024-06-12
0
meta-llama/Llama-2-13b-chat-hf
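The GPQA pair in each record (raw vs. normalized) suggests the leaderboard rescales raw accuracy so that the 4-choice random-guess baseline of 0.25 maps to 0 and perfect accuracy maps to 100, clipping anything below baseline to 0. A sketch under that assumption (`normalize_gpqa` is a hypothetical helper name, not a leaderboard API), checked against the Llama-2-13b-chat-hf record above, which scores below baseline and therefore shows GPQA 0:

```python
def normalize_gpqa(raw: float) -> float:
    # Hypothetical helper: map the 4-choice random-guess baseline (0.25)
    # to 0 and perfect accuracy (1.0) to 100, clipping below-baseline
    # scores to 0 as the records in this dump appear to do.
    return max(0.0, (raw - 0.25) / (1.0 - 0.25) * 100.0)

# Llama-2-13b-chat-hf: raw 0.231544 is below baseline, listed GPQA is 0.
print(normalize_gpqa(0.231544))  # 0.0
# Llama-3.2-SUN-2.4B-checkpoint-26000: raw 0.277685, listed GPQA 3.691275.
print(normalize_gpqa(0.277685))  # ~3.69 (small drift from rounding of raw)
```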
meta-llama_Llama-2-13b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-13b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-13b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-13b-hf
5c31dfb671ce7cfe2d7bb7c04375e44c55e815b1
11.014834
llama2
574
13
true
false
false
false
1.11238
0.248247
24.824687
0.412562
17.22256
0.012085
1.208459
0.28104
4.138702
0.35375
3.385417
0.237783
15.309176
false
true
2023-07-13
2024-06-12
0
meta-llama/Llama-2-13b-hf
meta-llama_Llama-2-70b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-70b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-70b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-70b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-70b-chat-hf
e9149a12809580e8602995856f8098ce973d1080
12.746405
llama2
2,165
68
true
false
false
true
22.898455
0.495792
49.579228
0.304247
4.613767
0.009819
0.981873
0.264262
1.901566
0.368667
3.483333
0.243268
15.918661
false
true
2023-07-14
2024-06-12
0
meta-llama/Llama-2-70b-chat-hf
meta-llama_Llama-2-70b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-70b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-70b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-70b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-70b-hf
3aba440b59558f995867ba6e1f58f21d0336b5bb
18.309658
llama2
836
68
true
false
false
false
29.621247
0.240678
24.067807
0.547259
35.900062
0.028701
2.870091
0.302852
7.04698
0.412354
9.777604
0.371759
30.195405
false
true
2023-07-11
2024-06-12
0
meta-llama/Llama-2-70b-hf
meta-llama_Llama-2-7b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-7b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-7b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-7b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-7b-chat-hf
f5db02db724555f92da89c216ac04704f23d4590
9.395485
llama2
4,049
6
true
false
false
true
1.156879
0.398648
39.864781
0.311355
4.459172
0.006798
0.679758
0.253356
0.447427
0.367552
3.277344
0.1688
7.64443
false
true
2023-07-13
2024-08-30
0
meta-llama/Llama-2-7b-chat-hf
meta-llama_Llama-2-7b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-7b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-7b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-7b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-7b-hf
01c7f73d771dfac7d292323805ebc428287df4f9
8.730829
llama2
1,837
6
true
false
false
false
0.563095
0.251894
25.189386
0.34962
10.351417
0.01284
1.283988
0.266779
2.237136
0.370062
3.757813
0.186087
9.565233
false
true
2023-07-13
2024-06-12
0
meta-llama/Llama-2-7b-hf
meta-llama_Llama-3.1-8B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.1-8B
d04e592bb4f6aa9cfee91e2e20afa771667e1d4b
14.219455
llama3.1
1,178
8
true
false
false
false
0.713244
0.124598
12.459829
0.465959
25.304471
0.053625
5.362538
0.310403
8.053691
0.381188
8.715104
0.32879
25.421099
false
true
2024-07-14
2024-12-07
0
meta-llama/Llama-3.1-8B
meta-llama_Llama-3.2-1B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-1B
a7c18587d7f473bfea02aa5639aa349403307b54
4.031494
llama3.2
1,212
1
true
false
false
false
0.419129
0.147779
14.7779
0.311495
4.36603
0.002266
0.226586
0.228188
0
0.344729
2.557813
0.120346
2.260638
false
true
2024-09-18
2024-09-23
0
meta-llama/Llama-3.2-1B
meta-llama_Llama-3.2-1B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-1B-Instruct
d0a2081ed47e20ce524e8bc5d132f3fad2f69ff0
13.81372
llama3.2
619
1
true
false
false
true
0.404905
0.569831
56.983138
0.349685
8.742521
0.032477
3.247734
0.275168
3.355705
0.332854
2.973438
0.168218
7.579787
false
true
2024-09-18
2024-09-23
0
meta-llama/Llama-3.2-1B-Instruct
meta-llama_Llama-3.2-3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-3B
95c102307f55fbd6d18ddf28bfbcb537ffdc2806
8.58453
llama3.2
391
3
true
false
false
false
1.331813
0.133741
13.37407
0.390512
14.232665
0.012085
1.208459
0.267617
2.348993
0.357719
3.814844
0.248753
16.528147
false
true
2024-09-18
2024-09-27
0
meta-llama/Llama-3.2-3B
meta-llama_Llama-3.2-3B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-3B-Instruct
276b29ce8303c9b88966a9b32fc75692dce4d8e1
24.116534
llama3.2
751
3
true
false
false
true
1.271204
0.739316
73.931613
0.461007
24.059186
0.17145
17.145015
0.278523
3.803132
0.352854
1.373437
0.319481
24.38682
false
true
2024-09-18
2024-09-27
0
meta-llama/Llama-3.2-3B-Instruct
meta-llama_Llama-3.3-70B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.3-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.3-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.3-70B-Instruct

36.828841
llama3.3
760
70
true
false
false
false
38.279537
0.899758
89.97582
0.691931
56.561411
0.002266
0.226586
0.328859
10.514541
0.446125
15.565625
0.533162
48.129063
false
true
2024-11-26
2024-12-09
1
meta-llama/Llama-3.3-70B-Instruct (Merge)
meta-llama_Meta-Llama-3-70B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-70B
b4d08b7db49d488da3ac49adf25a6b9ac01ae338
26.667586
llama3
834
70
true
false
false
false
23.407186
0.160319
16.031906
0.646107
48.709813
0.183535
18.353474
0.397651
19.686801
0.451823
16.011198
0.470911
41.212323
false
true
2024-04-17
2024-06-12
0
meta-llama/Meta-Llama-3-70B
meta-llama_Meta-Llama-3-70B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-70B-Instruct
7129260dd854a80eb10ace5f61c20324b472b31c
36.510693
llama3
1,439
70
true
false
false
true
18.23915
0.809908
80.990771
0.65467
50.185133
0.253021
25.302115
0.286913
4.9217
0.415365
10.920573
0.520695
46.743868
false
true
2024-04-17
2024-06-12
1
meta-llama/Meta-Llama-3-70B
meta-llama_Meta-Llama-3-8B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-8B
62bd457b6fe961a42a631306577e622c83876cb6
13.463212
llama3
5,897
8
true
false
false
false
0.872568
0.145506
14.550615
0.459791
24.500764
0.035498
3.549849
0.305369
7.38255
0.361406
6.242448
0.320977
24.553044
false
true
2024-04-17
2024-06-12
0
meta-llama/Meta-Llama-3-8B
meta-llama_Meta-Llama-3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-8B-Instruct
e1945c40cd546c78e41f1151f4db032b271faeaa
23.908736
llama3
3,677
8
true
false
false
true
0.7975
0.74084
74.083986
0.498871
28.24495
0.086858
8.685801
0.259228
1.230425
0.356823
1.602865
0.366439
29.604388
false
true
2024-04-17
2024-06-12
0
meta-llama/Meta-Llama-3-8B-Instruct
meta-llama_Meta-Llama-3-8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-8B-Instruct
e1945c40cd546c78e41f1151f4db032b271faeaa
20.596571
llama3
3,677
8
true
false
false
false
0.949473
0.478232
47.82322
0.491026
26.795284
0.090634
9.063444
0.292785
5.704698
0.380542
5.401042
0.359126
28.791741
false
true
2024-04-17
2024-07-08
0
meta-llama/Meta-Llama-3-8B-Instruct
meta-llama_Meta-Llama-3.1-70B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3.1-70B
f7d3cc45ed4ff669a354baf2e0f05e65799a0bee
26.200216
llama3.1
319
70
true
false
false
true
13.601852
0.168438
16.843752
0.626007
46.399413
0.18429
18.429003
0.387584
18.344519
0.457188
16.581771
0.465426
40.602837
false
true
2024-07-14
2024-07-23
0
meta-llama/Meta-Llama-3.1-70B
meta-llama_Meta-Llama-3.1-70B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3.1-70B-Instruct
b9461463b511ed3c0762467538ea32cf7c9669f2
42.176313
llama3.1
744
70
true
false
false
true
26.802016
0.866885
86.688542
0.691729
55.927992
0.306647
30.664653
0.356544
14.205817
0.458063
17.691146
0.530918
47.879728
false
true
2024-07-16
2024-08-15
1
meta-llama/Meta-Llama-3.1-70B
meta-llama_Meta-Llama-3.1-8B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3.1-8B
e5c39e551424c763dbc3e58e32ef2999d33a6d8d
13.869066
llama3.1
1,178
8
true
false
false
true
3.598523
0.126996
12.699637
0.466614
25.29478
0.05136
5.135952
0.296141
6.152125
0.382521
8.981771
0.324551
24.950133
false
true
2024-07-14
2024-07-23
0
meta-llama/Meta-Llama-3.1-8B