Open LLM Leaderboard results for Qwen model releases (Hugging Face dataset-viewer export). Column schema:

| Column | dtype | Range / distinct values |
|---|---|---|
| eval_name | string | 12–111 chars |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 2 values |
| Architecture | string | 64 values |
| Model | string | 355–689 chars |
| fullname | string | 4–102 chars |
| Model sha | string | 0–40 chars |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | −1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 values |
| Submission Date | string | 263 values |
| Generation | int64 | 0–10 |
| Base Model | string | 4–102 chars |
Every record below shares these constant fields, factored out of the table: Weight type = Original, Available on the hub = true, Flagged = false, Merged = false, Official Providers = true. The `eval_name` field is the full model name with `/` replaced by `_` and the precision appended (e.g. `Qwen_Qwen1.5-1.8B-Chat_bfloat16`); the `T` field repeats the Type emoji; the raw `Model` field is an HTML snippet linking the model page on the Hub plus a 📑 link to its `open-llm-leaderboard/<org>__<model>-details` dataset. Type legend: 🟢 pretrained, 💬 chat models (RLHF, DPO, IFT, ...), 🔶 fine-tuned on domain-specific datasets, 🌸 multimodal. Benchmark cells show the scaled leaderboard score with the raw accuracy in parentheses (rounded to two and three decimals, respectively).

| Model | Precision | Type | Architecture | Avg ⬆️ | License | ❤️ | #Params (B) | MoE | Chat template | CO₂ (kg) | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO | Uploaded | Submitted | Gen | Base Model | Model sha |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Qwen/Qwen1.5-1.8B-Chat | bfloat16 | 💬 | Qwen2ForCausalLM | 9.26 | other | 50 | 1.837 | no | yes | 1.13 | 20.19 (0.202) | 5.91 (0.326) | 1.96 (0.020) | 6.38 (0.298) | 12.18 (0.426) | 8.93 (0.180) | 2024-01-30 | 2024-06-12 | 0 | Qwen/Qwen1.5-1.8B-Chat | e482ee3f73c375a627a16fdf66fd0c8279743ca6 |
| Qwen/Qwen1.5-110B | bfloat16 | 🟢 | Qwen2ForCausalLM | 29.83 | other | 97 | 111.21 | no | no | 142.54 | 34.22 (0.342) | 44.28 (0.610) | 24.70 (0.247) | 13.65 (0.352) | 13.71 (0.441) | 48.45 (0.536) | 2024-04-25 | 2024-06-13 | 0 | Qwen/Qwen1.5-110B | 16659038ecdcc771c1293cf47020fa7cc2750ee8 |
| Qwen/Qwen1.5-110B-Chat | bfloat16 | 💬 | Qwen2ForCausalLM | 33.13 | other | 125 | 111.21 | no | yes | 145.13 | 59.39 (0.594) | 44.98 (0.618) | 23.41 (0.234) | 12.19 (0.341) | 16.29 (0.452) | 42.50 (0.482) | 2024-04-25 | 2024-06-12 | 0 | Qwen/Qwen1.5-110B-Chat | 85f86cec25901f2dbd870a86e06756903c9a876a |
| Qwen/Qwen1.5-14B | bfloat16 | 🟢 | Qwen2ForCausalLM | 20.85 | other | 39 | 14.167 | no | no | 3.85 | 29.05 (0.291) | 30.06 (0.508) | 20.24 (0.202) | 5.93 (0.294) | 10.46 (0.419) | 29.37 (0.364) | 2024-01-22 | 2024-06-13 | 0 | Qwen/Qwen1.5-14B | dce4b190d34470818e5bec2a92cb8233aaa02ca2 |
| Qwen/Qwen1.5-14B-Chat | bfloat16 | 💬 | Qwen2ForCausalLM | 23.57 | other | 112 | 14.167 | no | yes | 2.68 | 47.68 (0.477) | 32.76 (0.523) | 15.26 (0.153) | 2.68 (0.270) | 13.93 (0.440) | 29.09 (0.362) | 2024-01-30 | 2024-06-12 | 0 | Qwen/Qwen1.5-14B-Chat | 9492b22871f43e975435455f5c616c77fe7a50ec |
| Qwen/Qwen1.5-32B | bfloat16 | 🟢 | Qwen2ForCausalLM | 27.30 | other | 85 | 32.512 | no | no | 119.93 | 32.97 (0.330) | 38.98 (0.572) | 30.29 (0.303) | 10.63 (0.330) | 12.04 (0.428) | 38.89 (0.450) | 2024-04-01 | 2024-06-13 | 0 | Qwen/Qwen1.5-32B | cefef80dc06a65f89d1d71d0adbc56d335ca2490 |
| Qwen/Qwen1.5-32B-Chat | bfloat16 | 💬 | Qwen2ForCausalLM | 29.26 | other | 109 | 32.512 | no | yes | 92.12 | 55.32 (0.553) | 44.55 (0.607) | 19.56 (0.196) | 7.49 (0.306) | 10.20 (0.416) | 38.41 (0.446) | 2024-04-03 | 2024-06-12 | 0 | Qwen/Qwen1.5-32B-Chat | 0997b012af6ddd5465d40465a8415535b2f06cfc |
| Qwen/Qwen1.5-4B | bfloat16 | 🟢 | Qwen2ForCausalLM | 11.77 | other | 36 | 3.95 | no | no | 3.28 | 24.45 (0.244) | 16.25 (0.405) | 5.29 (0.053) | 3.58 (0.277) | 4.82 (0.360) | 16.22 (0.246) | 2024-01-22 | 2024-06-13 | 0 | Qwen/Qwen1.5-4B | a66363a0c24e2155c561e4b53c658b1d3965474e |
| Qwen/Qwen1.5-4B-Chat | bfloat16 | 💬 | Qwen2ForCausalLM | 12.63 | other | 40 | 3.95 | no | yes | 1.73 | 31.57 (0.316) | 16.30 (0.401) | 2.79 (0.028) | 2.24 (0.267) | 7.36 (0.398) | 15.51 (0.240) | 2024-01-30 | 2024-06-12 | 0 | Qwen/Qwen1.5-4B-Chat | a7a4d4945d28bac955554c9abd2f74a71ebbf22f |
| Qwen/Qwen1.5-7B | bfloat16 | 🟢 | Qwen2ForCausalLM | 16.02 | other | 52 | 7.721 | no | no | 3.65 | 26.84 (0.268) | 23.08 (0.456) | 9.29 (0.093) | 6.49 (0.299) | 9.16 (0.410) | 21.29 (0.292) | 2024-01-22 | 2024-06-09 | 0 | Qwen/Qwen1.5-7B | 831096e3a59a0789a541415da25ef195ceb802fe |
| Qwen/Qwen1.5-7B-Chat | bfloat16 | 💬 | Qwen2ForCausalLM | 17.62 | other | 167 | 7.721 | no | yes | 2.16 | 43.71 (0.437) | 22.38 (0.451) | 6.27 (0.063) | 7.05 (0.303) | 4.64 (0.378) | 21.68 (0.295) | 2024-01-30 | 2024-06-12 | 0 | Qwen/Qwen1.5-7B-Chat | 5f4f5e69ac7f1d508f8369e977de208b4803444b |
| Qwen/Qwen1.5-MoE-A2.7B | bfloat16 | 🟢 | Qwen2MoeForCausalLM | 13.95 | other | 199 | 14.316 | yes | no | 19.09 | 26.60 (0.266) | 18.84 (0.411) | 9.29 (0.093) | 1.23 (0.259) | 7.97 (0.401) | 19.75 (0.278) | 2024-02-29 | 2024-06-13 | 0 | Qwen/Qwen1.5-MoE-A2.7B | 1a758c50ecb6350748b9ce0a99d2352fd9fc11c9 |
| Qwen/Qwen1.5-MoE-A2.7B-Chat | bfloat16 | 💬 | Qwen2MoeForCausalLM | 15.88 | other | 119 | 14.316 | yes | yes | 17.80 | 37.95 (0.380) | 20.04 (0.427) | 6.34 (0.063) | 3.24 (0.274) | 6.33 (0.390) | 21.37 (0.292) | 2024-03-14 | 2024-06-12 | 0 | Qwen/Qwen1.5-MoE-A2.7B-Chat | ec052fda178e241c7c443468d2fa1db6618996be |
| Qwen/Qwen2-0.5B | bfloat16 | 🟢 | Qwen2ForCausalLM | 7.22 | apache-2.0 | 137 | 0.494 | no | no | 2.63 | 18.73 (0.187) | 7.92 (0.324) | 2.64 (0.026) | 1.45 (0.261) | 4.60 (0.375) | 8.00 (0.172) | 2024-05-31 | 2024-11-30 | 0 | Qwen/Qwen2-0.5B | ff3a49fac17555b8dfc4db6709f480cc8f16a9fe |
| Qwen/Qwen2-0.5B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 6.59 | apache-2.0 | 179 | 0.494 | no | yes | 1.12 | 22.47 (0.225) | 5.88 (0.317) | 2.87 (0.029) | 0.00 (0.247) | 2.41 (0.335) | 5.90 (0.153) | 2024-06-03 | 2024-06-12 | 1 | Qwen/Qwen2-0.5B | c291d6fce4804a1d39305f388dd32897d1f7acc4 |
| Qwen/Qwen2-1.5B | bfloat16 | 🟢 | Qwen2ForCausalLM | 10.45 | apache-2.0 | 90 | 1.544 | no | no | 2.22 | 21.13 (0.211) | 11.78 (0.357) | 7.02 (0.070) | 1.90 (0.264) | 3.59 (0.366) | 17.24 (0.255) | 2024-05-31 | 2024-06-09 | 0 | Qwen/Qwen2-1.5B | 8a16abf2848eda07cc5253dec660bf1ce007ad7a |
| Qwen/Qwen2-1.5B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 14.14 | apache-2.0 | 140 | 1.544 | no | yes | 1.32 | 33.71 (0.337) | 13.70 (0.385) | 7.18 (0.072) | 1.57 (0.262) | 12.03 (0.429) | 16.68 (0.250) | 2024-06-03 | 2024-06-12 | 0 | Qwen/Qwen2-1.5B-Instruct | ba1cf1846d7df0a0591d6c00649f57e798519da8 |
| Qwen/Qwen2-57B-A14B | bfloat16 | 🟢 | Qwen2MoeForCausalLM | 25.03 | apache-2.0 | 50 | 57.409 | yes | no | 107.03 | 31.13 (0.311) | 38.88 (0.562) | 18.66 (0.187) | 7.49 (0.306) | 10.54 (0.417) | 43.51 (0.492) | 2024-05-22 | 2024-06-13 | 0 | Qwen/Qwen2-57B-A14B | 973e466c39ba76372a2ae464dbca0af3f5a5a2a9 |
| Qwen/Qwen2-57B-A14B-Instruct | bfloat16 | 💬 | Qwen2MoeForCausalLM | 33.02 | apache-2.0 | 80 | 57.409 | no | yes | 85.01 | 63.38 (0.634) | 41.79 (0.589) | 28.17 (0.282) | 10.85 (0.331) | 14.18 (0.436) | 39.73 (0.458) | 2024-06-04 | 2024-08-14 | 1 | Qwen/Qwen2-57B-A14B | 5ea455a449e61a92a5b194ee06be807647d3e8b5 |
| Qwen/Qwen2-72B | bfloat16 | 🟢 | Qwen2ForCausalLM | 35.46 | other | 200 | 72.706 | no | no | 128.12 | 38.24 (0.382) | 51.86 (0.662) | 31.12 (0.311) | 19.24 (0.394) | 19.73 (0.470) | 52.56 (0.573) | 2024-05-22 | 2024-06-26 | 0 | Qwen/Qwen2-72B | 87993795c78576318087f70b43fbf530eb7789e7 |
| Qwen/Qwen2-72B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 43.59 | other | 713 | 72.706 | no | no | 75.11 | 79.89 (0.799) | 57.48 (0.698) | 41.77 (0.418) | 16.33 (0.372) | 17.17 (0.456) | 48.92 (0.540) | 2024-05-28 | 2024-06-26 | 1 | Qwen/Qwen2-72B | 1af63c698f59c4235668ec9c1395468cb7cd7e79 |
| Qwen/Qwen2-7B | bfloat16 | 🟢 | Qwen2ForCausalLM | 23.93 | apache-2.0 | 154 | 7.616 | no | no | 2.56 | 31.49 (0.315) | 34.71 (0.532) | 20.39 (0.204) | 7.27 (0.305) | 14.32 (0.444) | 35.37 (0.418) | 2024-06-04 | 2024-06-09 | 0 | Qwen/Qwen2-7B | 453ed1575b739b5b03ce3758b23befdb0967f40e |
| Qwen/Qwen2-7B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 27.94 | apache-2.0 | 622 | 7.616 | no | yes | 2.08 | 56.79 (0.568) | 37.81 (0.554) | 27.64 (0.276) | 6.38 (0.298) | 7.37 (0.393) | 31.64 (0.385) | 2024-06-04 | 2024-06-12 | 1 | Qwen/Qwen2-7B | 41c66b0be1c3081f13defc6bdf946c2ef240d6a6 |
| Qwen/Qwen2-Math-72B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 38.02 | other | 88 | 72.706 | no | yes | 24.34 | 56.94 (0.569) | 47.96 (0.634) | 55.36 (0.554) | 15.77 (0.368) | 15.73 (0.452) | 36.36 (0.427) | 2024-08-08 | 2024-08-19 | 0 | Qwen/Qwen2-Math-72B-Instruct | 5c267882f3377bcfc35882f8609098a894eeeaa8 |
| Qwen/Qwen2-Math-7B | bfloat16 | 🔶 | Qwen2ForCausalLM | 12.02 | apache-2.0 | 14 | 7.616 | no | yes | 3.13 | 26.87 (0.269) | 14.06 (0.387) | 24.77 (0.248) | 1.79 (0.263) | 2.42 (0.359) | 2.19 (0.120) | 2024-08-08 | 2024-08-19 | 0 | Qwen/Qwen2-Math-7B | 47a44ff4136da8960adbab02b2326787086bcf6c |
| Qwen/Qwen2-VL-72B-Instruct | bfloat16 | 🌸 | Qwen2VLForConditionalGeneration | 39.54 | other | 281 | 73.406 | no | yes | 54.50 | 59.82 (0.598) | 56.31 (0.695) | 34.44 (0.344) | 18.34 (0.388) | 15.89 (0.449) | 52.41 (0.572) | 2024-09-17 | 2024-10-20 | 1 | Qwen/Qwen2-VL-72B-Instruct (Merge) | f400120e59a6196b024298b7d09fb517f742db7d |
| Qwen/Qwen2-VL-7B-Instruct | bfloat16 | 🌸 | Qwen2VLForConditionalGeneration | 26.49 | apache-2.0 | 1,160 | 8.291 | no | yes | 2.11 | 45.99 (0.460) | 35.88 (0.546) | 19.86 (0.199) | 9.28 (0.320) | 13.55 (0.438) | 34.39 (0.409) | 2024-08-28 | 2024-10-20 | 1 | Qwen/Qwen2-VL-7B-Instruct (Merge) | 51c47430f97dd7c74aa1fa6825e68a813478097f |
| Qwen/Qwen2.5-0.5B | bfloat16 | 🟢 | Qwen2ForCausalLM | 6.55 | apache-2.0 | 232 | 0.5 | no | no | 2.33 | 16.27 (0.163) | 6.95 (0.327) | 3.93 (0.039) | 0.00 (0.247) | 2.08 (0.343) | 10.06 (0.191) | 2024-09-15 | 2024-09-19 | 0 | Qwen/Qwen2.5-0.5B | 2630d3d2321bc1f1878f702166d1b2af019a7310 |
| Qwen/Qwen2.5-0.5B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 8.14 | apache-2.0 | 270 | 0.5 | no | yes | 0.63 | 30.71 (0.307) | 8.43 (0.334) | 0.00 (0.000) | 1.01 (0.258) | 0.94 (0.333) | 7.75 (0.170) | 2024-09-16 | 2024-09-19 | 1 | Qwen/Qwen2.5-0.5B | a8b602d9dafd3a75d382e62757d83d89fca3be54 |
| Qwen/Qwen2.5-0.5B-Instruct | float16 | 💬 | Qwen2ForCausalLM | 10.11 | apache-2.0 | 270 | 0.494 | no | yes | 1.24 | 31.53 (0.315) | 8.17 (0.332) | 10.35 (0.103) | 1.23 (0.259) | 1.37 (0.334) | 8.00 (0.172) | 2024-09-16 | 2024-10-16 | 1 | Qwen/Qwen2.5-0.5B | 7ae557604adf67be50417f59c2c2f167def9a775 |
| Qwen/Qwen2.5-1.5B | bfloat16 | 🟢 | Qwen2ForCausalLM | 13.85 | apache-2.0 | 89 | 1.5 | no | no | 2.50 | 26.74 (0.267) | 16.66 (0.408) | 9.14 (0.091) | 4.70 (0.285) | 5.27 (0.358) | 20.61 (0.285) | 2024-09-15 | 2024-09-19 | 0 | Qwen/Qwen2.5-1.5B | e5dfabbcffd9b0c7b31d89b82c5a6b72e663f32c |
| Qwen/Qwen2.5-1.5B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 18.43 | apache-2.0 | 373 | 1.5 | no | yes | 1.37 | 44.76 (0.448) | 19.81 (0.429) | 22.05 (0.221) | 0.78 (0.256) | 3.19 (0.366) | 19.99 (0.280) | 2024-09-17 | 2024-09-19 | 1 | Qwen/Qwen2.5-1.5B | 5fee7c4ed634dc66c6e318c8ac2897b8b9154536 |
| Qwen/Qwen2.5-14B | bfloat16 | 🟢 | Qwen2ForCausalLM | 31.95 | apache-2.0 | 102 | 14.77 | no | no | 8.73 | 36.94 (0.369) | 45.08 (0.616) | 29.00 (0.290) | 17.56 (0.382) | 15.91 (0.450) | 47.21 (0.525) | 2024-09-15 | 2024-09-19 | 0 | Qwen/Qwen2.5-14B | 83a1904df002b00bc8db6f877821cb77dbb363b0 |
| Qwen/Qwen2.5-14B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 41.31 | apache-2.0 | 209 | 14.77 | no | yes | 3.55 | 81.58 (0.816) | 48.36 (0.639) | 54.76 (0.548) | 9.62 (0.322) | 10.16 (0.410) | 43.38 (0.490) | 2024-09-16 | 2024-09-18 | 1 | Qwen/Qwen2.5-14B | f55224c616ca27d4bcf28969a156de12c98981cf |
| Qwen/Qwen2.5-14B-Instruct-1M | bfloat16 | 🔶 | Qwen2ForCausalLM | 41.56 | apache-2.0 | 286 | 14.77 | no | yes | 3.28 | 84.14 (0.841) | 45.66 (0.620) | 53.02 (0.530) | 12.42 (0.343) | 11.35 (0.418) | 42.77 (0.485) | 2025-01-23 | 2025-01-26 | 1 | Qwen/Qwen2.5-14B | b0c9f6e6f0123e755d47922c24818e488e21b93f |
| Qwen/Qwen2.5-32B | bfloat16 | 🟢 | Qwen2ForCausalLM | 38.01 | apache-2.0 | 129 | 32.764 | no | no | 11.75 | 40.77 (0.408) | 53.95 (0.677) | 35.65 (0.356) | 21.59 (0.412) | 22.70 (0.498) | 53.39 (0.581) | 2024-09-15 | 2024-09-19 | 0 | Qwen/Qwen2.5-32B | ff23665d01c3665be5fdb271d18a62090b65c06d |
| Qwen/Qwen2.5-32B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 46.60 | apache-2.0 | 239 | 32.764 | no | yes | 11.50 | 83.46 (0.835) | 56.49 (0.691) | 62.54 (0.625) | 11.74 (0.338) | 13.50 (0.426) | 51.85 (0.567) | 2024-09-17 | 2024-09-19 | 1 | Qwen/Qwen2.5-32B | 70e8dfb9ad18a7d499f765fe206ff065ed8ca197 |
| Qwen/Qwen2.5-3B | bfloat16 | 🟢 | Qwen2ForCausalLM | 18.10 | other | 83 | 3.086 | no | no | 5.39 | 26.90 (0.269) | 24.30 (0.461) | 14.80 (0.148) | 6.38 (0.298) | 11.76 (0.430) | 24.48 (0.320) | 2024-09-15 | 2024-09-27 | 0 | Qwen/Qwen2.5-3B | e4aa5ac50aa507415cda96cc99eb77ad0a3d2d34 |
| Qwen/Qwen2.5-3B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 27.16 | other | 220 | 3 | no | yes | 2.78 | 64.75 (0.647) | 25.80 (0.469) | 36.78 (0.368) | 3.02 (0.273) | 7.57 (0.397) | 25.05 (0.325) | 2024-09-17 | 2024-09-19 | 1 | Qwen/Qwen2.5-3B | 82f42baa094a9600e39ccd80d34058aeeb3abbc1 |
| Qwen/Qwen2.5-72B | bfloat16 | 🟢 | Qwen2ForCausalLM | 38.44 | other | 62 | 72.706 | no | no | 36.18 | 41.37 (0.414) | 54.62 (0.680) | 39.12 (0.391) | 20.69 (0.405) | 19.64 (0.477) | 55.20 (0.597) | 2024-09-15 | 2024-09-19 | 0 | Qwen/Qwen2.5-72B | 587cc4061cf6a7cc0d429d05c109447e5cf063af |
| Qwen/Qwen2.5-72B-Instruct | bfloat16 | 💬 | Qwen2ForCausalLM | 47.98 | other | 777 | 72.706 | no | yes | 47.65 | 86.38 (0.864) | 61.87 (0.727) | 59.82 (0.598) | 16.67 (0.375) | 11.74 (0.421) | 51.40 (0.563) | 2024-09-16 | 2024-10-16 | 1 | Qwen/Qwen2.5-72B | a13fff9ad76700c7ecff2769f75943ba8395b4a7 |
| Qwen/Qwen2.5-7B | bfloat16 | 🟢 | Qwen2ForCausalLM | 26.02 | apache-2.0 | 154 | 7.616 | no | no | 4.69 | 33.74 (0.337) | 35.81 (0.542) | 25.08 (0.251) | 9.96 (0.325) | 14.14 (0.442) | 37.39 (0.437) | 2024-09-15 | 2024-09-19 | 0 | Qwen/Qwen2.5-7B | 57597c00770845ceba45271ba1b24c94bbcc7baf |
Qwen_Qwen2.5-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-7B-Instruct
52e20a6f5f475e5c8f6a8ebda4ae5fa6b1ea22ac
35.200109
apache-2.0
577
7.616
true
false
false
true
3.240591
0.758525
75.852516
0.539423
34.892117
0.5
50
0.291107
5.480984
0.402031
8.453906
0.42869
36.521129
false
true
2024-09-16
2024-09-18
1
Qwen/Qwen2.5-7B
Qwen_Qwen2.5-7B-Instruct-1M_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-7B-Instruct-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-7B-Instruct-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-7B-Instruct-1M
49d7103d61e30d68e8cfdea8c419f3f39e5b9c15
32.763947
apache-2.0
274
7.616
true
false
false
true
1.252675
0.744762
74.476168
0.540394
35.026291
0.433535
43.353474
0.297819
6.375839
0.408698
9.520573
0.350482
27.831339
false
true
2025-01-23
2025-01-26
1
Qwen/Qwen2.5-7B
Qwen_Qwen2.5-Coder-14B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-14B
1db30eb5ec86a6e51d8981818ee2910370b3010d
24.829052
apache-2.0
34
14.77
true
false
false
true
7.26525
0.347265
34.726526
0.586486
40.523002
0.225076
22.507553
0.292785
5.704698
0.387365
6.38724
0.452128
39.125296
false
true
2024-11-08
2024-11-12
1
Qwen/Qwen2.5-Coder-14B (Merge)
Qwen_Qwen2.5-Coder-14B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-14B-Instruct
1a62978099f9b19f72fdd191988ff958abb18561
32.122834
apache-2.0
95
14.77
true
false
false
true
2.766428
0.690756
69.075608
0.61403
44.220018
0.324773
32.477341
0.30453
7.270694
0.391458
7.032292
0.393949
32.661052
false
true
2024-11-06
2024-11-12
1
Qwen/Qwen2.5-Coder-14B-Instruct (Merge)
Qwen_Qwen2.5-Coder-32B_float16
float16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-32B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-32B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-32B
2e12b5f7bc878d424d222e224ed40aee564ec45f
33.262363
apache-2.0
109
32.764
true
false
false
false
9.380501
0.436341
43.634113
0.640396
48.511213
0.308912
30.891239
0.346477
12.863535
0.452813
15.868229
0.530253
47.805851
false
true
2024-11-08
2024-12-10
1
Qwen/Qwen2.5-Coder-32B (Merge)
Qwen_Qwen2.5-Coder-32B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-32B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-32B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-32B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-32B-Instruct
b47205940b83b5b484577359f71ee7b88472df67
39.885472
apache-2.0
1,724
32.764
true
false
false
false
9.388779
0.726527
72.652673
0.662522
52.266515
0.495468
49.546828
0.348993
13.199105
0.438583
13.722917
0.441323
37.924793
false
true
2024-11-06
2024-12-10
1
Qwen/Qwen2.5-Coder-32B-Instruct (Merge)
Qwen_Qwen2.5-Coder-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-7B
097b213c52760d22753af1aa5cbdba94b5c99506
19.209491
apache-2.0
97
7.616
true
false
false
true
4.603534
0.344592
34.459235
0.485564
28.438944
0.191843
19.18429
0.259228
1.230425
0.344854
2.173438
0.367936
29.770612
false
true
2024-09-16
2024-09-21
1
Qwen/Qwen2.5-Coder-7B (Merge)
Qwen_Qwen2.5-Coder-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-7B-Instruct
3030861ab8e72c6155e1821631bf977ef40d3e5b
28.052321
apache-2.0
444
7.616
true
false
false
true
2.466292
0.610148
61.014774
0.500798
28.938504
0.371601
37.160121
0.291946
5.592841
0.407271
9.475521
0.335189
26.132166
false
true
2024-09-17
2024-11-07
1
Qwen/Qwen2.5-Coder-7B-Instruct (Merge)
Qwen_Qwen2.5-Coder-7B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Coder-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Coder-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Coder-7B-Instruct
f784f10a7b2aac91bd26e6dbe7dccce691cd4ac5
22.524516
apache-2.0
444
7.616
true
false
false
true
0.697647
0.614719
61.471895
0.499905
28.726578
0.030967
3.096677
0.293624
5.816555
0.409938
9.875521
0.335439
26.15987
false
true
2024-09-17
2024-11-07
1
Qwen/Qwen2.5-Coder-7B-Instruct (Merge)
Qwen_Qwen2.5-Math-1.5B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Math-1.5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Math-1.5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Math-1.5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Math-1.5B-Instruct
aafeb0fc6f22cbf0eaeed126eff8be45b0360a35
12.02487
apache-2.0
35
1.544
true
false
false
true
0.514377
0.185573
18.557317
0.375154
12.859775
0.26284
26.283988
0.265101
2.013423
0.368542
3.534375
0.180103
8.90034
false
true
2024-09-16
2025-03-06
2
Qwen/Qwen2.5-1.5B
Qwen_Qwen2.5-Math-72B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Math-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Math-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Math-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Math-72B-Instruct
3743c8fd46b002d105c1d28d180f1e531df1d40f
36.822864
other
27
72.706
true
false
false
true
43.056242
0.400347
40.034664
0.645227
48.966096
0.623867
62.386707
0.331376
10.850112
0.447271
16.342187
0.481217
42.357417
false
true
2024-09-16
2024-09-29
2
Qwen/Qwen2.5-72B
Qwen_Qwen2.5-Math-7B_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Math-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Math-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Math-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Math-7B
8daf1d676c3f24ddec5a99c5cff00a5c0e1c441c
17.836657
apache-2.0
76
7.616
true
false
false
true
2.679459
0.245998
24.59984
0.445464
22.008761
0.305136
30.513595
0.293624
5.816555
0.378094
4.995052
0.271775
19.086141
false
true
2024-09-16
2024-09-21
1
Qwen/Qwen2.5-7B
Qwen_Qwen2.5-Math-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-Math-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-Math-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2.5-Math-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2.5-Math-7B-Instruct
b3b4c5794bf4b68c1978bb3525afc5e0d0d6fcc4
21.768146
apache-2.0
65
7
true
false
false
true
2.278533
0.263584
26.358396
0.438763
21.489766
0.580816
58.081571
0.261745
1.565996
0.364729
2.891146
0.281998
20.222001
false
true
2024-09-19
2024-09-19
2
Qwen/Qwen2.5-7B
RDson_WomboCombo-R1-Coder-14B-Preview_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/RDson/WomboCombo-R1-Coder-14B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RDson/WomboCombo-R1-Coder-14B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RDson__WomboCombo-R1-Coder-14B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RDson/WomboCombo-R1-Coder-14B-Preview
199b45ea53952f969d8ff5517fb06c561cab39ee
41.456116
3
14.77
false
false
false
true
3.996106
0.628558
62.855778
0.63921
48.154142
0.598943
59.89426
0.321309
9.50783
0.484385
22.014844
0.516789
46.30984
false
false
2025-01-24
2025-02-06
1
RDson/WomboCombo-R1-Coder-14B-Preview (Merge)
RESMPDEV_EVA-Qwen2.5-1.5B-FRFR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/RESMPDEV/EVA-Qwen2.5-1.5B-FRFR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RESMPDEV/EVA-Qwen2.5-1.5B-FRFR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RESMPDEV__EVA-Qwen2.5-1.5B-FRFR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RESMPDEV/EVA-Qwen2.5-1.5B-FRFR
0e8da55d1655e132ee5fa341239cff31bd4e695f
14.185077
apache-2.0
0
1.544
true
false
false
true
1.822264
0.308172
30.817232
0.393241
15.629561
0.102719
10.271903
0.279362
3.914989
0.353938
4.808854
0.277011
19.667923
false
false
2024-12-31
2025-01-14
0
RESMPDEV/EVA-Qwen2.5-1.5B-FRFR
RESMPDEV_Qwen2-Wukong-0.5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/RESMPDEV/Qwen2-Wukong-0.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RESMPDEV/Qwen2-Wukong-0.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RESMPDEV__Qwen2-Wukong-0.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RESMPDEV/Qwen2-Wukong-0.5B
52c58a4aa3d0b44c363c5761fa658243f5c53943
4.97554
apache-2.0
6
0.63
true
false
false
true
1.974538
0.185424
18.542357
0.308451
4.196663
0.001511
0.151057
0.236577
0
0.352479
3.326562
0.132729
3.636599
false
false
2024-06-29
2024-06-30
0
RESMPDEV/Qwen2-Wukong-0.5B
RLHFlow_ArmoRM-Llama3-8B-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForRewardModelWithGating
<a target="_blank" href="https://huggingface.co/RLHFlow/ArmoRM-Llama3-8B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RLHFlow/ArmoRM-Llama3-8B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RLHFlow__ArmoRM-Llama3-8B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RLHFlow/ArmoRM-Llama3-8B-v0.1
eb2676d20da2f2d41082289d23c59b9f7427f955
4.705487
llama3
169
7.511
true
false
false
true
1.847046
0.18967
18.967008
0.287647
1.749448
0
0
0.249161
0
0.394802
6.65026
0.107796
0.866209
false
false
2024-05-23
2024-10-08
0
RLHFlow/ArmoRM-Llama3-8B-v0.1
RLHFlow_LLaMA3-iterative-DPO-final_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/RLHFlow/LLaMA3-iterative-DPO-final" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RLHFlow/LLaMA3-iterative-DPO-final</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RLHFlow__LLaMA3-iterative-DPO-final-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RLHFlow/LLaMA3-iterative-DPO-final
40b73bd07a019795837f80579fe95470484ca82b
21.109153
llama3
40
8.03
true
false
false
true
2.985886
0.534011
53.401087
0.505826
29.78776
0.088369
8.836858
0.283557
4.474273
0.367271
5.075521
0.325715
25.079418
false
false
2024-05-17
2024-06-26
0
RLHFlow/LLaMA3-iterative-DPO-final
RWKV_rwkv-raven-14b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
RwkvForCausalLM
<a target="_blank" href="https://huggingface.co/RWKV/rwkv-raven-14b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RWKV/rwkv-raven-14b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RWKV__rwkv-raven-14b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RWKV/rwkv-raven-14b
359c0649b4f1d10a26ebea32908035bc00d152ee
3.960585
57
14
false
false
false
false
3.181258
0.076837
7.683724
0.330704
6.763765
0.004532
0.453172
0.229027
0
0.395146
7.193229
0.115027
1.669622
false
false
2023-05-05
2024-07-08
0
RWKV/rwkv-raven-14b
Rakuten_RakutenAI-2.0-mini-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Rakuten/RakutenAI-2.0-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Rakuten/RakutenAI-2.0-mini-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Rakuten__RakutenAI-2.0-mini-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Rakuten/RakutenAI-2.0-mini-instruct
6d902489587d324b7d5e201299e4e1a169f3a40b
13.318017
apache-2.0
16
1.535
true
false
false
true
0.299416
0.679391
67.939068
0.28672
2.429693
0.052115
5.21148
0.266779
2.237136
0.324917
0.78125
0.111785
1.309471
false
false
2025-02-04
2025-02-26
1
Rakuten/RakutenAI-2.0-mini-instruct (Merge)
Rakuten_RakutenAI-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Rakuten/RakutenAI-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Rakuten/RakutenAI-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Rakuten__RakutenAI-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Rakuten/RakutenAI-7B
c687b10cbf1aa6c34868904b62ecfcef2e0946bf
11.546978
apache-2.0
47
7.373
true
false
false
false
1.283697
0.155597
15.559715
0.431491
20.982052
0.019637
1.963746
0.28943
5.257271
0.373813
4.659896
0.287733
20.85919
false
false
2024-03-18
2024-09-06
1
Rakuten/RakutenAI-7B (Merge)
Rakuten_RakutenAI-7B-chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Rakuten/RakutenAI-7B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Rakuten/RakutenAI-7B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Rakuten__RakutenAI-7B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Rakuten/RakutenAI-7B-chat
1685492c5c40f8a7f57e2cc1c8fa65e5b0c94d31
12.803095
apache-2.0
61
7.373
true
false
false
false
1.248139
0.268555
26.855521
0.43162
20.237552
0.029456
2.945619
0.256711
0.894855
0.378958
5.903125
0.279837
19.9819
false
false
2024-03-18
2024-09-08
1
Rakuten/RakutenAI-7B-chat (Merge)
Replete-AI_L3-Pneuma-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/L3-Pneuma-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/L3-Pneuma-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__L3-Pneuma-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/L3-Pneuma-8B
3e477fa150bf31b360891d3920ccbd57dac110ab
16.691759
llama3
2
8.03
true
false
false
false
5.0536
0.241327
24.132746
0.490868
28.163142
0.054381
5.438066
0.317953
9.060403
0.410521
9.181771
0.31757
24.174424
false
false
2024-10-11
2024-10-15
1
meta-llama/Meta-Llama-3-8B
Replete-AI_L3.1-Pneuma-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/L3.1-Pneuma-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/L3.1-Pneuma-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__L3.1-Pneuma-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/L3.1-Pneuma-8B
843163ca811525c4f98f817aae8fb5da7c1fb7bb
27.684764
llama3.1
2
8.03
true
false
false
true
1.411702
0.707642
70.764239
0.50499
30.26263
0.219789
21.978852
0.302852
7.04698
0.387115
6.15599
0.369099
29.899897
false
false
2024-11-09
2024-11-13
2
meta-llama/Meta-Llama-3.1-8B
Replete-AI_Llama3-8B-Instruct-Replete-Adapted_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Llama3-8B-Instruct-Replete-Adapted" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Llama3-8B-Instruct-Replete-Adapted</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Llama3-8B-Instruct-Replete-Adapted-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Llama3-8B-Instruct-Replete-Adapted
d930f2111913da6fb7693187e1cdc817191c8e5e
22.778518
0
8
false
false
false
true
1.520756
0.691531
69.153069
0.487026
26.888964
0.070997
7.099698
0.28104
4.138702
0.363396
2.824479
0.339096
26.566194
false
false
2024-07-09
0
Removed
Replete-AI_Replete-Coder-Instruct-8b-Merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-Coder-Instruct-8b-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-Coder-Instruct-8b-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-Coder-Instruct-8b-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-Coder-Instruct-8b-Merged
0594615bf84f0803a078b59f14eb090cec2004f3
16.427668
0
8.03
false
false
false
true
1.82907
0.538757
53.875716
0.446169
21.937707
0.077795
7.779456
0.269295
2.572707
0.366031
3.453906
0.180519
8.946513
false
false
2024-07-11
0
Removed
Replete-AI_Replete-Coder-Llama3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-Coder-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-Coder-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-Coder-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-Coder-Llama3-8B
2aca75c53e7eb2f523889ab1a279e349b8f1b0e8
11.957974
0
8.03
false
false
false
true
2.831614
0.472936
47.293625
0.327128
7.055476
0.047583
4.758308
0.260906
1.454139
0.395302
7.51276
0.133062
3.673537
false
false
2024-06-26
0
Removed
Replete-AI_Replete-Coder-Qwen2-1.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-Coder-Qwen2-1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-Coder-Qwen2-1.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-Coder-Qwen2-1.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-Coder-Qwen2-1.5b
86fcccbf921b7eb8a4d348e4a3cde0beb63d6626
11.561044
0
1.544
false
false
false
true
3.423742
0.301428
30.142799
0.347473
10.426516
0.03852
3.851964
0.268456
2.46085
0.407271
9.742188
0.214678
12.741947
false
false
2024-06-26
0
Removed
Replete-AI_Replete-LLM-Qwen2-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-LLM-Qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-LLM-Qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-LLM-Qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-LLM-Qwen2-7b
e3569433b23fde853683ad61f342d2c1bd01d60a
3.325394
0
7.616
false
false
false
true
1.227224
0.090475
9.047549
0.298526
2.842933
0
0
0.253356
0.447427
0.38476
5.861719
0.115775
1.752733
false
false
2024-08-13
0
Removed
Replete-AI_Replete-LLM-Qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-LLM-Qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-LLM-Qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-LLM-Qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-LLM-Qwen2-7b
5b75b6180b45d83124e04a00766dc19d2ad52622
3.509167
0
7.616
false
false
false
true
1.856007
0.093248
9.324814
0.297692
2.72497
0
0
0.247483
0
0.394094
7.261719
0.115691
1.743499
false
false
2024-08-13
0
Removed
Replete-AI_Replete-LLM-Qwen2-7b_Beta-Preview_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-LLM-Qwen2-7b_Beta-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-LLM-Qwen2-7b_Beta-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-LLM-Qwen2-7b_Beta-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-LLM-Qwen2-7b_Beta-Preview
fe4c3fc2314db69083527ddd0c9a658fcbc54f15
3.577432
0
7.616
false
false
false
true
0.714137
0.085755
8.575469
0.292932
1.965677
0
0
0.248322
0
0.398063
7.757813
0.128491
3.165632
false
false
2024-07-26
0
Removed
Replete-AI_Replete-LLM-V2-Llama-3.1-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Replete-AI/Replete-LLM-V2-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-LLM-V2-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-LLM-V2-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Replete-AI/Replete-LLM-V2-Llama-3.1-8b
5ff5224804dcc31f536e491e52310f2e3cdc0b57
24.979162
0
8.03
false
false
false
false
1.817362
0.551497
55.14967
0.53392
33.207572
0.140483
14.048338
0.313758
8.501119
0.400073
8.375781
0.375332
30.592494
false
false
2024-08-30
0
Removed
RezVortex_JAJUKA-WEWILLNEVERFORGETYOU-3B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/RezVortex/JAJUKA-WEWILLNEVERFORGETYOU-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RezVortex/JAJUKA-WEWILLNEVERFORGETYOU-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RezVortex__JAJUKA-WEWILLNEVERFORGETYOU-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RezVortex/JAJUKA-WEWILLNEVERFORGETYOU-3B
4bcb8eb1fa4da9385b267573c74592bd65455465
23.215082
0
3.213
false
false
false
true
0.591184
0.68581
68.581032
0.461891
23.760824
0.154834
15.483384
0.25755
1.006711
0.363021
6.644271
0.314328
23.814273
false
false
2025-02-26
0
Removed
RezVortex_Jajuka-3b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/RezVortex/Jajuka-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RezVortex/Jajuka-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RezVortex__Jajuka-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RezVortex/Jajuka-3b
c5f920214d6fc9dd610c014176d3ea7259eb6540
23.475391
0
3.213
false
false
false
true
0.581958
0.692505
69.250478
0.459387
23.204983
0.159366
15.936556
0.26594
2.12528
0.367083
6.585417
0.313747
23.749631
false
false
2025-02-26
0
Removed
Ro-xe_FMixIA-7B-DARE-0_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Ro-xe/FMixIA-7B-DARE-0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ro-xe/FMixIA-7B-DARE-0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ro-xe__FMixIA-7B-DARE-0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ro-xe/FMixIA-7B-DARE-0
5436487303217ed564ddb9951de5263803bd9069
18.934625
apache-2.0
0
7.242
true
false
false
false
0.922862
0.334126
33.412568
0.503533
30.438332
0.05287
5.287009
0.28943
5.257271
0.45449
16.811198
0.301612
22.401374
true
false
2024-12-13
2024-12-13
0
Ro-xe/FMixIA-7B-DARE-0
Ro-xe_FMixIA-7B-SLERP-27_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Ro-xe/FMixIA-7B-SLERP-27" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ro-xe/FMixIA-7B-SLERP-27</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ro-xe__FMixIA-7B-SLERP-27-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ro-xe/FMixIA-7B-SLERP-27
30406563f691a74558c1ac468df6c61e8a570191
19.827315
apache-2.0
0
7.242
true
false
false
false
0.939634
0.376541
37.654091
0.515059
31.938224
0.063444
6.344411
0.295302
6.040268
0.441156
14.677865
0.300781
22.309028
true
false
2024-12-12
2024-12-12
0
Ro-xe/FMixIA-7B-SLERP-27
Ro-xe_FMixIA-7B-TIES-1_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Ro-xe/FMixIA-7B-TIES-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ro-xe/FMixIA-7B-TIES-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ro-xe__FMixIA-7B-TIES-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ro-xe/FMixIA-7B-TIES-1
c0d2ed9c652cf4f6ba990ce1d849e1b1fbd1ebc0
19.54119
apache-2.0
0
7.242
true
false
false
false
0.98159
0.345292
34.52916
0.509154
31.127726
0.056647
5.664653
0.288591
5.145414
0.468906
18.646615
0.299202
22.13357
true
false
2024-12-13
2024-12-13
0
Ro-xe/FMixIA-7B-TIES-1
Ro-xe_FMixIA-FrankenMerge-9.5B-PT-9_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ro-xe__FMixIA-FrankenMerge-9.5B-PT-9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9
c17ec4748e234236ee91dcdd609092e813ed4c83
16.186404
apache-2.0
1
14.141
true
false
false
false
4.393437
0.194016
19.401632
0.508785
30.713041
0.003021
0.302115
0.307886
7.718121
0.417031
9.46224
0.365691
29.521277
true
false
2024-12-13
2024-12-13
0
Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9
Rombo-Org_Rombo-LLM-V2.5-Qwen-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Rombo-Org/Rombo-LLM-V2.5-Qwen-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Rombo-Org/Rombo-LLM-V2.5-Qwen-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Rombo-Org__Rombo-LLM-V2.5-Qwen-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Rombo-Org/Rombo-LLM-V2.5-Qwen-7b
efb0a8bf8aff0c6d1748b25bcf40e9a1d62f2496
35.259585
apache-2.0
2
7.616
true
false
false
true
1.372272
0.748184
74.818371
0.539975
34.907254
0.506798
50.679758
0.301174
6.823266
0.398031
7.853906
0.428275
36.474956
false
false
2025-01-11
2025-01-13
1
Rombo-Org/Rombo-LLM-V2.5-Qwen-7b (Merge)
RubielLabarta_LogoS-7Bx2-MoE-13B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2
fb0f72b9914a81892bfeea5a04fcd9676c883d64
20.117389
apache-2.0
10
12.879
true
true
false
false
1.675064
0.43789
43.789035
0.520696
32.794802
0.057402
5.740181
0.277685
3.691275
0.422615
11.49349
0.30876
23.195553
true
false
2024-01-21
2024-08-05
1
RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2 (Merge)
SaisExperiments_Evil-Alpaca-3B-L3.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SaisExperiments/Evil-Alpaca-3B-L3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SaisExperiments/Evil-Alpaca-3B-L3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SaisExperiments__Evil-Alpaca-3B-L3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SaisExperiments/Evil-Alpaca-3B-L3.2
77d25b9182270a66ac60a91d646b447e1530f70e
15.188053
4
3.213
false
false
false
false
1.466299
0.325108
32.510849
0.434076
20.851948
0.070242
7.024169
0.263423
1.789709
0.41976
10.936719
0.262134
18.014923
false
false
2024-09-28
2024-09-28
1
SaisExperiments/Evil-Alpaca-3B-L3.2 (Merge)
SaisExperiments_Gemma-2-2B-Opus-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/SaisExperiments/Gemma-2-2B-Opus-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SaisExperiments/Gemma-2-2B-Opus-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SaisExperiments__Gemma-2-2B-Opus-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SaisExperiments/Gemma-2-2B-Opus-Instruct
7caa9e833d3f5713cf1b8ebd8beeb6ef02da99ea
17.245991
gemma
2
2.614
true
false
false
false
2.315188
0.47496
47.495977
0.429285
19.529533
0.050604
5.060423
0.283557
4.474273
0.405688
8.577604
0.265043
18.338135
false
false
2024-09-03
2024-10-07
2
google/gemma-2-2b
SaisExperiments_Gemma-2-2B-Stheno-Filtered_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/SaisExperiments/Gemma-2-2B-Stheno-Filtered" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SaisExperiments/Gemma-2-2B-Stheno-Filtered</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SaisExperiments__Gemma-2-2B-Stheno-Filtered-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SaisExperiments/Gemma-2-2B-Stheno-Filtered
683443cfa90c7a06978d1c5e9ead0fb0a68b49ca
15.491103
gemma
1
2.614
true
false
false
false
1.776734
0.419655
41.96554
0.414923
17.478867
0.046073
4.607251
0.270134
2.684564
0.400292
8.103125
0.262965
18.10727
false
false
2024-09-04
2024-10-08
2
google/gemma-2-2b
SaisExperiments_Not-So-Small-Alpaca-24B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/SaisExperiments/Not-So-Small-Alpaca-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SaisExperiments/Not-So-Small-Alpaca-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SaisExperiments__Not-So-Small-Alpaca-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SaisExperiments/Not-So-Small-Alpaca-24B
2aa9ee03da6022fe5b98679f4e4d385a9722a21f
28.383065
apache-2.0
0
23.572
true
false
false
true
2.641718
0.624361
62.436114
0.533864
33.018605
0.182779
18.277946
0.35906
14.541387
0.428167
12.0875
0.369432
29.936835
false
false
2025-02-07
2025-02-07
1
SaisExperiments/Not-So-Small-Alpaca-24B (Merge)
SaisExperiments_QwOwO-7B-V1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/SaisExperiments/QwOwO-7B-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SaisExperiments/QwOwO-7B-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SaisExperiments__QwOwO-7B-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SaisExperiments/QwOwO-7B-V1
85d13b4879282566bb280bef4a89a235c09cbb13
27.075777
apache-2.0
1
7.616
true
false
false
true
0.621156
0.455626
45.562552
0.543123
35.132501
0.385952
38.595166
0.260067
1.342282
0.38349
6.002865
0.422374
35.819297
false
false
2025-02-26
2025-02-26
1
SaisExperiments/QwOwO-7B-V1 (Merge)
SaisExperiments_RightSheep-Llama3.2-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SaisExperiments/RightSheep-Llama3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SaisExperiments/RightSheep-Llama3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SaisExperiments__RightSheep-Llama3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SaisExperiments/RightSheep-Llama3.2-3B
6e18a57049705ef813cb7fa426d58a0309cd7a29
15.857394
llama3.2
1
3.213
true
false
false
false
1.153879
0.415634
41.563385
0.424078
18.643298
0.080816
8.081571
0.286913
4.9217
0.376729
4.824479
0.253989
17.109929
false
false
2025-01-25
2025-01-25
1
SaisExperiments/RightSheep-Llama3.2-3B (Merge)
Sakalti_Anemoi-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Anemoi-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Anemoi-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Anemoi-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Anemoi-3B
002ae5bf1d1741d5c245d34c71b069ec4917d27c
22.619869
other
1
3.397
true
false
false
false
1.47908
0.380363
38.036299
0.492195
29.05396
0.177492
17.749245
0.305369
7.38255
0.437062
12.766146
0.376579
30.731014
true
false
2025-01-14
2025-01-14
1
Sakalti/Anemoi-3B (Merge)
Sakalti_Euphrates-14B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Euphrates-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Euphrates-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Euphrates-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Euphrates-14B
fb4de44f697b9dee8918853b3e8187deac0aeecd
30.662908
2
14.77
false
false
false
false
3.024972
0.264683
26.468326
0.613769
44.608582
0.305136
30.513595
0.393456
19.127517
0.451573
15.979948
0.525515
47.279477
false
false
2025-01-15
2025-01-15
1
Sakalti/Euphrates-14B (Merge)
Sakalti_Llama3.2-3B-Uranus-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Llama3.2-3B-Uranus-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Llama3.2-3B-Uranus-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Llama3.2-3B-Uranus-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Llama3.2-3B-Uranus-1
05468598f7b5c5122bf30bab9595b70bb0e2fe18
21.030483
llama3.2
0
3.213
true
false
false
false
1.169465
0.533537
53.353657
0.443683
21.416407
0.149547
14.954683
0.29698
6.263982
0.366865
6.92474
0.309425
23.26943
false
false
2025-01-09
2025-01-09
2
AXCXEPT/EZO-Llama-3.2-3B-Instruct-dpoE (Merge)
Sakalti_Magro-7B-v1.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Magro-7B-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Magro-7B-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Magro-7B-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Magro-7B-v1.1
e35ed139261f2fd0fbec874f30ba331ca49f53cb
12.141141
mit
0
7.242
true
false
false
false
0.863846
0.120402
12.040165
0.417906
19.410132
0.024924
2.492447
0.296141
6.152125
0.443323
13.148698
0.27643
19.60328
true
false
2024-12-12
2024-12-12
1
Sakalti/Magro-7B-v1.1 (Merge)
Sakalti_Neptuno-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Neptuno-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Neptuno-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Neptuno-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Neptuno-3B
bf4ab8d1bbbc95d0a0b7668c152de76eb95cbccf
23.466342
other
1
3.397
true
false
false
false
1.488095
0.429622
42.962229
0.483358
27.4828
0.255287
25.528701
0.296141
6.152125
0.400198
7.858073
0.377327
30.814125
true
false
2025-01-08
2025-01-08
1
Sakalti/Neptuno-3B (Merge)
Sakalti_Neptuno-Alpha_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Neptuno-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Neptuno-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Neptuno-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Neptuno-Alpha
30e36db1c63038b075427661cf71bfab26df3580
22.739135
other
1
3.397
true
false
false
false
1.410485
0.377965
37.796491
0.492477
29.029619
0.183535
18.353474
0.307047
7.606264
0.437062
12.899479
0.376745
30.749483
true
false
2025-01-09
2025-01-09
1
Sakalti/Neptuno-Alpha (Merge)
Sakalti_Oxyge1-33B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Oxyge1-33B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Oxyge1-33B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Oxyge1-33B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Oxyge1-33B
41be8eb03be9a00da6e87f18fb369ad317078e01
41.609246
apache-2.0
2
32.764
true
false
false
false
11.323784
0.454827
45.482653
0.703328
58.032297
0.496224
49.622356
0.38255
17.673378
0.500781
24.297656
0.590924
54.547134
true
false
2024-12-08
2024-12-18
1
Sakalti/Oxyge1-33B (Merge)
Sakalti_Phi3.5-Comets-3.8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Phi3.5-Comets-3.8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Phi3.5-Comets-3.8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Phi3.5-Comets-3.8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Phi3.5-Comets-3.8B
b839027ddc5437b739792aea4dfacc0e91898bb9
5.760043
mit
0
3.821
true
false
false
false
1.027345
0.209429
20.942876
0.333512
6.53359
0.000755
0.075529
0.249161
0
0.376354
5.310938
0.115276
1.697326
true
false
2024-12-21
2024-12-21
1
Sakalti/Phi3.5-Comets-3.8B (Merge)
Sakalti_Qwen2.5-1B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/Qwen2.5-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/Qwen2.5-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__Qwen2.5-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/Qwen2.5-1B-Instruct
6f150c987cbaa2eb5782772e791a1e04cf932bde
4.275488
apache-2.0
0
0.988
true
false
false
false
1.91039
0.175132
17.513198
0.302715
3.43704
0.006042
0.60423
0.255872
0.782998
0.336885
0.94401
0.121343
2.371454
true
false
2025-01-17
2025-01-18
1
Sakalti/Qwen2.5-1B-Instruct (Merge)
Sakalti_QwenTest-7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/QwenTest-7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/QwenTest-7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__QwenTest-7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/QwenTest-7
7bff974fb76d3b95dc9cb14271cfb948b479bc00
4.255209
apache-2.0
0
0.988
true
false
false
false
1.878241
0.167189
16.718862
0.306321
3.632709
0.003776
0.377644
0.260067
1.342282
0.342188
1.106771
0.121177
2.352985
true
false
2025-01-19
2025-01-19
1
Sakalti/QwenTest-7 (Merge)
Sakalti_SJT-0.5B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/SJT-0.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/SJT-0.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__SJT-0.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/SJT-0.5B
0a531361ffe857405505c3399a4dabd780c6c1e1
8.576016
apache-2.0
0
0.63
true
false
false
false
1.015028
0.242477
24.247663
0.330554
8.409744
0.052115
5.21148
0.271812
2.908277
0.319583
0.78125
0.189079
9.89768
true
false
2025-01-12
2025-01-17
1
Sakalti/SJT-0.5B (Merge)
Sakalti_SJT-1.5B-Alpha_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Sakalti/SJT-1.5B-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Sakalti/SJT-1.5B-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Sakalti__SJT-1.5B-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Sakalti/SJT-1.5B-Alpha
36eb5b63215952c6522728c3d734ff5e0693a7ac
16.911765
apache-2.0
0
1.777
true
false
false
false
1.109604
0.344867
34.486717
0.424082
18.535867
0.099698
9.969789
0.291946
5.592841
0.422615
11.09349
0.296127
21.791888
true
false
2025-01-18
2025-01-19
1
Sakalti/SJT-1.5B-Alpha (Merge)