Dataset schema (column · dtype · observed range / cardinality):

eval_name             stringlengths   12 – 111
Precision             stringclasses   3 values
Type                  stringclasses   7 values
T                     stringclasses   7 values
Weight type           stringclasses   2 values
Architecture          stringclasses   64 values
Model                 stringlengths   355 – 689
fullname              stringlengths   4 – 102
Model sha             stringlengths   0 – 40
Average ⬆️            float64         0.74 – 52.1
Hub License           stringclasses   27 values
Hub ❤️                int64           0 – 6.09k
#Params (B)           float64         -1 – 141
Available on the hub  bool            2 classes
MoE                   bool            2 classes
Flagged               bool            2 classes
Chat Template         bool            2 classes
CO₂ cost (kg)         float64         0.04 – 187
IFEval Raw            float64         0 – 0.9
IFEval                float64         0 – 90
BBH Raw               float64         0.22 – 0.83
BBH                   float64         0.25 – 76.7
MATH Lvl 5 Raw        float64         0 – 0.71
MATH Lvl 5            float64         0 – 71.5
GPQA Raw              float64         0.21 – 0.47
GPQA                  float64         0 – 29.4
MUSR Raw              float64         0.29 – 0.6
MUSR                  float64         0 – 38.7
MMLU-PRO Raw          float64         0.1 – 0.73
MMLU-PRO              float64         0 – 70
Merged                bool            2 classes
Official Providers    bool            2 classes
Upload To Hub Date    stringclasses   525 values
Submission Date       stringclasses   263 values
Generation            int64           0 – 10
Base Model            stringlengths   4 – 102
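The records that follow are flat runs of values in the column order of the schema above, 36 values per well-formed entry. A minimal sketch of regrouping those value lines into per-model dicts — assuming fixed, complete records (entries with a null field, such as a missing Hub License, come out one line short in this dump and would need special handling):

```python
# Column names and ordering are taken from the schema dump above;
# the regroup() helper itself is an illustrative assumption, not
# part of the leaderboard tooling.

COLUMNS = [
    "eval_name", "Precision", "Type", "T", "Weight type", "Architecture",
    "Model", "fullname", "Model sha", "Average ⬆️", "Hub License", "Hub ❤️",
    "#Params (B)", "Available on the hub", "MoE", "Flagged", "Chat Template",
    "CO₂ cost (kg)", "IFEval Raw", "IFEval", "BBH Raw", "BBH",
    "MATH Lvl 5 Raw", "MATH Lvl 5", "GPQA Raw", "GPQA", "MUSR Raw", "MUSR",
    "MMLU-PRO Raw", "MMLU-PRO", "Merged", "Official Providers",
    "Upload To Hub Date", "Submission Date", "Generation", "Base Model",
]


def regroup(lines):
    """Chunk a flat list of value lines into per-model dicts.

    Assumes every record is complete (exactly len(COLUMNS) lines).
    """
    n = len(COLUMNS)
    if len(lines) % n != 0:
        raise ValueError("truncated or incomplete record in input")
    return [
        dict(zip(COLUMNS, lines[i:i + n]))
        for i in range(0, len(lines), n)
    ]
```

With this helper, the first record in the dump would regroup so that `row["Architecture"]` is `"MistralForCausalLM"` and `row["Base Model"]` is `"BAAI/Infinity-Instruct-3M-0625-Mistral-7B"`.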
BAAI_Infinity-Instruct-3M-0625-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Mistral-7B
302e3ae0bcc50dae3fb69fc1b08b518398e8c407
22.843425
apache-2.0
3
7.242
true
false
false
true
1.571594
0.586742
58.674207
0.493967
28.823289
0.076284
7.628399
0.286913
4.9217
0.42724
12.238281
0.322972
24.774675
false
false
2024-07-09
2024-08-05
0
BAAI/Infinity-Instruct-3M-0625-Mistral-7B
BAAI_Infinity-Instruct-3M-0625-Qwen2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Qwen2-7B
503c24156d7682458686a7b5324f7f886e63470d
26.199808
apache-2.0
8
7.616
true
false
false
true
2.660156
0.555393
55.539302
0.534591
34.656829
0.192598
19.259819
0.312919
8.389262
0.38876
6.461719
0.396027
32.891918
false
false
2024-07-09
2024-08-05
0
BAAI/Infinity-Instruct-3M-0625-Qwen2-7B
BAAI_Infinity-Instruct-3M-0625-Yi-1.5-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Yi-1.5-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B
a42c86c61b98ca4fdf238d688fe6ea11cf414d29
28.14496
apache-2.0
3
8.829
true
false
false
true
2.233602
0.518598
51.859843
0.550912
35.378707
0.163897
16.389728
0.354027
13.870246
0.457531
16.72474
0.411818
34.646498
false
false
2024-07-09
2024-08-05
0
BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B
BAAI_Infinity-Instruct-7M-0729-Llama3_1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-0729-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B
0aca33fd7500a781d041e8bf7e5e3789b03f54f4
23.447424
llama3.1
8
8.03
true
false
false
true
1.73361
0.613195
61.319521
0.507734
30.888805
0.127644
12.76435
0.292785
5.704698
0.357844
5.297135
0.32239
24.710033
false
false
2024-08-02
2024-08-05
0
BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B
BAAI_Infinity-Instruct-7M-0729-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-0729-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-0729-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-0729-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-0729-mistral-7B
36651591cb13346ecbde23832013e024029700fa
23.216449
apache-2.0
5
7.242
true
false
false
true
1.598522
0.616193
61.619281
0.496381
28.697915
0.083082
8.308157
0.290268
5.369128
0.406188
10.040104
0.327377
25.264111
false
false
2024-07-25
2024-08-05
0
BAAI/Infinity-Instruct-7M-0729-mistral-7B
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B
1ef63c4993a8c723c9695c827295c17080a64435
37.484454
llama3.1
19
70.554
true
false
false
true
22.138243
0.733546
73.354588
0.66952
52.498947
0.252266
25.226586
0.375839
16.778523
0.453906
16.971615
0.460688
40.076463
false
false
2024-07-25
2024-09-26
0
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B
56f9c2845ae024eb8b1dd9ea0d8891cbaf33c596
23.447424
llama3.1
8
8.03
true
false
false
true
1.83428
0.613195
61.319521
0.507734
30.888805
0.127644
12.76435
0.292785
5.704698
0.357844
5.297135
0.32239
24.710033
false
false
2024-08-02
2024-08-29
0
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B
BAAI_Infinity-Instruct-7M-Gen-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-mistral-7B
82c83d670a8954f4250547b53a057dea1fbd460d
23.191054
apache-2.0
5
7.242
true
false
false
true
1.64927
0.614669
61.466908
0.496381
28.697915
0.083082
8.308157
0.290268
5.369128
0.406188
10.040104
0.327377
25.264111
false
false
2024-07-25
2024-08-29
0
BAAI/Infinity-Instruct-7M-Gen-mistral-7B
BAAI_OPI-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/OPI-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/OPI-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__OPI-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/OPI-Llama-3.1-8B-Instruct
48504799d009b4e1b29e6d2948a7cde68acdc3b0
8.531604
llama3.1
2
8.03
true
false
false
true
1.343314
0.207455
20.745511
0.355122
9.768712
0.013595
1.359517
0.274329
3.243848
0.323302
3.579427
0.212434
12.492612
false
false
2024-09-06
2024-09-21
2
meta-llama/Meta-Llama-3.1-8B
BEE-spoke-data_Meta-Llama-3-8Bee_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/Meta-Llama-3-8Bee" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/Meta-Llama-3-8Bee</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__Meta-Llama-3-8Bee-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/Meta-Llama-3-8Bee
8143e34e77a49a30ec2617c5c9cc22cb3cda2287
14.657812
llama3
0
8.03
true
false
false
false
1.66076
0.195066
19.506576
0.462636
24.199033
0.048338
4.833837
0.313758
8.501119
0.365406
6.242448
0.321975
24.663859
false
false
2024-04-28
2024-07-04
1
meta-llama/Meta-Llama-3-8B
BEE-spoke-data_smol_llama-101M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-101M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-101M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-101M-GQA
bb26643db413bada7e0c3c50752bf9da82403dba
4.0196
apache-2.0
28
0.101
true
false
false
false
0.239211
0.138437
13.843712
0.301756
3.198004
0.006042
0.60423
0.25755
1.006711
0.371271
4.275521
0.110705
1.189421
false
false
2023-10-26
2024-07-06
0
BEE-spoke-data/smol_llama-101M-GQA
BEE-spoke-data_smol_llama-220M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA
8845b1d3c0bc73522ef2700aab467183cbdca9f7
6.577801
apache-2.0
12
0.218
true
false
false
false
0.327227
0.238605
23.860468
0.303167
3.037843
0.010574
1.057402
0.255872
0.782998
0.405875
9.067708
0.114943
1.660387
false
false
2023-12-22
2024-06-26
0
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-GQA-fineweb_edu_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-fineweb_edu-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu
dec16b41d5e94070dbc1f8449a554373fd4cc1d1
6.629851
apache-2.0
1
0.218
true
false
false
false
0.323752
0.198812
19.881248
0.292905
2.314902
0.006798
0.679758
0.259228
1.230425
0.43676
14.261719
0.112699
1.411052
false
false
2024-06-08
2024-06-26
1
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-openhermes_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-openhermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-openhermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-openhermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-openhermes
fb4bcd4b7eee363baacb4176a26cea2aaeb173f4
4.938005
apache-2.0
5
0.218
true
false
false
false
0.308852
0.155523
15.55229
0.302752
3.107692
0.010574
1.057402
0.267617
2.348993
0.384729
6.224479
0.112035
1.337175
false
false
2023-12-30
2024-09-21
1
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_tFINE-900m-e16-d32-flan_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-flan" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-flan</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-flan-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-e16-d32-flan
d9ffec9798402d13d8f2c56ec3de3ad092445297
4.597533
apache-2.0
0
0.887
true
false
false
false
4.912013
0.150577
15.057714
0.302804
4.411894
0.009819
0.981873
0.233221
0
0.372417
3.71875
0.130735
3.414967
false
false
2024-09-06
2024-09-13
1
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024
b1e2f12f5224be9f7da0cb5ff30e1bbb3f10f6ca
5.999886
apache-2.0
0
0.887
true
false
false
false
5.201216
0.132067
13.206736
0.313779
4.737018
0.010574
1.057402
0.254195
0.559284
0.439271
13.808854
0.12367
2.630024
false
false
2024-09-10
2024-09-14
2
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-e16-d32-instruct_2e_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-instruct_2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e
4c626138c9f4e0c3eafe74b2755eb89334c7ca59
5.908138
apache-2.0
0
0.887
true
false
false
false
5.033237
0.140286
14.028555
0.313457
5.01307
0.013595
1.359517
0.259228
1.230425
0.420698
11.18724
0.12367
2.630024
false
false
2024-09-17
2024-09-22
3
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-instruct-orpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-instruct-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-instruct-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-instruct-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-instruct-orpo
e0a21c79bac74442252d36e2c01403afa3f0971b
3.696308
apache-2.0
0
0.887
true
false
false
true
5.149924
0.132992
13.299157
0.302209
3.267301
0.015861
1.586103
0.259228
1.230425
0.340854
1.106771
0.115193
1.688091
false
false
2024-09-22
2024-09-23
0
BEE-spoke-data/tFINE-900m-instruct-orpo
BSC-LT_salamandra-7b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BSC-LT/salamandra-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BSC-LT/salamandra-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BSC-LT__salamandra-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BSC-LT/salamandra-7b
bf30739316ceac4b624583a27ec96dfc401179e8
5.704911
apache-2.0
27
7.768
true
false
false
false
0.378577
0.136738
13.67383
0.351661
10.157422
0.003776
0.377644
0.270134
2.684564
0.350094
1.861719
0.149269
5.474291
false
false
2024-09-30
2024-11-22
0
BSC-LT/salamandra-7b
BSC-LT_salamandra-7b-instruct_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BSC-LT/salamandra-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BSC-LT/salamandra-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BSC-LT__salamandra-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BSC-LT/salamandra-7b-instruct
77ddccbc7d9f9ffd55a8535365e8eebc493ccb8e
10.181244
apache-2.0
54
7.768
true
false
false
true
2.295008
0.245074
24.507418
0.385132
14.688129
0.008308
0.830816
0.264262
1.901566
0.413437
10.213021
0.180519
8.946513
false
false
2024-09-30
2024-11-22
1
BSC-LT/salamandra-7b-instruct (Merge)
Ba2han_Llama-Phi-3_DoRA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Ba2han/Llama-Phi-3_DoRA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ba2han/Llama-Phi-3_DoRA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ba2han__Llama-Phi-3_DoRA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ba2han/Llama-Phi-3_DoRA
36f99064a7be8ba475c2ee5c5424e95c263ccb87
25.469895
mit
6
3.821
true
false
false
true
1.066273
0.513053
51.305314
0.551456
37.249164
0.121601
12.160121
0.326342
10.178971
0.406927
9.532552
0.391539
32.393248
false
false
2024-05-15
2024-06-26
0
Ba2han/Llama-Phi-3_DoRA
Baptiste-HUVELLE-10_LeTriomphant2.2_ECE_iLAB_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Baptiste-HUVELLE-10/LeTriomphant2.2_ECE_iLAB" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Baptiste-HUVELLE-10/LeTriomphant2.2_ECE_iLAB</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Baptiste-HUVELLE-10__LeTriomphant2.2_ECE_iLAB-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Baptiste-HUVELLE-10/LeTriomphant2.2_ECE_iLAB
d6e0221f0a3fc019049cc411fddc2bf2f646519e
41.304268
apache-2.0
0
72.706
true
false
false
false
33.662663
0.507633
50.763308
0.725632
61.612325
0.444864
44.486405
0.399329
19.910515
0.462552
17.152344
0.585106
53.900709
false
false
2025-01-24
2025-02-28
0
Baptiste-HUVELLE-10/LeTriomphant2.2_ECE_iLAB
BenevolenceMessiah_Qwen2.5-72B-2x-Instruct-TIES-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Qwen2.5-72B-2x-Instruct-TIES-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0
459891ec78c9bbed2836a8bba706e1707db10231
42.26732
1
72.7
false
false
false
true
34.701784
0.54735
54.734992
0.727311
61.911495
0.57855
57.854985
0.36745
15.659955
0.420667
12.016667
0.562832
51.425827
false
false
2024-11-11
2024-11-24
1
BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0 (Merge)
BenevolenceMessiah_Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0
d90f6e36584dc9b367461701e83c833bdeb736f2
15.071092
apache-2.0
0
28.309
true
true
false
false
6.669594
0.301153
30.115316
0.490867
26.877991
0.041541
4.154079
0.262584
1.677852
0.407979
8.930729
0.268035
18.670582
true
false
2024-09-21
2024-09-22
1
BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0 (Merge)
BlackBeenie_Bloslain-8B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Bloslain-8B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Bloslain-8B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Bloslain-8B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Bloslain-8B-v0.2
ebcb7f9f30bc172523a827d1ddefeb52b1aba494
23.803914
1
8.03
false
false
false
false
1.383526
0.502337
50.233713
0.511088
30.662902
0.145015
14.501511
0.306208
7.494407
0.407573
10.446615
0.365359
29.484338
false
false
2024-11-19
2024-11-19
1
BlackBeenie/Bloslain-8B-v0.2 (Merge)
BlackBeenie_Llama-3.1-8B-OpenO1-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-OpenO1-SFT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1
35e7781b9dff5aea29576709201d641e5f44440d
21.40041
apache-2.0
1
8.03
true
false
false
true
1.462856
0.512404
51.240376
0.478745
26.03429
0.152568
15.256798
0.268456
2.46085
0.361813
5.726563
0.349152
27.683585
false
false
2024-12-28
2024-12-29
1
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 (Merge)
BlackBeenie_Llama-3.1-8B-pythonic-passthrough-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-pythonic-passthrough-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge
3ec46616f5b34821b3b928938931295f92e49213
7.399579
0
20.245
false
false
false
false
7.166581
0.231586
23.158553
0.345385
9.359905
0.011329
1.132931
0.268456
2.46085
0.377812
4.593229
0.133228
3.692007
false
false
2024-11-06
2024-11-06
1
BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge (Merge)
BlackBeenie_Neos-Gemma-2-9b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Gemma-2-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Gemma-2-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Gemma-2-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Gemma-2-9b
56dbbb4f972be887e5b57311a8a32e148e98d154
25.475663
apache-2.0
1
9.242
true
false
false
true
5.358184
0.587567
58.756655
0.550298
35.638851
0.098187
9.818731
0.322987
9.731544
0.36175
5.785417
0.398105
33.122784
false
false
2024-11-11
2024-11-11
1
BlackBeenie/Neos-Gemma-2-9b (Merge)
BlackBeenie_Neos-Llama-3.1-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Llama-3.1-8B
9b48520ec1a777be0f1fd88f95454d85ac568407
19.512177
apache-2.0
1
8.03
true
false
false
true
1.587734
0.494394
49.439376
0.4425
21.080123
0.132175
13.217523
0.268456
2.46085
0.37499
5.740365
0.326213
25.134826
false
false
2024-11-12
2024-11-12
1
BlackBeenie/Neos-Llama-3.1-8B (Merge)
BlackBeenie_Neos-Llama-3.1-base_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Llama-3.1-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Llama-3.1-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Llama-3.1-base
d4af4d73ba5fea0275fd1e3ba5102a79ac8009db
3.968795
0
4.65
false
false
false
true
2.818569
0.175082
17.508212
0.293034
2.221447
0
0
0.237416
0
0.349906
2.838281
0.111203
1.244829
false
false
2024-11-11
2024-11-11
0
BlackBeenie/Neos-Llama-3.1-base
BlackBeenie_Neos-Phi-3-14B-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Phi-3-14B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Phi-3-14B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Phi-3-14B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Phi-3-14B-v0.1
0afb7cc74a94f11f2695dc92788cdc6e28325f9c
27.032307
apache-2.0
0
13.96
true
false
false
true
1.819252
0.402245
40.224493
0.621193
46.631387
0.178248
17.824773
0.305369
7.38255
0.412542
10.534375
0.456366
39.596262
false
false
2024-11-27
2024-11-27
1
BlackBeenie/Neos-Phi-3-14B-v0.1 (Merge)
BlackBeenie_llama-3-luminous-merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3-luminous-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3-luminous-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3-luminous-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/llama-3-luminous-merged
64288dd8e3305f2dc11d84fe0c653f351b2e8a9d
21.618577
0
8.03
false
false
false
false
1.527707
0.432345
43.234507
0.515392
30.643687
0.086858
8.685801
0.292785
5.704698
0.414896
10.628646
0.377327
30.814125
false
false
2024-09-15
2024-10-11
1
BlackBeenie/llama-3-luminous-merged (Merge)
BlackBeenie_llama-3.1-8B-Galore-openassistant-guanaco_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3.1-8B-Galore-openassistant-guanaco-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
828fa03c10e9085700b7abbe26f95067fab010fd
18.374215
1
8.03
false
false
false
false
1.71364
0.263484
26.348422
0.521337
31.444705
0.066465
6.646526
0.300336
6.711409
0.440625
14.578125
0.320645
24.516105
false
false
2024-10-16
2024-10-19
0
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
Bllossom_llama-3.2-Korean-Bllossom-AICA-5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MllamaForConditionalGeneration
<a target="_blank" href="https://huggingface.co/Bllossom/llama-3.2-Korean-Bllossom-AICA-5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Bllossom/llama-3.2-Korean-Bllossom-AICA-5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Bllossom__llama-3.2-Korean-Bllossom-AICA-5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Bllossom/llama-3.2-Korean-Bllossom-AICA-5B
4672b7de38c2cc390b146d6b6ce7a6dd295d8a0e
19.012852
llama3.2
70
5.199
true
false
false
true
1.220236
0.51725
51.724979
0.429307
18.650223
0.123867
12.386707
0.298658
6.487696
0.383396
5.824479
0.271027
19.003029
false
false
2024-12-12
2024-12-16
0
Bllossom/llama-3.2-Korean-Bllossom-AICA-5B
BoltMonkey_DreadMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/DreadMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/DreadMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__DreadMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/DreadMix
ab5dbaaff606538db73b6fd89aa169760104a566
28.761732
1
8.03
false
false
false
true
2.419428
0.709491
70.949082
0.54351
34.845015
0.155589
15.558912
0.299497
6.599553
0.421219
13.61901
0.378989
30.998818
false
false
2024-10-12
2024-10-13
1
BoltMonkey/DreadMix (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
27.776634
llama3.1
2
8.03
true
false
false
true
2.486203
0.799891
79.989096
0.515199
30.7599
0.119335
11.933535
0.28104
4.138702
0.401875
9.467708
0.373338
30.370863
true
false
2024-10-01
2024-10-10
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
21.345511
llama3.1
2
8.03
true
false
false
false
0.774319
0.459023
45.902317
0.518544
30.793785
0.093656
9.365559
0.274329
3.243848
0.40826
9.532552
0.363115
29.235003
true
false
2024-10-01
2024-10-01
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_SuperNeuralDreadDevil-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/SuperNeuralDreadDevil-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/SuperNeuralDreadDevil-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__SuperNeuralDreadDevil-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/SuperNeuralDreadDevil-8b
804d5864127e603abec179a159b43f446246fafc
27.111055
1
8.03
false
false
false
true
3.281416
0.77099
77.098986
0.52862
32.612158
0.0929
9.29003
0.291946
5.592841
0.397687
8.310938
0.367852
29.761377
false
false
2024-10-13
2024-10-13
1
BoltMonkey/SuperNeuralDreadDevil-8b (Merge)
BrainWave-ML_llama3.2-3B-maths-orpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BrainWave-ML/llama3.2-3B-maths-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BrainWave-ML/llama3.2-3B-maths-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BrainWave-ML__llama3.2-3B-maths-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BrainWave-ML/llama3.2-3B-maths-orpo
d149d83d8e8f3883421d800848fec85766181923
5.076083
apache-2.0
2
3
true
false
false
false
1.414438
0.204907
20.490742
0.291178
2.347041
0
0
0.259228
1.230425
0.357531
4.52474
0.116772
1.863549
false
false
2024-10-24
2024-10-24
2
meta-llama/Llama-3.2-3B-Instruct
BramVanroy_GEITje-7B-ultra_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/GEITje-7B-ultra" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/GEITje-7B-ultra</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__GEITje-7B-ultra-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/GEITje-7B-ultra
d4552cdc6f015754646464d8411aa4f6bcdba8e8
11.022899
cc-by-nc-4.0
43
7.242
true
false
false
true
1.239046
0.372344
37.234427
0.377616
12.879913
0.015861
1.586103
0.262584
1.677852
0.328979
1.522396
0.20113
11.236702
false
false
2024-01-27
2024-10-28
3
mistralai/Mistral-7B-v0.1
BramVanroy_fietje-2_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2
3abe75d01094b713368e3d911ffb78a2d66ead22
9.1403
mit
9
2.78
true
false
false
false
0.625077
0.209803
20.980332
0.403567
15.603676
0.015861
1.586103
0.254195
0.559284
0.369563
5.161979
0.198554
10.950428
false
false
2024-04-09
2024-10-28
1
microsoft/phi-2
BramVanroy_fietje-2-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2-chat
364e785d90438b787b94e33741a930c9932353c0
10.615455
mit
5
2.775
true
false
false
true
0.798065
0.291736
29.173593
0.414975
17.718966
0.018882
1.888218
0.239933
0
0.35276
3.195052
0.205452
11.716903
false
false
2024-04-29
2024-10-28
3
microsoft/phi-2
BramVanroy_fietje-2-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2-instruct
b7b44797cd52eda1182667217e8371dbdfee4976
10.485718
mit
3
2.775
true
false
false
true
0.64879
0.278996
27.89964
0.413607
17.57248
0.022659
2.265861
0.233221
0
0.336917
2.914583
0.210356
12.261746
false
false
2024-04-27
2024-10-28
2
microsoft/phi-2
CYFRAGOVPL_Llama-PLLuM-8B-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CYFRAGOVPL/Llama-PLLuM-8B-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CYFRAGOVPL__Llama-PLLuM-8B-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CYFRAGOVPL/Llama-PLLuM-8B-base
d24d927d110786097dba5512d12f9311c49c4ddc
14.518877
llama3.1
0
8.03
true
false
false
false
0.723932
0.289887
28.98875
0.432045
20.218738
0.036254
3.625378
0.285235
4.697987
0.397031
10.06224
0.275682
19.520168
false
false
2025-02-07
2025-02-28
0
CYFRAGOVPL/Llama-PLLuM-8B-base
CYFRAGOVPL_Llama-PLLuM-8B-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CYFRAGOVPL/Llama-PLLuM-8B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CYFRAGOVPL/Llama-PLLuM-8B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CYFRAGOVPL__Llama-PLLuM-8B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CYFRAGOVPL/Llama-PLLuM-8B-chat
b6bb9172444c07e51ad16e8092d0f23e864a8956
14.614817
llama3.1
2
8.03
true
false
false
true
0.624755
0.351486
35.148628
0.407707
16.279057
0.033988
3.398792
0.264262
1.901566
0.419917
11.85625
0.271941
19.10461
false
false
2025-02-07
2025-02-28
0
CYFRAGOVPL/Llama-PLLuM-8B-chat
CYFRAGOVPL_PLLuM-12B-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/CYFRAGOVPL/PLLuM-12B-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CYFRAGOVPL/PLLuM-12B-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CYFRAGOVPL__PLLuM-12B-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CYFRAGOVPL/PLLuM-12B-base
f8770799a0f1f32861a19ae190933ed475fe5488
14.667469
apache-2.0
0
12.248
true
false
false
false
0.893573
0.282094
28.209373
0.43906
21.240802
0.028701
2.870091
0.290268
5.369128
0.41424
10.979948
0.274019
19.335476
false
false
2025-02-07
2025-02-28
0
CYFRAGOVPL/PLLuM-12B-base
CYFRAGOVPL_PLLuM-12B-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/CYFRAGOVPL/PLLuM-12B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CYFRAGOVPL/PLLuM-12B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CYFRAGOVPL__PLLuM-12B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CYFRAGOVPL/PLLuM-12B-chat
3a0b90db5f90490eb05e757355b711b59fca7b95
15.348387
apache-2.0
2
12.248
true
false
false
true
0.780701
0.321436
32.143601
0.44458
21.319738
0.018127
1.812689
0.260067
1.342282
0.411479
14.668229
0.287234
20.803783
false
false
2025-02-07
2025-02-28
0
CYFRAGOVPL/PLLuM-12B-chat
CYFRAGOVPL_PLLuM-12B-nc-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CYFRAGOVPL/PLLuM-12B-nc-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CYFRAGOVPL__PLLuM-12B-nc-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CYFRAGOVPL/PLLuM-12B-nc-base
28b2cf24ee46567c55939ec12edc1bf17d445adb
11.421222
cc-by-nc-4.0
0
12.248
true
false
false
false
0.926213
0.240453
24.045311
0.427676
19.387665
0.021903
2.190332
0.270134
2.684564
0.36451
2.897135
0.255901
17.322326
false
false
2025-02-07
2025-02-28
0
CYFRAGOVPL/PLLuM-12B-nc-base
CYFRAGOVPL_PLLuM-12B-nc-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/CYFRAGOVPL/PLLuM-12B-nc-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CYFRAGOVPL/PLLuM-12B-nc-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CYFRAGOVPL__PLLuM-12B-nc-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CYFRAGOVPL/PLLuM-12B-nc-chat
5b29326cb35c193039ea219183f7e606ec7acda9
14.598343
cc-by-nc-4.0
5
12.248
true
false
false
true
0.835072
0.283442
28.344238
0.457643
23.008554
0.012085
1.208459
0.282718
4.362416
0.435354
12.919271
0.259724
17.747119
false
false
2025-02-07
2025-02-28
0
CYFRAGOVPL/PLLuM-12B-nc-chat
CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CarrotAI__Llama-3.2-Rabbit-Ko-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct
5be46c768d800447b82de41fdc9df2f8c43ba3c0
23.509451
llama3.2
8
3.213
true
false
false
true
1.135907
0.719882
71.988213
0.442672
21.49731
0.205438
20.543807
0.270973
2.796421
0.364917
3.98125
0.282247
20.249704
false
false
2024-09-30
2024-12-20
1
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct (Merge)
CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct-2412_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CarrotAI__Llama-3.2-Rabbit-Ko-3B-Instruct-2412-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412
ac6f1c0b756412163e17cb05d9e2f7ced274dc12
20.301755
llama3.2
4
3.213
true
false
false
false
1.287179
0.478182
47.818233
0.435772
20.17568
0.175982
17.598187
0.292785
5.704698
0.387208
6.801042
0.313414
23.712692
false
false
2024-12-03
2024-12-19
1
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 (Merge)
Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Casual-Autopsy__L3-Umbral-Mind-RP-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B
b46c066ea8387264858dc3461f382e7b42fd9c48
25.899339
llama3
15
8.03
true
false
false
true
1.97677
0.712263
71.226346
0.526241
32.486278
0.109517
10.951662
0.286913
4.9217
0.368667
5.55
0.37234
30.260047
true
false
2024-06-26
2024-07-02
1
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B (Merge)
CausalLM_14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/14B
cc054cf5953252d0709cb3267d1a85246e489e95
17.23558
wtfpl
302
14
true
false
false
false
1.992829
0.278821
27.882131
0.470046
24.780943
0.075529
7.55287
0.302852
7.04698
0.415479
11.468229
0.322141
24.682329
false
true
2023-10-22
2024-06-12
0
CausalLM/14B
CausalLM_34b-beta_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/34b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/34b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__34b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/34b-beta
0429951eb30ccdfff3515e711aaa7649a8a7364c
23.297833
gpl-3.0
63
34.389
true
false
false
false
5.853193
0.304325
30.432475
0.5591
36.677226
0.048338
4.833837
0.346477
12.863535
0.374865
6.92474
0.532497
48.055186
false
true
2024-02-06
2024-06-26
0
CausalLM/34b-beta
CausalLM_preview-1-hf_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GlmForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/preview-1-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/preview-1-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__preview-1-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/preview-1-hf
08e1e1ab428a591e74d849ff30bd8766474205bf
16.706753
0
9.543
true
false
false
true
2.557498
0.555893
55.589281
0.361457
10.100941
0.030211
3.021148
0.261745
1.565996
0.342188
1.106771
0.359707
28.856383
false
true
2025-01-26
0
Removed
Changgil_K2S3-14b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-14b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-14b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-14b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-14b-v0.2
b4f0e1eed2640df2b75847ff37e6ebb1be217b6c
15.275785
cc-by-nc-4.0
0
14.352
true
false
false
false
3.249261
0.324284
32.428401
0.461331
24.283947
0.057402
5.740181
0.28104
4.138702
0.39226
6.799219
0.264378
18.264258
false
false
2024-06-17
2024-06-27
0
Changgil/K2S3-14b-v0.2
Changgil_K2S3-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-v0.1
d544e389f091983bb4f11314edb526d81753c919
14.839284
cc-by-nc-4.0
0
14.352
true
false
false
false
2.499765
0.327656
32.765617
0.465549
24.559558
0.046073
4.607251
0.264262
1.901566
0.401406
7.842448
0.256233
17.359264
false
false
2024-04-29
2024-06-27
0
Changgil/K2S3-v0.1
ClaudioItaly_Albacus_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Albacus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Albacus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Albacus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Albacus
a53faf62d0f99b67478ed9d262872c821a3ba83c
20.505574
mit
1
8.987
true
false
false
false
1.507878
0.466742
46.674158
0.511304
31.638865
0.070997
7.099698
0.271812
2.908277
0.413531
10.658073
0.316489
24.054374
true
false
2024-09-08
2024-09-08
1
ClaudioItaly/Albacus (Merge)
ClaudioItaly_Book-Gut12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Book-Gut12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Book-Gut12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Book-Gut12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Book-Gut12B
ae54351faca8170c93bf1de3a51bf16650f5bcf5
23.394098
mit
1
12.248
true
false
false
false
2.904496
0.399847
39.984685
0.541737
34.632193
0.101964
10.196375
0.307047
7.606264
0.463542
18.276042
0.367021
29.669031
true
false
2024-09-12
2024-09-17
1
ClaudioItaly/Book-Gut12B (Merge)
ClaudioItaly_Evolutionstory-7B-v2.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Evolutionstory-7B-v2.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Evolutionstory-7B-v2.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Evolutionstory-7B-v2.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Evolutionstory-7B-v2.2
9f838721d24a5195bed59a5ed8d9af536f7f2459
20.810835
mit
2
7.242
true
false
false
false
1.120464
0.481379
48.137941
0.510804
31.623865
0.070997
7.099698
0.275168
3.355705
0.413531
10.658073
0.315908
23.989731
true
false
2024-08-30
2024-09-01
1
ClaudioItaly/Evolutionstory-7B-v2.2 (Merge)
ClaudioItaly_intelligence-cod-rag-7b-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/intelligence-cod-rag-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/intelligence-cod-rag-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__intelligence-cod-rag-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/intelligence-cod-rag-7b-v3
2b21473c8a086f8d0c54b82c3454bf5499cdde3a
31.836966
mit
0
7.616
true
false
false
true
1.320945
0.689782
68.9782
0.536634
34.776159
0.380665
38.066465
0.272651
3.020134
0.415271
10.675521
0.419548
35.505319
true
false
2024-11-29
2024-12-02
1
ClaudioItaly/intelligence-cod-rag-7b-v3 (Merge)
CohereForAI_aya-23-35B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-35B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-35B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-35B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-35B
31d6fd858f20539a55401c7ad913086f54d9ca2c
24.755408
cc-by-nc-4.0
278
34.981
true
false
false
true
33.970634
0.646193
64.619321
0.539955
34.85836
0.034743
3.47432
0.294463
5.928412
0.43099
13.473698
0.335605
26.178339
false
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-35B
CohereForAI_aya-23-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-8B
ec151d218a24031eb039d92fb83d10445427efc9
16.010983
cc-by-nc-4.0
410
8.028
true
false
false
true
2.390344
0.469889
46.988878
0.429616
20.203761
0.016616
1.661631
0.284396
4.58613
0.394063
8.424479
0.227809
14.20102
false
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-8B
CohereForAI_aya-expanse-32b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-32b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-expanse-32b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-32b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-expanse-32b
08b69cfa4240e2009c80ad304f000b491d1b8c38
29.71851
cc-by-nc-4.0
241
32.296
true
false
false
true
11.03547
0.730174
73.017372
0.564867
38.709611
0.153323
15.332326
0.325503
10.067114
0.387271
6.408854
0.412982
34.775783
false
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-32b
CohereForAI_aya-expanse-8b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-expanse-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-expanse-8b
b9848575c8731981dfcf2e1f3bfbcb917a2e585d
22.406574
cc-by-nc-4.0
352
8.028
true
false
false
true
2.339378
0.635852
63.585176
0.49772
28.523483
0.086103
8.610272
0.302852
7.04698
0.372885
4.410677
0.300366
22.262855
false
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-8b
CohereForAI_c4ai-command-r-plus_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus
fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca
30.936071
cc-by-nc-4.0
1715
103.811
true
false
false
true
57.263063
0.766419
76.641866
0.581542
39.919954
0.08006
8.006042
0.305369
7.38255
0.480719
20.423177
0.399186
33.242834
false
true
2024-04-03
2024-06-13
0
CohereForAI/c4ai-command-r-plus
CohereForAI_c4ai-command-r-plus-08-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus-08-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus-08-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-08-2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus-08-2024
2d8cf3ab0af78b9e43546486b096f86adf3ba4d0
33.647475
cc-by-nc-4.0
250
103.811
true
false
false
true
44.637753
0.753954
75.395395
0.5996
42.836865
0.123867
12.386707
0.350671
13.422819
0.482948
19.835156
0.442071
38.007905
false
true
2024-08-21
2024-09-19
0
CohereForAI/c4ai-command-r-plus-08-2024
CohereForAI_c4ai-command-r-v01_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-v01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-v01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-v01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-v01
16881ccde1c68bbc7041280e6a66637bc46bfe88
25.929032
cc-by-nc-4.0
1080
34.981
true
false
false
true
26.790875
0.674819
67.481948
0.540642
34.556659
0.034743
3.47432
0.307047
7.606264
0.451698
16.128906
0.336935
26.326093
false
true
2024-03-11
2024-06-13
0
CohereForAI/c4ai-command-r-v01
CohereForAI_c4ai-command-r7b-12-2024_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Cohere2ForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r7b-12-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r7b-12-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r7b-12-2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r7b-12-2024
a9650f3bda8b0e00825ee36592e086b4ee621102
31.617529
cc-by-nc-4.0
375
8.028
true
false
false
true
4.909614
0.771315
77.131456
0.550264
36.024564
0.299094
29.909366
0.308725
7.829978
0.41251
10.230469
0.357214
28.579344
false
true
2024-12-11
2024-12-20
0
CohereForAI/c4ai-command-r7b-12-2024
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.483995
0
2.506
false
false
false
true
0.979648
0.327831
32.783127
0.391996
14.585976
0.043051
4.305136
0.249161
0
0.41201
9.834635
0.166556
7.395095
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.262093
0
2.506
false
false
false
true
1.989138
0.310246
31.02457
0.388103
14.243046
0.053625
5.362538
0.253356
0.447427
0.408073
9.109115
0.166473
7.38586
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-odpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-odpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-odpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
090d9f59c3b47ab8dd099ddd278c058aa6d2d529
11.897379
4
2.506
false
false
false
true
1.924136
0.306649
30.664858
0.389584
14.023922
0.069486
6.94864
0.24245
0
0.427917
12.05625
0.169215
7.690603
false
false
2024-06-28
2024-07-13
0
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-sft-v1.0
44d6f26fa7e3b0d238064d844569bf8a07b7515e
12.60325
0
2.506
false
false
false
true
1.921618
0.369247
36.924693
0.387878
14.117171
0.067976
6.797583
0.255872
0.782998
0.40274
8.309115
0.178191
8.687943
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-sft-v1.0
Columbia-NLP_LION-LLaMA-3-8b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
3cddd4a6f5939a0a4db1092a0275342b7b9912f3
21.785404
2
8.03
false
false
false
true
1.393698
0.495742
49.574241
0.502848
30.356399
0.117069
11.706949
0.28104
4.138702
0.409719
10.28151
0.321892
24.654625
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-odpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
e2cec0d68a67092951e9205dfe634a59f2f4a2dd
19.853208
2
8.03
false
false
false
true
1.437394
0.396799
39.679938
0.502393
30.457173
0.106495
10.649547
0.285235
4.697987
0.40575
9.71875
0.315243
23.915854
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
822eddb2fd127178d9fb7bb9f4fca0e93ada2836
20.748862
0
8.03
false
false
false
true
1.507226
0.381712
38.171164
0.508777
30.88426
0.114048
11.404834
0.277685
3.691275
0.450271
15.483854
0.32372
24.857787
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
CombinHorizon_Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES
881729709fbf263b75e0f7341b66b5a880b82d11
41.765081
apache-2.0
2
14.77
true
false
false
true
3.330704
0.823996
82.399589
0.637009
48.19595
0.531722
53.172205
0.324664
9.955257
0.426031
12.653906
0.497922
44.213579
true
false
2024-12-07
2024-12-07
1
CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES
52d6f6308eba9c3a0b9116706fbb1ddc448e6101
35.36673
apache-2.0
1
7.616
true
false
false
true
2.091122
0.756402
75.64019
0.540209
34.95407
0.493202
49.320242
0.297819
6.375839
0.403302
8.779427
0.434176
37.130615
true
false
2024-10-29
2024-10-29
1
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_YiSM-blossom5.1-34B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/YiSM-blossom5.1-34B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/YiSM-blossom5.1-34B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__YiSM-blossom5.1-34B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/YiSM-blossom5.1-34B-SLERP
ebd8d6507623008567a0548cd0ff9e28cbd6a656
31.37993
apache-2.0
0
34.389
true
false
false
true
6.141628
0.503311
50.331121
0.620755
46.397613
0.215257
21.52568
0.355705
14.09396
0.441344
14.367969
0.474069
41.563239
true
false
2024-08-27
2024-08-27
1
CombinHorizon/YiSM-blossom5.1-34B-SLERP (Merge)
CombinHorizon_huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES
3284c32f13733d1cd17c723ed754f2c01b65a15c
45.657847
apache-2.0
2
32.764
true
false
false
true
26.000843
0.820624
82.062372
0.692925
56.044782
0.594411
59.441088
0.338926
11.856823
0.420729
12.091146
0.572058
52.450872
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES
d92237b4b4deccb92a72b5209c79978f09fe3f08
41.466211
apache-2.0
2
14.77
true
false
false
true
3.334259
0.817576
81.757625
0.633589
47.767346
0.547583
54.758308
0.314597
8.612975
0.426031
12.453906
0.491024
43.447104
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES
d976a5d6768d54c5e59a88fe63238a055c30c06a
46.763621
apache-2.0
14
32.764
true
false
false
true
7.366635
0.832814
83.28136
0.695517
56.827407
0.585347
58.534743
0.36745
15.659955
0.431396
14.224479
0.568484
52.053783
true
false
2024-12-07
2024-12-20
1
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
ContactDoctor_Bio-Medical-3B-CoT-012025_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ContactDoctor/Bio-Medical-3B-CoT-012025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ContactDoctor/Bio-Medical-3B-CoT-012025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-3B-CoT-012025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ContactDoctor/Bio-Medical-3B-CoT-012025
37e0ac4b64a82964af3b33324629324cbcbf7cda
18.730711
other
10
3.085
true
false
false
false
1.599689
0.360379
36.037935
0.438315
22.263528
0.221299
22.129909
0.30453
7.270694
0.33676
3.195052
0.293384
21.487145
false
false
2025-01-06
2025-01-15
2
Qwen/Qwen2.5-3B
ContactDoctor_Bio-Medical-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ContactDoctor/Bio-Medical-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ContactDoctor/Bio-Medical-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ContactDoctor/Bio-Medical-Llama-3-8B
5436cda92c65b0ef520d278d864305c0f429824b
19.917453
other
73
4.015
true
false
false
false
1.235117
0.442237
44.22366
0.486312
26.195811
0.067221
6.722054
0.333893
11.185682
0.351396
1.757812
0.364777
29.419696
false
false
2024-08-09
2024-12-24
1
meta-llama/Meta-Llama-3-8B-Instruct
CoolSpring_Qwen2-0.5B-Abyme_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme
a48b7c04b854e5c60fe3464f96904bfc53c8640c
4.999994
apache-2.0
0
0.494
true
false
false
true
2.355595
0.191519
19.15185
0.286183
2.276484
0.029456
2.945619
0.253356
0.447427
0.354219
1.477344
0.133311
3.701241
false
false
2024-07-18
2024-09-04
1
Qwen/Qwen2-0.5B
CoolSpring_Qwen2-0.5B-Abyme-merge2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge2
02c4c601453f7ecbfab5c95bf5afa889350026ba
6.320258
apache-2.0
0
0.63
true
false
false
true
1.219391
0.202185
20.218465
0.299427
3.709041
0.033233
3.323263
0.260067
1.342282
0.368729
3.891146
0.148936
5.437352
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge2 (Merge)
CoolSpring_Qwen2-0.5B-Abyme-merge3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge3
86fed893893cc2a6240f0ea09ce2eeda1a5178cc
6.820196
apache-2.0
1
0.63
true
false
false
true
1.220343
0.238605
23.860468
0.300314
4.301149
0.031722
3.172205
0.264262
1.901566
0.350094
2.128385
0.150017
5.557402
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge3 (Merge)
Corianas_Neural-Mistral-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/Neural-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Neural-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Neural-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Neural-Mistral-7B
cde6f0126310f38b6781cc26cdb9a02416b896b9
18.200439
apache-2.0
0
7.242
true
false
false
true
0.923427
0.548924
54.892352
0.442802
22.431163
0.018882
1.888218
0.283557
4.474273
0.387271
6.208854
0.27377
19.307772
false
false
2024-03-05
2024-12-06
0
Corianas/Neural-Mistral-7B
Corianas_Quokka_2.7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/Corianas/Quokka_2.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Quokka_2.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Quokka_2.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Quokka_2.7b
d9b3274662c2ac6c6058daac90504b5a8ebcac3c
4.99525
apache-2.0
0
2.786
true
false
false
false
0.587383
0.174907
17.490702
0.305547
3.165268
0.008308
0.830816
0.255872
0.782998
0.390833
6.0875
0.114528
1.614214
false
false
2023-03-30
2024-12-05
0
Corianas/Quokka_2.7b
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/llama-3-reactor" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/llama-3-reactor</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
13.99547
apache-2.0
2
-1
true
false
false
false
1.64233
0.230012
23.001192
0.445715
21.88856
0.046828
4.682779
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
CortexLM_btlm-7b-base-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CortexLM/btlm-7b-base-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CortexLM/btlm-7b-base-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CortexLM__btlm-7b-base-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CortexLM/btlm-7b-base-v0.2
eda8b4298365a26c8981316e09427c237b11217f
8.920255
mit
1
6.885
true
false
false
false
1.422717
0.148329
14.832866
0.400641
16.193277
0.015106
1.510574
0.253356
0.447427
0.384604
5.542188
0.234957
14.995198
false
false
2024-06-13
2024-06-26
0
CortexLM/btlm-7b-base-v0.2
Cran-May_SCE-2-24B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/SCE-2-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/SCE-2-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__SCE-2-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/SCE-2-24B
6a477b347fa6c0ce76bcaf353ddc282dd1cc75c3
31.95154
0
23.572
false
false
false
true
2.704235
0.586592
58.659246
0.626469
46.325746
0.189577
18.957704
0.337248
11.63311
0.452813
16.001562
0.461187
40.131871
false
false
2025-02-03
2025-02-04
1
Cran-May/SCE-2-24B (Merge)
Cran-May_SCE-3-24B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/SCE-3-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/SCE-3-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__SCE-3-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/SCE-3-24B
bf2b658dd404c423228e7001498bd69c2d147da2
30.62043
0
23.572
false
false
false
true
2.36317
0.546525
54.652544
0.597283
42.278565
0.188066
18.806647
0.346477
12.863535
0.443479
14.601562
0.464678
40.519725
false
false
2025-02-03
2025-02-04
1
Cran-May/SCE-3-24B (Merge)
Cran-May_T.E-8.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/T.E-8.1
5f84709710dcce7cc05fa12473e8bb207fe25849
35.699515
cc-by-nc-sa-4.0
3
7.616
true
false
false
true
2.181266
0.707692
70.769226
0.558175
37.024377
0.445619
44.561934
0.312919
8.389262
0.450521
15.315104
0.443235
38.13719
false
false
2024-09-27
2024-09-29
1
Cran-May/T.E-8.1 (Merge)
Cran-May_merge_model_20250308_2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/merge_model_20250308_2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/merge_model_20250308_2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__merge_model_20250308_2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/merge_model_20250308_2
07c1b6130747a2508a806d3b8df13e221312c713
40.257534
0
14.766
false
false
false
false
2.021195
0.593237
59.323706
0.658531
50.995698
0.438066
43.806647
0.39094
18.791946
0.479354
19.519271
0.541971
49.107934
false
false
2025-03-08
2025-03-08
1
Cran-May/merge_model_20250308_2 (Merge)
Cran-May_merge_model_20250308_3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/merge_model_20250308_3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/merge_model_20250308_3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__merge_model_20250308_3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/merge_model_20250308_3
1062ad627bd1823458a3b1b22f434941dea75f35
33.229599
0
14.766
false
false
false
false
2.02524
0.60178
60.177994
0.627146
46.568545
0.254532
25.453172
0.322148
9.619687
0.432042
13.538542
0.496177
44.019651
false
false
2025-03-08
2025-03-08
1
Cran-May/merge_model_20250308_3 (Merge)
Cran-May_merge_model_20250308_4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/merge_model_20250308_4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/merge_model_20250308_4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__merge_model_20250308_4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/merge_model_20250308_4
5f8c7b1c49d946c3feb028f138add7efabde94e7
37.569694
0
14.766
false
false
false
false
1.978774
0.453952
45.395218
0.666435
52.023707
0.41994
41.993958
0.397651
19.686801
0.468813
17.801563
0.536652
48.516918
false
false
2025-03-08
2025-03-08
1
Cran-May/merge_model_20250308_4 (Merge)
Cran-May_tempmotacilla-cinerea-0308_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/tempmotacilla-cinerea-0308" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/tempmotacilla-cinerea-0308</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__tempmotacilla-cinerea-0308-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/tempmotacilla-cinerea-0308
36ba31b21cfe855ceb4dad313a5cf2b98616038c
43.640555
1
14.766
false
false
false
true
3.642011
0.808484
80.848371
0.655096
50.598949
0.555136
55.513595
0.362416
14.988814
0.420823
12.669531
0.525017
47.224069
false
false
2025-03-08
2025-03-09
1
Cran-May/tempmotacilla-cinerea-0308 (Merge)
CreitinGameplays_Llama-3.1-8B-R1-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CreitinGameplays/Llama-3.1-8B-R1-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CreitinGameplays/Llama-3.1-8B-R1-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CreitinGameplays__Llama-3.1-8B-R1-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CreitinGameplays/Llama-3.1-8B-R1-v0.1
48af6b20168c9e157b33a7a005b6515901c93b0e
10.034842
mit
0
8.03
true
false
false
true
0.745944
0.323485
32.348502
0.305749
3.215981
0.181269
18.126888
0.258389
1.118568
0.362156
2.602865
0.125166
2.796247
false
false
2025-02-19
2025-03-10
1
CreitinGameplays/Llama-3.1-8B-R1-v0.1 (Merge)
CultriX_Qwen2.5-14B-Broca_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Broca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Broca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Broca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Broca
51204ee25a629abfd6d5e77a850b5e7a36c78462
37.924501
0
14.766
false
false
false
false
4.154003
0.560414
56.041415
0.652715
50.034412
0.358006
35.800604
0.386745
18.232662
0.476656
18.948698
0.536403
48.489214
false
false
2024-12-23
0
Removed