Column schema:

| Column | Dtype | Range / distinct values |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string (categorical) | 3 values |
| Type | string (categorical) | 6 values |
| T | string (categorical) | 6 values |
| Weight type | string (categorical) | 3 values |
| Architecture | string (categorical) | 56 values |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 1.03–52 |
| Hub License | string (categorical) | 25 values |
| Hub ❤️ | int64 | 0–5.93k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.27–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string (categorical) | 441 values |
| Submission Date | string (categorical) | 185 values |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
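The `Average ⬆️` column is consistent with the unweighted mean of the six scaled benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal sketch, checked against the BoltMonkey/DreadMix row:

```python
# Six scaled benchmark scores for BoltMonkey/DreadMix, copied from its row.
scores = {
    "IFEval": 70.949082,
    "BBH": 34.845015,
    "MATH Lvl 5": 14.954683,
    "GPQA": 6.599553,
    "MUSR": 13.619010,
    "MMLU-PRO": 30.998818,
}

# Average ⬆️ appears to be the plain mean of the six scaled scores.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 28.661027, matching the row's Average ⬆️
```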
Benchmark columns below show the scaled score with the raw score in parentheses. The source's `Model` column repeats `fullname` as a link to the model page at `https://huggingface.co/{fullname}` plus a 📑 link to its evaluation details at `https://huggingface.co/datasets/open-llm-leaderboard/{fullname with / replaced by __}-details`; `T` repeats the emoji from `Type`. The final row is truncated in the source, so its trailing cells are left empty.

| eval_name | Precision | Type | Weight type | Architecture | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| BoltMonkey_DreadMix_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | BoltMonkey/DreadMix | ab5dbaaff606538db73b6fd89aa169760104a566 | 28.661027 | | 0 | 8.03 | false | false | false | true | 1.614205 | 70.949082 (0.709491) | 34.845015 (0.54351) | 14.954683 (0.149547) | 6.599553 (0.299497) | 13.61901 (0.421219) | 30.998818 (0.378989) | false | false | 2024-10-12 | 2024-10-13 | 1 | BoltMonkey/DreadMix (Merge) |
| BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated | 969e4c9b41e733a367f5ea18ed50a6171b5e2357 | 27.726282 | llama3.1 | 2 | 8.03 | true | false | false | true | 1.640513 | 79.989096 (0.799891) | 30.7599 (0.515199) | 11.63142 (0.116314) | 4.138702 (0.28104) | 9.467708 (0.401875) | 30.370863 (0.373338) | true | false | 2024-10-01 | 2024-10-10 | 1 | BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge) |
| BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated | 969e4c9b41e733a367f5ea18ed50a6171b5e2357 | 21.345511 | llama3.1 | 2 | 8.03 | true | false | false | false | 0.774319 | 45.902317 (0.459023) | 30.793785 (0.518544) | 9.365559 (0.093656) | 3.243848 (0.274329) | 9.532552 (0.40826) | 29.235003 (0.363115) | true | false | 2024-10-01 | 2024-10-01 | 1 | BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge) |
| BoltMonkey_SuperNeuralDreadDevil-8b_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | BoltMonkey/SuperNeuralDreadDevil-8b | 804d5864127e603abec179a159b43f446246fafc | 21.847726 | | 1 | 8.03 | false | false | false | true | 2.405331 | 48.580101 (0.485801) | 30.606714 (0.515108) | 9.063444 (0.090634) | 4.697987 (0.285235) | 10.426823 (0.415948) | 27.711288 (0.349402) | false | false | 2024-10-13 | 2024-10-13 | 1 | BoltMonkey/SuperNeuralDreadDevil-8b (Merge) |
| BrainWave-ML_llama3.2-3B-maths-orpo_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | BrainWave-ML/llama3.2-3B-maths-orpo | d149d83d8e8f3883421d800848fec85766181923 | 5.076083 | apache-2.0 | 2 | 3 | true | false | false | false | 0.707219 | 20.490742 (0.204907) | 2.347041 (0.291178) | 0 (0) | 1.230425 (0.259228) | 4.52474 (0.357531) | 1.863549 (0.116772) | false | false | 2024-10-24 | 2024-10-24 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| BramVanroy_GEITje-7B-ultra_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | BramVanroy/GEITje-7B-ultra | d4552cdc6f015754646464d8411aa4f6bcdba8e8 | 10.909606 | cc-by-nc-4.0 | 37 | 7.242 | true | false | false | true | 0.619523 | 37.234427 (0.372344) | 12.879913 (0.377616) | 0.906344 (0.009063) | 1.677852 (0.262584) | 1.522396 (0.328979) | 11.236702 (0.20113) | false | false | 2024-01-27 | 2024-10-28 | 3 | mistralai/Mistral-7B-v0.1 |
| BramVanroy_fietje-2_bfloat16 | bfloat16 | 🟩 continuously pretrained | Original | PhiForCausalLM | BramVanroy/fietje-2 | 3abe75d01094b713368e3d911ffb78a2d66ead22 | 9.027007 | mit | 6 | 2.78 | true | false | false | false | 0.312539 | 20.980332 (0.209803) | 15.603676 (0.403567) | 0.906344 (0.009063) | 0.559284 (0.254195) | 5.161979 (0.369563) | 10.950428 (0.198554) | false | false | 2024-04-09 | 2024-10-28 | 1 | microsoft/phi-2 |
| BramVanroy_fietje-2-chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | PhiForCausalLM | BramVanroy/fietje-2-chat | 364e785d90438b787b94e33741a930c9932353c0 | 10.388869 | mit | 1 | 2.775 | true | false | false | true | 0.399033 | 29.173593 (0.291736) | 17.718966 (0.414975) | 0.528701 (0.005287) | 0 (0.239933) | 3.195052 (0.35276) | 11.716903 (0.205452) | false | false | 2024-04-29 | 2024-10-28 | 3 | microsoft/phi-2 |
| BramVanroy_fietje-2-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | PhiForCausalLM | BramVanroy/fietje-2-instruct | b7b44797cd52eda1182667217e8371dbdfee4976 | 10.196192 | mit | 2 | 2.775 | true | false | false | true | 0.324395 | 27.89964 (0.278996) | 17.57248 (0.413607) | 0.528701 (0.005287) | 0 (0.233221) | 2.914583 (0.336917) | 12.261746 (0.210356) | false | false | 2024-04-27 | 2024-10-28 | 2 | microsoft/phi-2 |
| CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct | 5be46c768d800447b82de41fdc9df2f8c43ba3c0 | 20.500891 | llama3.2 | 8 | 1.606 | true | false | false | true | 0.567954 | 71.988213 (0.719882) | 21.49731 (0.442672) | 2.492447 (0.024924) | 2.796421 (0.270973) | 3.98125 (0.364917) | 20.249704 (0.282247) | false | false | 2024-09-30 | 2024-12-20 | 1 | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct (Merge) |
| CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct-2412_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 | ac6f1c0b756412163e17cb05d9e2f7ced274dc12 | 20.238815 | llama3.2 | 0 | 1.606 | true | false | false | false | 0.643589 | 47.818233 (0.478182) | 20.17568 (0.435772) | 17.220544 (0.172205) | 5.704698 (0.292785) | 6.801042 (0.387208) | 23.712692 (0.313414) | false | false | 2024-12-03 | 2024-12-19 | 1 | CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 (Merge) |
| Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B | b46c066ea8387264858dc3461f382e7b42fd9c48 | 25.911927 | llama3 | 12 | 8.03 | true | false | false | true | 0.988385 | 71.226346 (0.712263) | 32.486278 (0.526241) | 11.02719 (0.110272) | 4.9217 (0.286913) | 5.55 (0.368667) | 30.260047 (0.37234) | true | false | 2024-06-26 | 2024-07-02 | 1 | Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B (Merge) |
| CausalLM_14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | CausalLM/14B | cc054cf5953252d0709cb3267d1a85246e489e95 | 16.643939 | wtfpl | 297 | 14 | true | false | false | false | 0.996414 | 27.882131 (0.278821) | 24.780943 (0.470046) | 4.003021 (0.04003) | 7.04698 (0.302852) | 11.468229 (0.415479) | 24.682329 (0.322141) | false | true | 2023-10-22 | 2024-06-12 | 0 | CausalLM/14B |
| CausalLM_34b-beta_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | CausalLM/34b-beta | 0429951eb30ccdfff3515e711aaa7649a8a7364c | 23.285245 | gpl-3.0 | 62 | 34.389 | true | false | false | false | 2.926596 | 30.432475 (0.304325) | 36.677226 (0.5591) | 4.758308 (0.047583) | 12.863535 (0.346477) | 6.92474 (0.374865) | 48.055186 (0.532497) | false | true | 2024-02-06 | 2024-06-26 | 0 | CausalLM/34b-beta |
| Changgil_K2S3-14b-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | Changgil/K2S3-14b-v0.2 | b4f0e1eed2640df2b75847ff37e6ebb1be217b6c | 15.187668 | cc-by-nc-4.0 | 0 | 14.352 | true | false | false | false | 1.62463 | 32.428401 (0.324284) | 24.283947 (0.461331) | 5.21148 (0.052115) | 4.138702 (0.28104) | 6.799219 (0.39226) | 18.264258 (0.264378) | false | false | 2024-06-17 | 2024-06-27 | 0 | Changgil/K2S3-14b-v0.2 |
| Changgil_K2S3-v0.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | Changgil/K2S3-v0.1 | d544e389f091983bb4f11314edb526d81753c919 | 14.839284 | cc-by-nc-4.0 | 0 | 14.352 | true | false | false | false | 1.249882 | 32.765617 (0.327656) | 24.559558 (0.465549) | 4.607251 (0.046073) | 1.901566 (0.264262) | 7.842448 (0.401406) | 17.359264 (0.256233) | false | false | 2024-04-29 | 2024-06-27 | 0 | Changgil/K2S3-v0.1 |
| ClaudioItaly_Albacus_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | ClaudioItaly/Albacus | a53faf62d0f99b67478ed9d262872c821a3ba83c | 20.492986 | mit | 1 | 8.987 | true | false | false | false | 0.753939 | 46.674158 (0.466742) | 31.638865 (0.511304) | 7.024169 (0.070242) | 2.908277 (0.271812) | 10.658073 (0.413531) | 24.054374 (0.316489) | true | false | 2024-09-08 | 2024-09-08 | 1 | ClaudioItaly/Albacus (Merge) |
| ClaudioItaly_Book-Gut12B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | ClaudioItaly/Book-Gut12B | ae54351faca8170c93bf1de3a51bf16650f5bcf5 | 23.343746 | mit | 1 | 12.248 | true | false | false | false | 1.452248 | 39.984685 (0.399847) | 34.632193 (0.541737) | 9.89426 (0.098943) | 7.606264 (0.307047) | 18.276042 (0.463542) | 29.669031 (0.367021) | true | false | 2024-09-12 | 2024-09-17 | 1 | ClaudioItaly/Book-Gut12B (Merge) |
| ClaudioItaly_Evolutionstory-7B-v2.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | ClaudioItaly/Evolutionstory-7B-v2.2 | 9f838721d24a5195bed59a5ed8d9af536f7f2459 | 20.798247 | mit | 1 | 7.242 | true | false | false | false | 0.560232 | 48.137941 (0.481379) | 31.623865 (0.510804) | 7.024169 (0.070242) | 3.355705 (0.275168) | 10.658073 (0.413531) | 23.989731 (0.315908) | true | false | 2024-08-30 | 2024-09-01 | 1 | ClaudioItaly/Evolutionstory-7B-v2.2 (Merge) |
| ClaudioItaly_intelligence-cod-rag-7b-v3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | ClaudioItaly/intelligence-cod-rag-7b-v3 | 2b21473c8a086f8d0c54b82c3454bf5499cdde3a | 27.129011 | mit | 0 | 7.616 | true | false | false | true | 0.660472 | 68.9782 (0.689782) | 34.776159 (0.536634) | 9.818731 (0.098187) | 3.020134 (0.272651) | 10.675521 (0.415271) | 35.505319 (0.419548) | true | false | 2024-11-29 | 2024-12-02 | 1 | ClaudioItaly/intelligence-cod-rag-7b-v3 (Merge) |
| CohereForAI_aya-23-35B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/aya-23-35B | 31d6fd858f20539a55401c7ad913086f54d9ca2c | 24.67988 | cc-by-nc-4.0 | 267 | 34.981 | true | false | false | true | 16.985317 | 64.619321 (0.646193) | 34.85836 (0.539955) | 3.021148 (0.030211) | 5.928412 (0.294463) | 13.473698 (0.43099) | 26.178339 (0.335605) | false | true | 2024-05-19 | 2024-06-12 | 0 | CohereForAI/aya-23-35B |
| CohereForAI_aya-23-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/aya-23-8B | ec151d218a24031eb039d92fb83d10445427efc9 | 15.998395 | cc-by-nc-4.0 | 396 | 8.028 | true | false | false | true | 1.195172 | 46.988878 (0.469889) | 20.203761 (0.429616) | 1.586103 (0.015861) | 4.58613 (0.284396) | 8.424479 (0.394063) | 14.20102 (0.227809) | false | true | 2024-05-19 | 2024-06-12 | 0 | CohereForAI/aya-23-8B |
| CohereForAI_aya-expanse-32b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/aya-expanse-32b | 08b69cfa4240e2009c80ad304f000b491d1b8c38 | 29.391219 | cc-by-nc-4.0 | 193 | 32.296 | true | false | false | true | 5.517735 | 73.017372 (0.730174) | 38.709611 (0.564867) | 13.36858 (0.133686) | 10.067114 (0.325503) | 6.408854 (0.387271) | 34.775783 (0.412982) | false | true | 2024-10-23 | 2024-10-24 | 0 | CohereForAI/aya-expanse-32b |
| CohereForAI_aya-expanse-8b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/aya-expanse-8b | b9848575c8731981dfcf2e1f3bfbcb917a2e585d | 22.142223 | cc-by-nc-4.0 | 307 | 8.028 | true | false | false | true | 1.169689 | 63.585176 (0.635852) | 28.523483 (0.49772) | 7.024169 (0.070242) | 7.04698 (0.302852) | 4.410677 (0.372885) | 22.262855 (0.300366) | false | true | 2024-10-23 | 2024-10-24 | 0 | CohereForAI/aya-expanse-8b |
| CohereForAI_c4ai-command-r-plus_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/c4ai-command-r-plus | fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca | 30.961247 | cc-by-nc-4.0 | 1,700 | 103.811 | true | false | false | true | 28.631532 | 76.641866 (0.766419) | 39.919954 (0.581542) | 8.1571 (0.081571) | 7.38255 (0.305369) | 20.423177 (0.480719) | 33.242834 (0.399186) | false | true | 2024-04-03 | 2024-06-13 | 0 | CohereForAI/c4ai-command-r-plus |
| CohereForAI_c4ai-command-r-plus-08-2024_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/c4ai-command-r-plus-08-2024 | 2d8cf3ab0af78b9e43546486b096f86adf3ba4d0 | 33.584534 | cc-by-nc-4.0 | 206 | 103.811 | true | false | false | true | 22.318877 | 75.395395 (0.753954) | 42.836865 (0.5996) | 12.009063 (0.120091) | 13.422819 (0.350671) | 19.835156 (0.482948) | 38.007905 (0.442071) | false | true | 2024-08-21 | 2024-09-19 | 0 | CohereForAI/c4ai-command-r-plus-08-2024 |
| CohereForAI_c4ai-command-r-v01_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | CohereForCausalLM | CohereForAI/c4ai-command-r-v01 | 16881ccde1c68bbc7041280e6a66637bc46bfe88 | 25.349978 | cc-by-nc-4.0 | 1,070 | 34.981 | true | false | false | true | 13.395437 | 67.481948 (0.674819) | 34.556659 (0.540642) | 0 (0) | 7.606264 (0.307047) | 16.128906 (0.451698) | 26.326093 (0.336935) | false | true | 2024-03-11 | 2024-06-13 | 0 | CohereForAI/c4ai-command-r-v01 |
| CohereForAI_c4ai-command-r7b-12-2024_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Cohere2ForCausalLM | CohereForAI/c4ai-command-r7b-12-2024 | a9650f3bda8b0e00825ee36592e086b4ee621102 | 31.07624 | cc-by-nc-4.0 | 307 | 8.028 | true | false | false | true | 2.454807 | 77.131456 (0.771315) | 36.024564 (0.550264) | 26.661631 (0.266616) | 7.829978 (0.308725) | 10.230469 (0.41251) | 28.579344 (0.357214) | false | true | 2024-12-11 | 2024-12-20 | 0 | CohereForAI/c4ai-command-r7b-12-2024 |
| Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | GemmaForCausalLM | Columbia-NLP/LION-Gemma-2b-dpo-v1.0 | a5f780075831374f8850324448acf94976dea504 | 11.483995 | | 0 | 2.506 | false | false | false | true | 0.979648 | 32.783127 (0.327831) | 14.585976 (0.391996) | 4.305136 (0.043051) | 0 (0.249161) | 9.834635 (0.41201) | 7.395095 (0.166556) | false | false | 2024-06-28 | 2024-07-04 | 0 | Columbia-NLP/LION-Gemma-2b-dpo-v1.0 |
| Columbia-NLP_LION-Gemma-2b-dpo-v1.0_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | GemmaForCausalLM | Columbia-NLP/LION-Gemma-2b-dpo-v1.0 | a5f780075831374f8850324448acf94976dea504 | 11.1488 | | 0 | 2.506 | false | false | false | true | 0.994569 | 31.02457 (0.310246) | 14.243046 (0.388103) | 4.682779 (0.046828) | 0.447427 (0.253356) | 9.109115 (0.408073) | 7.38586 (0.166473) | false | false | 2024-06-28 | 2024-07-04 | 0 | Columbia-NLP/LION-Gemma-2b-dpo-v1.0 |
| Columbia-NLP_LION-Gemma-2b-odpo-v1.0_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | GemmaForCausalLM | Columbia-NLP/LION-Gemma-2b-odpo-v1.0 | 090d9f59c3b47ab8dd099ddd278c058aa6d2d529 | 11.456795 | | 4 | 2.506 | false | false | false | true | 0.962068 | 30.664858 (0.306649) | 14.023922 (0.389584) | 4.305136 (0.043051) | 0 (0.24245) | 12.05625 (0.427917) | 7.690603 (0.169215) | false | false | 2024-06-28 | 2024-07-13 | 0 | Columbia-NLP/LION-Gemma-2b-odpo-v1.0 |
| Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | GemmaForCausalLM | Columbia-NLP/LION-Gemma-2b-sft-v1.0 | 44d6f26fa7e3b0d238064d844569bf8a07b7515e | 12.489957 | | 0 | 2.506 | false | false | false | true | 0.960809 | 36.924693 (0.369247) | 14.117171 (0.387878) | 6.117825 (0.061178) | 0.782998 (0.255872) | 8.309115 (0.40274) | 8.687943 (0.178191) | false | false | 2024-07-02 | 2024-07-04 | 0 | Columbia-NLP/LION-Gemma-2b-sft-v1.0 |
| Columbia-NLP_LION-LLaMA-3-8b-dpo-v1.0_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0 | 3cddd4a6f5939a0a4db1092a0275342b7b9912f3 | 21.470701 | | 2 | 8.03 | false | false | false | true | 0.696849 | 49.574241 (0.495742) | 30.356399 (0.502848) | 9.818731 (0.098187) | 4.138702 (0.28104) | 10.28151 (0.409719) | 24.654625 (0.321892) | false | false | 2024-06-28 | 2024-07-04 | 0 | Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0 |
| Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 | e2cec0d68a67092951e9205dfe634a59f2f4a2dd | 19.462976 | | 2 | 8.03 | false | false | false | true | 0.718697 | 39.679938 (0.396799) | 30.457173 (0.502393) | 8.308157 (0.083082) | 4.697987 (0.285235) | 9.71875 (0.40575) | 23.915854 (0.315243) | false | false | 2024-06-28 | 2024-07-04 | 0 | Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 |
| Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0 | 822eddb2fd127178d9fb7bb9f4fca0e93ada2836 | 20.459336 | | 0 | 8.03 | false | false | false | true | 0.753613 | 38.171164 (0.381712) | 30.88426 (0.508777) | 9.667674 (0.096677) | 3.691275 (0.277685) | 15.483854 (0.450271) | 24.857787 (0.32372) | false | false | 2024-07-02 | 2024-07-04 | 0 | Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0 |
| CombinHorizon_Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES | 881729709fbf263b75e0f7341b66b5a880b82d11 | 32.903047 | apache-2.0 | 2 | 14.77 | true | false | false | true | 1.665352 | 82.399589 (0.823996) | 48.19595 (0.637009) | 0 (0) | 9.955257 (0.324664) | 12.653906 (0.426031) | 44.213579 (0.497922) | true | false | 2024-12-07 | 2024-12-07 | 1 | CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge) |
| CombinHorizon_Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES | 52d6f6308eba9c3a0b9116706fbb1ddc448e6101 | 27.14669 | apache-2.0 | 1 | 7.616 | true | false | false | true | 1.045561 | 75.64019 (0.756402) | 34.95407 (0.540209) | 0 (0) | 6.375839 (0.297819) | 8.779427 (0.403302) | 37.130615 (0.434176) | true | false | 2024-10-29 | 2024-10-29 | 1 | CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES (Merge) |
| CombinHorizon_YiSM-blossom5.1-34B-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | CombinHorizon/YiSM-blossom5.1-34B-SLERP | ebd8d6507623008567a0548cd0ff9e28cbd6a656 | 31.392518 | apache-2.0 | 0 | 34.389 | true | false | false | true | 3.070814 | 50.331121 (0.503311) | 46.397613 (0.620755) | 21.601208 (0.216012) | 14.09396 (0.355705) | 14.367969 (0.441344) | | | | | | | |
0.474069
41.563239
true
false
2024-08-27
2024-08-27
1
CombinHorizon/YiSM-blossom5.1-34B-SLERP (Merge)
CombinHorizon_huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES
3284c32f13733d1cd17c723ed754f2c01b65a15c
35.750999
apache-2.0
1
32.764
true
false
false
true
13.000422
0.820624
82.062372
0.692925
56.044782
0
0
0.338926
11.856823
0.420729
12.091146
0.572058
52.450872
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES
d92237b4b4deccb92a72b5209c79978f09fe3f08
32.339826
apache-2.0
2
14.77
true
false
false
true
1.66713
0.817576
81.757625
0.633589
47.767346
0
0
0.314597
8.612975
0.426031
12.453906
0.491024
43.447104
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES
d976a5d6768d54c5e59a88fe63238a055c30c06a
37.007831
apache-2.0
2
16.382
true
false
false
true
3.683318
0.832814
83.28136
0.695517
56.827407
0
0
0.36745
15.659955
0.431396
14.224479
0.568484
52.053783
true
false
2024-12-07
2024-12-20
1
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
ContactDoctor_Bio-Medical-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ContactDoctor/Bio-Medical-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ContactDoctor/Bio-Medical-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ContactDoctor/Bio-Medical-Llama-3-8B
5436cda92c65b0ef520d278d864305c0f429824b
20.005569
other
29
4.015
true
false
false
false
0.617558
0.442237
44.22366
0.486312
26.195811
0.072508
7.250755
0.333893
11.185682
0.351396
1.757812
0.364777
29.419696
false
false
2024-08-09
2024-12-24
1
meta-llama/Meta-Llama-3-8B-Instruct
CoolSpring_Qwen2-0.5B-Abyme_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme
a48b7c04b854e5c60fe3464f96904bfc53c8640c
4.798584
apache-2.0
0
0.494
true
false
false
true
1.177797
0.191519
19.15185
0.286183
2.276484
0.017372
1.73716
0.253356
0.447427
0.354219
1.477344
0.133311
3.701241
false
false
2024-07-18
2024-09-04
1
Qwen/Qwen2-0.5B
CoolSpring_Qwen2-0.5B-Abyme-merge2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge2
02c4c601453f7ecbfab5c95bf5afa889350026ba
6.118848
apache-2.0
0
0.63
true
false
false
true
0.609695
0.202185
20.218465
0.299427
3.709041
0.021148
2.114804
0.260067
1.342282
0.368729
3.891146
0.148936
5.437352
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge2 (Merge)
CoolSpring_Qwen2-0.5B-Abyme-merge3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge3
86fed893893cc2a6240f0ea09ce2eeda1a5178cc
6.706903
apache-2.0
0
0.63
true
false
false
true
0.610171
0.238605
23.860468
0.300314
4.301149
0.024924
2.492447
0.264262
1.901566
0.350094
2.128385
0.150017
5.557402
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge3 (Merge)
Corianas_Neural-Mistral-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/Neural-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Neural-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Neural-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Neural-Mistral-7B
cde6f0126310f38b6781cc26cdb9a02416b896b9
18.200439
apache-2.0
0
7.242
true
false
false
true
0.461713
0.548924
54.892352
0.442802
22.431163
0.018882
1.888218
0.283557
4.474273
0.387271
6.208854
0.27377
19.307772
false
false
2024-03-05
2024-12-06
0
Corianas/Neural-Mistral-7B
Corianas_Quokka_2.7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/Corianas/Quokka_2.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Quokka_2.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Quokka_2.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Quokka_2.7b
d9b3274662c2ac6c6058daac90504b5a8ebcac3c
4.919721
apache-2.0
0
2.786
true
false
false
false
0.293691
0.174907
17.490702
0.305547
3.165268
0.003776
0.377644
0.255872
0.782998
0.390833
6.0875
0.114528
1.614214
false
false
2023-03-30
2024-12-05
0
Corianas/Quokka_2.7b
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/llama-3-reactor" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/llama-3-reactor</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
14.020646
apache-2.0
0
-1
true
false
false
false
0.821165
0.230012
23.001192
0.445715
21.88856
0.048338
4.833837
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
CortexLM_btlm-7b-base-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CortexLM/btlm-7b-base-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CortexLM/btlm-7b-base-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CortexLM__btlm-7b-base-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CortexLM/btlm-7b-base-v0.2
eda8b4298365a26c8981316e09427c237b11217f
8.869902
mit
1
6.885
true
false
false
false
0.711358
0.148329
14.832866
0.400641
16.193277
0.012085
1.208459
0.253356
0.447427
0.384604
5.542188
0.234957
14.995198
false
false
2024-06-13
2024-06-26
0
CortexLM/btlm-7b-base-v0.2
Cran-May_T.E-8.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/T.E-8.1
5f84709710dcce7cc05fa12473e8bb207fe25849
29.405457
cc-by-nc-sa-4.0
3
7.616
true
false
false
true
1.090633
0.707692
70.769226
0.558175
37.024377
0.067976
6.797583
0.312919
8.389262
0.450521
15.315104
0.443235
38.13719
false
false
2024-09-27
2024-09-29
1
Cran-May/T.E-8.1 (Merge)
CultriX_Qwen2.5-14B-Broca_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Broca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Broca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Broca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Broca
51204ee25a629abfd6d5e77a850b5e7a36c78462
37.723091
1
7.383
false
false
false
false
2.077001
0.560414
56.041415
0.652715
50.034412
0.345921
34.592145
0.386745
18.232662
0.476656
18.948698
0.536403
48.489214
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Broca (Merge)
CultriX_Qwen2.5-14B-Brocav3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav3
6f3fe686a79dcbcd5835ca100e194c49f493167b
38.764254
1
7.383
false
false
false
false
1.816739
0.695178
69.517768
0.645235
49.049112
0.322508
32.250755
0.35906
14.541387
0.475635
19.254427
0.531749
47.972074
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav3 (Merge)
CultriX_Qwen2.5-14B-Brocav6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav6
bd981505b6950df69216b260c3c0d86124fded7b
38.317568
0
7.383
false
false
false
false
1.791401
0.699524
69.952393
0.638884
47.819225
0.296073
29.607251
0.36745
15.659955
0.474208
18.876042
0.531915
47.990544
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav6 (Merge)
CultriX_Qwen2.5-14B-Brocav7_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav7
06acee7f6e9796081ced6201001784907c77f96f
38.522214
0
7.383
false
false
false
false
1.701349
0.672372
67.237153
0.644403
48.905361
0.318731
31.873112
0.36745
15.659955
0.479604
20.150521
0.525765
47.307181
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav7 (Merge)
CultriX_Qwen2.5-14B-Emerged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emerged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emerged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emerged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Emerged
8bf0e31b23ee22858bbde2cee44dde88963f5084
37.662617
0
7.383
false
false
false
false
1.80736
0.700024
70.002371
0.626003
45.932419
0.307402
30.740181
0.357383
14.317673
0.469094
18.470052
0.518617
46.513002
false
false
2024-12-19
2024-12-19
1
CultriX/Qwen2.5-14B-Emerged (Merge)
CultriX_Qwen2.5-14B-Emergedv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emergedv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emergedv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emergedv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Emergedv3
f4df1b9c2bf37bbfd6b2e8f2ff244c6029a5d546
34.842092
1
7.383
false
false
false
false
1.918928
0.638849
63.884936
0.619073
44.731608
0.206949
20.694864
0.360738
14.765101
0.472813
18.601563
0.51737
46.374483
false
false
2024-12-21
2024-12-21
1
CultriX/Qwen2.5-14B-Emergedv3 (Merge)
CultriX_Qwen2.5-14B-FinalMerge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-FinalMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-FinalMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-FinalMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-FinalMerge
8fd624d0d8989a312d344772814da3575423897a
28.057015
1
7.383
false
false
false
false
1.943941
0.489098
48.909782
0.571495
38.162479
0.130665
13.066465
0.354866
13.982103
0.437906
14.504948
0.457447
39.716312
false
false
2024-12-22
2024-12-23
1
CultriX/Qwen2.5-14B-FinalMerge (Merge)
CultriX_Qwen2.5-14B-MegaMerge-pt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MegaMerge-pt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MegaMerge-pt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MegaMerge-pt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-MegaMerge-pt2
20397f6cafc09c2cb74f105867cd99b3c68c71dc
36.694314
apache-2.0
2
14.766
true
false
false
false
2.250434
0.568308
56.830765
0.65777
50.907903
0.273414
27.34139
0.379195
17.225951
0.472875
18.742708
0.542055
49.117169
true
false
2024-10-24
2024-10-25
1
CultriX/Qwen2.5-14B-MegaMerge-pt2 (Merge)
CultriX_Qwen2.5-14B-MergeStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MergeStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MergeStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MergeStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-MergeStock
fa00543296f2731793dfb0aac667571ccf1abb5b
36.390259
apache-2.0
2
14.766
true
false
false
false
4.430606
0.568533
56.85326
0.657934
51.009391
0.273414
27.34139
0.373322
16.442953
0.467635
17.854427
0.539561
48.84013
true
false
2024-10-23
2024-10-24
1
CultriX/Qwen2.5-14B-MergeStock (Merge)
CultriX_Qwen2.5-14B-Unity_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Unity" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Unity</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Unity-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Unity
1d15e7941e6ceff5d6e4f293378947bee721a24d
33.666802
3
7.383
false
false
false
false
1.913689
0.673895
67.389526
0.601996
42.258617
0.153323
15.332326
0.347315
12.975391
0.467948
18.760156
0.507563
45.284796
false
false
2024-12-21
2024-12-21
1
CultriX/Qwen2.5-14B-Unity (Merge)
CultriX_Qwen2.5-14B-Wernicke_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke
622c0a58ecb0c0c679d7381a823d2ae5ac2b8ce1
36.999242
apache-2.0
4
14.77
true
false
false
false
2.222234
0.52347
52.346995
0.656836
50.642876
0.324773
32.477341
0.393456
19.127517
0.468906
18.246615
0.542387
49.154108
true
false
2024-10-21
2024-10-22
1
CultriX/Qwen2.5-14B-Wernicke (Merge)
CultriX_Qwen2.5-14B-Wernicke-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SFT
3b68dfba2cf79e4a15e8f4271f7d4b62d2ab9f26
33.524336
apache-2.0
2
14.77
true
false
false
true
1.393013
0.493744
49.374438
0.646059
49.330572
0.358006
35.800604
0.354027
13.870246
0.39
7.55
0.506981
45.220154
true
false
2024-11-16
2024-11-17
1
CultriX/Qwen2.5-14B-Wernicke-SFT (Merge)
CultriX_Qwen2.5-14B-Wernicke-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SLERP
180175561e8061be067fc349ad4491270f19976f
30.639825
0
14.491
false
false
false
true
2.155988
0.55889
55.889041
0.644093
49.372327
0.094411
9.441088
0.34396
12.527964
0.414031
11.120573
0.509392
45.487958
false
false
2024-10-25
0
Removed
CultriX_Qwen2.5-14B-Wernickev3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernickev3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernickev3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernickev3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernickev3
bd141b0df78ad1f6e2938edf167c2305b395a2b2
37.940558
1
7.383
false
false
false
false
1.915634
0.70482
70.481988
0.618415
44.576275
0.327795
32.779456
0.362416
14.988814
0.471667
18.691667
0.515126
46.125148
false
false
2024-12-19
2024-12-19
1
CultriX/Qwen2.5-14B-Wernickev3 (Merge)
CultriX_Qwenfinity-2.5-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwenfinity-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwenfinity-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwenfinity-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwenfinity-2.5-14B
6acc1308274031b045f028b0a0290cdbe4243a04
27.96652
0
7.383
false
false
false
false
1.977067
0.481379
48.137941
0.565501
37.259942
0.148792
14.879154
0.348993
13.199105
0.450583
15.45625
0.449801
38.866726
false
false
2024-12-21
2024-12-23
1
CultriX/Qwenfinity-2.5-14B (Merge)
CultriX_Qwestion-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwestion-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwestion-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwestion-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwestion-14B
e286bfafbc28e36859202c9f06ed8287a4f1d8b6
37.630294
apache-2.0
1
14.766
true
false
false
false
1.853821
0.63178
63.178034
0.64501
48.757034
0.317221
31.722054
0.368289
15.771812
0.463604
17.217188
0.542221
49.135638
true
false
2024-11-21
2024-11-23
1
CultriX/Qwestion-14B (Merge)
CultriX_SeQwence-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B
f4a147b717ba0e9392f96e343250b00239196a22
36.659687
apache-2.0
3
14.766
true
false
false
false
1.796383
0.53516
53.516004
0.650567
50.163578
0.339879
33.987915
0.360738
14.765101
0.466615
18.426823
0.541888
49.0987
false
false
2024-11-20
2024-11-20
0
CultriX/SeQwence-14B
CultriX_SeQwence-14B-EvolMerge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-EvolMerge
a98c932f0d71d76883fe9aa9d708af0506b01343
37.200413
apache-2.0
2
14.766
true
false
false
false
1.950826
0.538158
53.815764
0.657218
50.780351
0.317976
31.797583
0.380872
17.449664
0.482083
20.260417
0.541888
49.0987
true
false
2024-11-27
2024-11-27
1
CultriX/SeQwence-14B-EvolMerge (Merge)
CultriX_SeQwence-14B-EvolMergev1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMergev1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMergev1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMergev1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-EvolMergev1
6cc7116cdea757635dba52bb82a306654d118e77
36.852183
2
14.766
false
false
false
false
1.957896
0.555468
55.546838
0.654555
50.302259
0.324773
32.477341
0.376678
16.89038
0.462271
17.083854
0.539312
48.812426
false
false
2024-11-25
2024-11-27
1
CultriX/SeQwence-14B-EvolMergev1 (Merge)
CultriX_SeQwence-14B-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-v5
9f43ad41542be56f6a18f31bfa60086318735ed5
37.268663
0
14.766
false
false
false
false
1.86516
0.591988
59.198815
0.651709
49.995731
0.310423
31.042296
0.369966
15.995526
0.471417
18.327083
0.541473
49.052527
false
false
2024-11-18
0
Removed
CultriX_SeQwence-14Bv1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv1
542bfbd2e6fb25ecd11b84d956764eb23233a034
38.197632
apache-2.0
2
14.766
true
false
false
false
1.830191
0.6678
66.780033
0.634467
47.190898
0.335347
33.534743
0.361577
14.876957
0.470427
18.803385
0.531998
47.999778
true
false
2024-11-24
2024-11-27
1
CultriX/SeQwence-14Bv1 (Merge)
CultriX_SeQwence-14Bv2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv2
674c6d49b604fdf26e327e1e86c4fde0724b98e8
34.409763
0
14.766
false
false
false
false
1.974894
0.578599
57.859923
0.630451
46.529224
0.216012
21.601208
0.360738
14.765101
0.460104
17.546354
0.533411
48.156767
false
false
2024-11-27
2024-12-08
1
CultriX/SeQwence-14Bv2 (Merge)
CultriX_SeQwence-14Bv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv3
b3f2b5273bbc996814a25aa9060fd6f4c0d93bca
34.411032
2
14.766
false
false
false
false
1.965075
0.571905
57.190477
0.630225
46.385368
0.221299
22.129909
0.364933
15.324385
0.462427
17.270052
0.533494
48.166002
false
false
2024-11-27
2024-11-27
1
CultriX/SeQwence-14Bv3 (Merge)
DRXD1000_Atlas-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DRXD1000/Atlas-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DRXD1000/Atlas-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DRXD1000__Atlas-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DRXD1000/Atlas-7B
967ee983e2a0b163c12da69f1f81aaf8ffb2a456
8.509639
apache-2.0
0
7
true
false
false
true
1.256758
0.370446
37.044597
0.330218
7.540208
0.002266
0.226586
0.25755
1.006711
0.33425
0.78125
0.140126
4.458481
false
false
2024-12-10
2024-12-10
0
DRXD1000/Atlas-7B
DRXD1000_Phoenix-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DRXD1000/Phoenix-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DRXD1000/Phoenix-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DRXD1000__Phoenix-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DRXD1000/Phoenix-7B
a5caa8036d8b7819eb723debe3f037471b5c4882
12.143216
apache-2.0
17
7
true
false
false
true
0.470872
0.320962
32.096171
0.393157
15.62018
0
0
0.278523
3.803132
0.384948
6.41849
0.234292
14.921321
false
false
2024-01-10
2024-12-11
0
DRXD1000/Phoenix-7B
DUAL-GPO_zephyr-7b-ipo-0k-15k-i1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-0k-15k-i1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DUAL-GPO/zephyr-7b-ipo-0k-15k-i1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DUAL-GPO__zephyr-7b-ipo-0k-15k-i1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DUAL-GPO/zephyr-7b-ipo-0k-15k-i1
564d269c67dfcc5c07a4fbc270a6a48da1929d30
15.492948
0
14.483
false
false
false
false
0.971423
0.275624
27.562423
0.447271
22.658643
0.030211
3.021148
0.291107
5.480984
0.417344
10.567969
0.312999
23.666519
false
false
2024-09-20
2024-09-22
1
DUAL-GPO/zephyr-7b-ipo-qlora-v0-merged
DZgas_GIGABATEMAN-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DZgas/GIGABATEMAN-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DZgas/GIGABATEMAN-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DZgas__GIGABATEMAN-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DZgas/GIGABATEMAN-7B
edf2840350e7fd55895d9df560b489ac10ecb95e
20.446293
5
7.242
false
false
false
false
0.630337
0.460746
46.074638
0.503218
29.827517
0.053625
5.362538
0.28943
5.257271
0.432844
11.972135
0.317653
24.183658
false
false
2024-04-17
2024-09-15
1
DZgas/GIGABATEMAN-7B (Merge)
Dampfinchen_Llama-3.1-8B-Ultra-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Dampfinchen/Llama-3.1-8B-Ultra-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dampfinchen/Llama-3.1-8B-Ultra-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dampfinchen__Llama-3.1-8B-Ultra-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dampfinchen/Llama-3.1-8B-Ultra-Instruct
46662d14130cfd34f7d90816540794f24a301f86
29.127051
llama3
7
8.03
true
false
false
true
0.836479
0.808109
80.810915
0.525753
32.494587
0.15861
15.861027
0.291946
5.592841
0.400323
8.607031
0.382563
31.395907
true
false
2024-08-26
2024-08-26
1
Dampfinchen/Llama-3.1-8B-Ultra-Instruct (Merge)
Danielbrdz_Barcenas-14b-Phi-3-medium-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-14b-Phi-3-medium-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
b749dbcb19901b8fd0e9f38c923a24533569f895
31.738448
mit
5
13.96
true
false
false
true
1.572315
0.479906
47.990554
0.653618
51.029418
0.193353
19.335347
0.326342
10.178971
0.48075
20.527083
0.472324
41.369311
false
false
2024-06-15
2024-08-13
0
Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
Danielbrdz_Barcenas-Llama3-8b-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-Llama3-8b-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-Llama3-8b-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-Llama3-8b-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Danielbrdz/Barcenas-Llama3-8b-ORPO
66c848c4526d3db1ec41468c0f73ac4448c6abe9
26.519005
other
7
8.03
true
false
false
true
0.774159
0.737243
73.724274
0.498656
28.600623
0.06571
6.570997
0.307047
7.606264
0.418958
11.169792
0.382979
31.44208
false
false
2024-04-29
2024-06-29
0
Danielbrdz/Barcenas-Llama3-8b-ORPO
Dans-DiscountModels_Dans-Instruct-CoreCurriculum-12b-ChatML_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-CoreCurriculum-12b-ChatML-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML
56925fafe6a543e224db36864dd0927171542776
12.913452
apache-2.0
0
12.248
true
false
false
false
3.234644
0.211102
21.11021
0.479186
26.046417
0.005287
0.528701
0.280201
4.026846
0.360635
5.71276
0.280502
20.055777
false
false
2024-09-04
2024-09-04
1
mistralai/Mistral-Nemo-Base-2407
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML
029d84d4f4a618aa798490c046753b12801158e2
13.49618
0
8.03
false
false
false
false
0.798569
0.082508
8.250775
0.473817
26.336394
0.053625
5.362538
0.294463
5.928412
0.391823
9.677865
0.32879
25.421099
false
false
2024-09-14
0
Removed
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0
9367c1273b0025793531fcf3a2c15416539f5d81
12.974202
0
8.03
false
false
false
false
0.814699
0.06682
6.682048
0.477477
26.737652
0.061178
6.117825
0.286074
4.809843
0.378583
8.122917
0.328374
25.374926
false
false
2024-09-20
0
Removed
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1
a6188cd1807d0d72e55adc371ddd198d7e9aa7ae
13.311583
0
8.03
false
false
false
false
0.790589
0.091051
9.105063
0.474865
26.412551
0.057402
5.740181
0.291107
5.480984
0.38249
7.811198
0.327876
25.319518
false
false
2024-09-23
0
Removed
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0
15a9988381fdba15281f1bd6b04c34f3f96120cc
18.590919
0
8.03
false
false
false
true
0.843717
0.506409
50.640855
0.462426
24.734771
0.043807
4.380665
0.293624
5.816555
0.364448
3.75599
0.29995
22.216681
false
false
2024-09-30
0
Removed
Dans-DiscountModels_Mistral-7b-v0.3-Test-E0.7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Mistral-7b-v0.3-Test-E0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7
e91ad0ada3f0d906bacd3c0ad41da4f65ce77b08
19.144688
0
7
false
false
false
true
0.437886
0.512354
51.235389
0.475022
26.820762
0.032477
3.247734
0.296141
6.152125
0.40051
8.030469
0.274435
19.381649
false
false
2024-11-15
0
Removed
Dans-DiscountModels_mistral-7b-test-merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/mistral-7b-test-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/mistral-7b-test-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__mistral-7b-test-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/mistral-7b-test-merged
9db677cc43fb88852d952ef5914e919e65dd03eb
22.224397
apache-2.0
0
7
true
false
false
true
1.89713
0.6678
66.780033
0.489817
28.941005
0.053625
5.362538
0.294463
5.928412
0.375396
4.357813
0.297789
21.976581
false
false
2024-11-27
2024-11-30
1
Dans-DiscountModels/mistral-7b-test-merged (Merge)
Darkknight535_OpenCrystal-12B-L3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Darkknight535/OpenCrystal-12B-L3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Darkknight535/OpenCrystal-12B-L3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Darkknight535__OpenCrystal-12B-L3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Darkknight535/OpenCrystal-12B-L3
974d2d453afdde40f6a993601bbbbf9d97b43606
20.672888
14
11.52
false
false
false
false
2.012285
0.407091
40.709096
0.52226
31.844491
0.089124
8.912387
0.306208
7.494407
0.365656
5.740365
0.364029
29.336584
false
false
2024-08-25
2024-08-26
0
Darkknight535/OpenCrystal-12B-L3
DavidAU_L3-Dark-Planet-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Dark-Planet-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Dark-Planet-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Dark-Planet-8B
462c9307ba4cfcb0c1edcceac5e06f4007bc803e
20.519537
5
8.03
false
false
false
false
0.939141
0.413411
41.341086
0.508408
29.789627
0.085347
8.534743
0.300336
6.711409
0.361594
6.332552
0.37367
30.407801
false
false
2024-09-05
2024-09-12
1
DavidAU/L3-Dark-Planet-8B (Merge)
DavidAU_L3-Jamet-12.2B-MK.V-Blackroot-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Jamet-12.2B-MK.V-Blackroot-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct
db4ae3d7b608fd0e7490d2fcfa0436e56e21af33
17.857043
0
12.174
false
false
false
false
1.437522
0.3962
39.619986
0.476572
25.869793
0.040785
4.07855
0.278523
3.803132
0.401969
8.31276
0.329122
25.458038
false
false
2024-08-23
2024-09-04
1
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct (Merge)
DavidAU_L3-Lumimaid-12.2B-v0.1-OAS-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Lumimaid-12.2B-v0.1-OAS-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct
65a9e957dc4211aa3d87fdf588767823af5cde3f
17.743439
1
12.174
false
false
false
false
1.424707
0.392403
39.240327
0.469302
24.504816
0.040785
4.07855
0.276846
3.579418
0.419427
11.261719
0.314162
23.795804
false
false
2024-08-24
2024-09-12
1
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct (Merge)
DavidAU_L3-SMB-Instruct-12.2B-F32_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-SMB-Instruct-12.2B-F32" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-SMB-Instruct-12.2B-F32</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-SMB-Instruct-12.2B-F32-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-SMB-Instruct-12.2B-F32
ac5e205a41b17a7b05b1b62f352aacc7e65b2f13
18.863875
1
12.174
false
false
false
false
1.382397
0.430322
43.032155
0.478641
26.130957
0.044562
4.456193
0.281879
4.250559
0.408729
9.624479
0.3312
25.688904
false
false
2024-08-25
2024-09-12
1
DavidAU/L3-SMB-Instruct-12.2B-F32 (Merge)
DavidAU_L3-Stheno-Maid-Blackroot-Grand-HORROR-16B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B
7b626e50b6c35fcb064b8b039fcf30eae01c3fae
17.096786
0
16.537
false
false
false
false
2.922799
0.343893
34.389309
0.473633
26.692021
0.015861
1.586103
0.270973
2.796421
0.403115
8.55599
0.357048
28.560875
false
false
2024-08-23
2024-09-04
1
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B (Merge)
DavidAU_L3-Stheno-v3.2-12.2B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Stheno-v3.2-12.2B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Stheno-v3.2-12.2B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-v3.2-12.2B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Stheno-v3.2-12.2B-Instruct
8271fc32a601a4fa5efbe58c41a0ef4181ad8836
18.790033
1
12.174
false
false
false
false
1.3977
0.402795
40.279459
0.484598
27.369623
0.053625
5.362538
0.275168
3.355705
0.41025
10.314583
0.334525
26.058289
false
false
2024-08-24
2024-09-12
1
DavidAU/L3-Stheno-v3.2-12.2B-Instruct (Merge)
Deci_DeciLM-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
DeciLMForCausalLM
<a target="_blank" href="https://huggingface.co/Deci/DeciLM-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Deci/DeciLM-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Deci/DeciLM-7B
c3c9f4226801dc0433f32aebffe0aac68ee2f051
14.960537
apache-2.0
225
7.044
true
false
false
false
0.642137
0.281295
28.129474
0.442286
21.25273
0.024924
2.492447
0.295302
6.040268
0.435854
13.048438
0.269199
18.799867
false
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B
Deci_DeciLM-7B-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
DeciLMForCausalLM
<a target="_blank" href="https://huggingface.co/Deci/DeciLM-7B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Deci/DeciLM-7B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Deci/DeciLM-7B-instruct
4adc7aa9efe61b47b0a98b2cc94527d9c45c3b4f
17.457504
apache-2.0
96
7.044
true
false
false
true
0.638649
0.488024
48.8024
0.458975
23.887149
0.029456
2.945619
0.28943
5.257271
0.388417
5.985417
0.260805
17.867169
false
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B-instruct
DeepAutoAI_Explore_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.1-8B-Inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.1-8B-Inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.1-8B-Inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.1-8B-Inst
9752180fafd8f584625eb649c0cba36b91bdc3ce
28.788231
apache-2.0
0
8.03
true
false
false
true
1.750239
0.779483
77.948288
0.511742
30.393263
0.192598
19.259819
0.283557
4.474273
0.390958
9.636458
0.379156
31.017287
false
false
2024-09-21
2024-10-09
1
DeepAutoAI/Explore_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst
9fd790df246b8979c02173f7698819a7805fb04e
13.708555
apache-2.0
0
1.236
true
false
false
true
0.891846
0.564886
56.488561
0.350481
8.292274
0.063444
6.344411
0.255872
0.782998
0.318344
1.359635
0.180851
8.983452
false
false
2024-10-07
2024-10-09
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
9509dee6b01fff1a11dc26cf58d7eecbe3d9d9c4
13.182851
1
1.236
false
false
false
true
0.467189
0.559715
55.971489
0.336509
7.042772
0.049094
4.909366
0.263423
1.789709
0.310313
0.455729
0.180352
8.928044
false
false
2024-10-08
2024-10-08
0
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1
3f8b0fb6dcc1e9725ba52dd086241d5d9e413100
10.619319
apache-2.0
0
1.236
true
false
false
true
0.469966
0.499889
49.988918
0.314148
4.25778
0.01284
1.283988
0.244966
0
0.378094
5.195052
0.126912
2.990174
false
false
2024-10-08
2024-10-08
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 (Merge)