| Column | Type | Range / distinct values |
|---|---|---|
| eval_name | string | length 12 to 111 |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 2 values |
| Architecture | string | 64 values |
| Model | string | length 355 to 689 |
| fullname | string | length 4 to 102 |
| Model sha | string | length 0 to 40 |
| Average ⬆️ | float64 | 0.74 to 52.1 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0 to 6.09k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 to 187 |
| IFEval Raw | float64 | 0 to 0.9 |
| IFEval | float64 | 0 to 90 |
| BBH Raw | float64 | 0.22 to 0.83 |
| BBH | float64 | 0.25 to 76.7 |
| MATH Lvl 5 Raw | float64 | 0 to 0.71 |
| MATH Lvl 5 | float64 | 0 to 71.5 |
| GPQA Raw | float64 | 0.21 to 0.47 |
| GPQA | float64 | 0 to 29.4 |
| MUSR Raw | float64 | 0.29 to 0.6 |
| MUSR | float64 | 0 to 38.7 |
| MMLU-PRO Raw | float64 | 0.1 to 0.73 |
| MMLU-PRO | float64 | 0 to 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 values |
| Submission Date | string | 263 values |
| Generation | int64 | 0 to 10 |
| Base Model | string | length 4 to 102 |
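Each record in this table is one evaluated model. As a minimal sketch of how the table can be queried programmatically (assuming the aggregate results are published as the `open-llm-leaderboard/contents` dataset on the Hugging Face Hub; the per-model `*-details` datasets linked in each row hold the raw evaluation outputs), the Hugging Face `datasets` library can load and filter it:

```python
from datasets import load_dataset

# Assumption: the aggregate leaderboard table lives at "open-llm-leaderboard/contents".
rows = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep small (<= 8 B parameter), unflagged models; column names mirror the schema above.
small = rows.filter(lambda r: r["#Params (B)"] <= 8.0 and not r["Flagged"])

# Rank by the averaged benchmark score and print the top ten.
top = sorted(small, key=lambda r: r["Average ⬆️"], reverse=True)[:10]
for r in top:
    print(f'{r["fullname"]}: {r["Average ⬆️"]:.2f} ({r["Architecture"]})')
```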
MultivexAI_Gladiator-Mini-Exp-1222-3B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[MultivexAI/Gladiator-Mini-Exp-1222-3B-Instruct](https://huggingface.co/MultivexAI/Gladiator-Mini-Exp-1222-3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/MultivexAI__Gladiator-Mini-Exp-1222-3B-Instruct-details)
MultivexAI/Gladiator-Mini-Exp-1222-3B-Instruct
07c170bb7c6b09ddeb4d4fe0fc894b371b27cc50
20.353089
mit
0
3.213
true
false
false
true
1.211649
0.616318
61.631804
0.437318
20.567491
0.141239
14.123867
0.263423
1.789709
0.31276
1.595052
0.301695
22.410609
false
false
2024-12-22
2024-12-22
1
MultivexAI/Gladiator-Mini-Exp-1222-3B-Instruct (Merge)
MultivexAI_Phi-3.5-Mini-Instruct-MultiVex-v0.25-GGUF_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[MultivexAI/Phi-3.5-Mini-Instruct-MultiVex-v0.25-GGUF](https://huggingface.co/MultivexAI/Phi-3.5-Mini-Instruct-MultiVex-v0.25-GGUF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/MultivexAI__Phi-3.5-Mini-Instruct-MultiVex-v0.25-GGUF-details)
MultivexAI/Phi-3.5-Mini-Instruct-MultiVex-v0.25-GGUF
0bdec12abd74bc164fdfa432528b914e19f6a9aa
3.83477
0
3.821
false
false
false
true
1.012125
0.143982
14.398241
0.290775
1.602381
0.006042
0.60423
0.255034
0.671141
0.364198
4.52474
0.110871
1.20789
false
false
2024-11-13
0
Removed
Mxode_NanoLM-0.3B-Instruct-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Mxode/NanoLM-0.3B-Instruct-v1](https://huggingface.co/Mxode/NanoLM-0.3B-Instruct-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Mxode__NanoLM-0.3B-Instruct-v1-details)
Mxode/NanoLM-0.3B-Instruct-v1
638cda2c122e96c7992227b56b29967d9c8fd57e
5.737739
gpl-3.0
0
0.315
true
false
false
true
1.212552
0.153674
15.367447
0.302825
3.10461
0.01435
1.435045
0.271812
2.908277
0.415521
10.440104
0.110539
1.170952
false
false
2024-09-03
2024-09-05
0
Mxode/NanoLM-0.3B-Instruct-v1
Mxode_NanoLM-0.3B-Instruct-v1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Mxode/NanoLM-0.3B-Instruct-v1.1](https://huggingface.co/Mxode/NanoLM-0.3B-Instruct-v1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Mxode__NanoLM-0.3B-Instruct-v1.1-details)
Mxode/NanoLM-0.3B-Instruct-v1.1
7338464708c691667b193e7bb8f6b5bb3f9df27d
5.974299
gpl-3.0
2
0.315
true
false
false
true
1.21496
0.178279
17.827919
0.30144
3.09528
0.013595
1.359517
0.25
0
0.427333
12.216667
0.112118
1.34641
false
false
2024-09-05
2024-09-05
0
Mxode/NanoLM-0.3B-Instruct-v1.1
Mxode_NanoLM-0.3B-Instruct-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Mxode/NanoLM-0.3B-Instruct-v2](https://huggingface.co/Mxode/NanoLM-0.3B-Instruct-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Mxode__NanoLM-0.3B-Instruct-v2-details)
Mxode/NanoLM-0.3B-Instruct-v2
40027e2a1a404144975cfc0dd7d354057b98854b
5.013671
gpl-3.0
0
0.315
true
false
false
true
1.213042
0.166789
16.678857
0.29211
2.209481
0.006798
0.679758
0.260906
1.454139
0.395458
7.565625
0.113447
1.494164
false
false
2024-09-07
2024-09-08
0
Mxode/NanoLM-0.3B-Instruct-v2
Mxode_NanoLM-1B-Instruct-v1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Mxode/NanoLM-1B-Instruct-v1.1](https://huggingface.co/Mxode/NanoLM-1B-Instruct-v1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Mxode__NanoLM-1B-Instruct-v1.1-details)
Mxode/NanoLM-1B-Instruct-v1.1
cad6274afcfcf33927dc6c116d63013dcc1dfc48
6.756723
gpl-3.0
0
1.076
true
false
false
true
1.65096
0.239529
23.952889
0.31835
6.106919
0.036254
3.625378
0.263423
1.789709
0.343271
2.675521
0.121509
2.389923
false
false
2024-09-07
2024-09-08
0
Mxode/NanoLM-1B-Instruct-v1.1
Mxode_NanoLM-1B-Instruct-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Mxode/NanoLM-1B-Instruct-v2](https://huggingface.co/Mxode/NanoLM-1B-Instruct-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Mxode__NanoLM-1B-Instruct-v2-details)
Mxode/NanoLM-1B-Instruct-v2
ebd8c374447985dbd4e247ffe6c5ebb5b4910418
7.354415
gpl-3.0
0
1.076
true
false
false
true
1.620563
0.262984
26.298444
0.312315
4.910623
0.041541
4.154079
0.263423
1.789709
0.355208
4.334375
0.123753
2.639258
false
false
2024-09-07
2024-09-09
0
Mxode/NanoLM-1B-Instruct-v2
NAPS-ai_naps-gemma-2-27b-v-0.1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
[NAPS-ai/naps-gemma-2-27b-v-0.1.0](https://huggingface.co/NAPS-ai/naps-gemma-2-27b-v-0.1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-gemma-2-27b-v-0.1.0-details)
NAPS-ai/naps-gemma-2-27b-v-0.1.0
c75cc878c364615db4b1b173b21b97ebcfb13d70
1.679602
apache-2.0
0
27.227
true
false
false
false
22.449722
0
0
0.291178
2.347041
0
0
0.260067
1.342282
0.357531
4.52474
0.116772
1.863549
false
false
2024-11-11
2024-11-11
1
NAPS-ai/naps-gemma-2-27b-v-0.1.0 (Merge)
NAPS-ai_naps-gemma-2-27b-v0.1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
[NAPS-ai/naps-gemma-2-27b-v0.1.0](https://huggingface.co/NAPS-ai/naps-gemma-2-27b-v0.1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-gemma-2-27b-v0.1.0-details)
NAPS-ai/naps-gemma-2-27b-v0.1.0
befb5b776e052a364bad4a5b3380a4d8370572dd
1.679602
apache-2.0
0
27.227
true
false
false
false
34.013435
0
0
0.291178
2.347041
0
0
0.260067
1.342282
0.357531
4.52474
0.116772
1.863549
false
false
2024-11-11
2024-11-11
1
NAPS-ai/naps-gemma-2-27b-v0.1.0 (Merge)
NAPS-ai_naps-llama-3_1-8b-instruct-v0.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NAPS-ai/naps-llama-3_1-8b-instruct-v0.3](https://huggingface.co/NAPS-ai/naps-llama-3_1-8b-instruct-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-llama-3_1-8b-instruct-v0.3-details)
NAPS-ai/naps-llama-3_1-8b-instruct-v0.3
3dcd36be024e02de712d537f8786d868659127bb
23.278338
apache-2.0
0
8.03
true
false
false
false
1.938616
0.539082
53.908186
0.490053
26.27454
0.190332
19.033233
0.299497
6.599553
0.378708
7.205208
0.339844
26.649306
false
false
2024-09-02
2024-09-30
0
NAPS-ai/naps-llama-3_1-8b-instruct-v0.3
NAPS-ai_naps-llama-3_1-8b-instruct-v0.4_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[NAPS-ai/naps-llama-3_1-8b-instruct-v0.4](https://huggingface.co/NAPS-ai/naps-llama-3_1-8b-instruct-v0.4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-llama-3_1-8b-instruct-v0.4-details)
NAPS-ai/naps-llama-3_1-8b-instruct-v0.4
152229e8de5270aea7b9d7689503fb2577f8911a
27.715085
apache-2.0
0
8.03
true
false
false
true
1.783856
0.73442
73.442023
0.486183
27.832819
0.196375
19.637462
0.279362
3.914989
0.442115
13.964323
0.34749
27.498892
false
false
2024-09-12
2024-09-30
1
NAPS-ai/naps-llama-3_1-8b-instruct-v0.4 (Merge)
NAPS-ai_naps-llama-3_1-instruct-v0.5.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[NAPS-ai/naps-llama-3_1-instruct-v0.5.0](https://huggingface.co/NAPS-ai/naps-llama-3_1-instruct-v0.5.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-llama-3_1-instruct-v0.5.0-details)
NAPS-ai/naps-llama-3_1-instruct-v0.5.0
bf6d3578346e80c586ec1a4a9883079523b48c11
16.000823
apache-2.0
0
8.03
true
false
false
true
2.275242
0.502012
50.201244
0.414758
18.110133
0.036254
3.625378
0.268456
2.46085
0.371271
3.675521
0.261386
17.931811
false
false
2024-09-12
2024-09-30
0
NAPS-ai/naps-llama-3_1-instruct-v0.5.0
NAPS-ai_naps-llama-3_1_instruct-v0.6.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NAPS-ai/naps-llama-3_1_instruct-v0.6.0](https://huggingface.co/NAPS-ai/naps-llama-3_1_instruct-v0.6.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-llama-3_1_instruct-v0.6.0-details)
NAPS-ai/naps-llama-3_1_instruct-v0.6.0
e0ce03ea6539f9398adbe14d8f9512e5484625b4
15.779501
apache-2.0
0
8.03
true
false
false
false
1.575401
0.328006
32.800636
0.452845
22.339534
0.064199
6.41994
0.281879
4.250559
0.373906
3.971615
0.324053
24.894725
false
false
2024-10-01
2024-11-13
1
NAPS-ai/naps-llama-3_1_instruct-v0.6.0 (Merge)
NAPS-ai_naps-llama3.1-70B-v0.2-fp16_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NAPS-ai/naps-llama3.1-70B-v0.2-fp16](https://huggingface.co/NAPS-ai/naps-llama3.1-70B-v0.2-fp16) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NAPS-ai__naps-llama3.1-70B-v0.2-fp16-details)
NAPS-ai/naps-llama3.1-70B-v0.2-fp16
98040775c86bb30230a3cfb8477ca546adcd9a66
4.215465
apache-2.0
0
70.761
true
false
false
false
148.772536
0.184499
18.449935
0.304074
3.070261
0
0
0.239094
0
0.348604
2.675521
0.109874
1.097074
false
false
2024-12-10
2024-12-12
1
NAPS-ai/naps-llama3.1-70B-v0.2-fp16 (Merge)
NCSOFT_Llama-VARCO-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[NCSOFT/Llama-VARCO-8B-Instruct](https://huggingface.co/NCSOFT/Llama-VARCO-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NCSOFT__Llama-VARCO-8B-Instruct-details)
NCSOFT/Llama-VARCO-8B-Instruct
fe2d9358a2d35451c04e4589b47e361cfacd350d
20.983509
llama3.1
69
8.03
true
false
false
true
1.190005
0.447033
44.703276
0.502288
29.177057
0.106495
10.649547
0.29698
6.263982
0.384073
10.775781
0.318983
24.331413
false
false
2024-09-12
2024-12-21
1
NCSOFT/Llama-VARCO-8B-Instruct (Merge)
NJS26_NJS_777_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[NJS26/NJS_777](https://huggingface.co/NJS26/NJS_777) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NJS26__NJS_777-details)
NJS26/NJS_777
6fbf69f7058f449afa1a113e66869953f585bb7f
4.500174
0
10.362
false
false
false
false
0.9694
0.188096
18.809647
0.217821
3.160601
0
0
0.206376
0
0.353781
3.222656
0.116273
1.808141
false
false
2025-02-26
2025-02-27
1
NJS26/NJS_777 (Merge)
NLPark_AnFeng_v3.1-Avocet_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NLPark/AnFeng_v3.1-Avocet](https://huggingface.co/NLPark/AnFeng_v3.1-Avocet) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NLPark__AnFeng_v3.1-Avocet-details)
NLPark/AnFeng_v3.1-Avocet
5170739731033323e6e66a0f68d34790042a3b2a
28.390956
cc-by-nc-nd-4.0
0
34.393
true
false
false
false
6.344016
0.509631
50.963111
0.582852
40.309034
0.159366
15.936556
0.324664
9.955257
0.447573
14.979948
0.443816
38.201832
false
false
2024-08-03
2024-08-07
0
NLPark/AnFeng_v3.1-Avocet
NLPark_B-and-W_Flycatcher-3AD1E_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[NLPark/B-and-W_Flycatcher-3AD1E](https://huggingface.co/NLPark/B-and-W_Flycatcher-3AD1E) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NLPark__B-and-W_Flycatcher-3AD1E-details)
NLPark/B-and-W_Flycatcher-3AD1E
21044e39f6854f5a6df84c5074d449b7eb96b522
30.467333
apache-2.0
0
14.77
true
false
false
true
3.102645
0.490847
49.084651
0.606512
43.742458
0.237915
23.791541
0.330537
10.738255
0.442271
13.883854
0.474069
41.563239
false
false
2024-09-28
2024-09-28
0
NLPark/B-and-W_Flycatcher-3AD1E
NLPark_Shi-Ci-Robin-Test_3AD80_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[NLPark/Shi-Ci-Robin-Test_3AD80](https://huggingface.co/NLPark/Shi-Ci-Robin-Test_3AD80) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NLPark__Shi-Ci-Robin-Test_3AD80-details)
NLPark/Shi-Ci-Robin-Test_3AD80
995887837a259817570489183cbe8b1abffd23b1
39.234122
llama3.1
0
70.554
true
false
false
true
24.893509
0.722655
72.265478
0.670481
52.265662
0.31571
31.570997
0.359899
14.653244
0.469594
18.865885
0.512051
45.783466
false
false
2024-10-25
2024-10-25
1
NLPark/Shi-Ci-Robin-Test_3AD80 (Merge)
NTQAI_NxMobileLM-1.5B-SFT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[NTQAI/NxMobileLM-1.5B-SFT](https://huggingface.co/NTQAI/NxMobileLM-1.5B-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NTQAI__NxMobileLM-1.5B-SFT-details)
NTQAI/NxMobileLM-1.5B-SFT
c5095c4969a48999c99f0e34ba3db929a0b59b8b
18.734829
mit
4
1.544
true
false
false
true
1.678315
0.639224
63.922393
0.395718
16.162543
0.084592
8.459215
0.259228
1.230425
0.355521
2.440104
0.281749
20.194297
false
false
2025-01-15
2025-01-15
1
NTQAI/NxMobileLM-1.5B-SFT (Merge)
NTQAI_Nxcode-CQ-7B-orpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
[NTQAI/Nxcode-CQ-7B-orpo](https://huggingface.co/NTQAI/Nxcode-CQ-7B-orpo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NTQAI__Nxcode-CQ-7B-orpo-details)
NTQAI/Nxcode-CQ-7B-orpo
74f3b3c06de36b261af9ef857279d6e33f893336
12.37378
other
124
7.25
true
false
false
true
1.68435
0.400721
40.07212
0.414302
17.580005
0.021903
2.190332
0.254195
0.559284
0.393969
7.046094
0.161154
6.794843
false
false
2024-04-24
2024-08-10
0
NTQAI/Nxcode-CQ-7B-orpo
NYTK_PULI-GPTrio_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
[NYTK/PULI-GPTrio](https://huggingface.co/NYTK/PULI-GPTrio) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NYTK__PULI-GPTrio-details)
NYTK/PULI-GPTrio
16a56dd22d184e4b7b49d90461fa8d4810639463
5.833728
cc-by-nc-4.0
10
7.673
true
false
false
false
1.444094
0.217972
21.797165
0.306003
3.015221
0.012085
1.208459
0.26594
2.12528
0.381875
5.334375
0.113697
1.521868
false
false
2023-06-08
2024-08-24
0
NYTK/PULI-GPTrio
NYTK_PULI-LlumiX-32K_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
[NYTK/PULI-LlumiX-32K](https://huggingface.co/NYTK/PULI-LlumiX-32K) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NYTK__PULI-LlumiX-32K-details)
NYTK/PULI-LlumiX-32K
a589894397a36b61c578d0dd4778ee6e5fe471ff
6.519109
llama2
11
6.738
true
false
false
false
1.645139
0.169961
16.996126
0.318936
5.107047
0.01284
1.283988
0.253356
0.447427
0.396417
7.71875
0.168052
7.561318
false
false
2024-03-12
2024-08-24
0
NYTK/PULI-LlumiX-32K
Naveenpoliasetty_llama3-8B-V2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Naveenpoliasetty/llama3-8B-V2](https://huggingface.co/Naveenpoliasetty/llama3-8B-V2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Naveenpoliasetty__llama3-8B-V2-details)
Naveenpoliasetty/llama3-8B-V2
e0458381d02bc411b9e576796d185f23dcc11f71
20.820687
1
8.03
false
false
false
false
1.504024
0.412262
41.226169
0.518866
30.873209
0.07855
7.854985
0.290268
5.369128
0.408135
9.183594
0.373753
30.417036
false
false
2024-06-18
2024-06-26
1
Naveenpoliasetty/llama3-8B-V2 (Merge)
NbAiLab_nb-llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NbAiLab/nb-llama-3.1-8B-Instruct](https://huggingface.co/NbAiLab/nb-llama-3.1-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NbAiLab__nb-llama-3.1-8B-Instruct-details)
NbAiLab/nb-llama-3.1-8B-Instruct
e56aaceb823e1b0d29029c8a9e4bc090a07d81c4
8.479797
0
8.03
false
false
false
true
1.495082
0.362503
36.25026
0.324666
5.448859
0.022659
2.265861
0.27349
3.131991
0.32076
1.595052
0.119681
2.186761
false
false
2024-12-10
0
Removed
NbAiLab_nb-llama-3.1-8B-sft_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NbAiLab/nb-llama-3.1-8B-sft](https://huggingface.co/NbAiLab/nb-llama-3.1-8B-sft) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NbAiLab__nb-llama-3.1-8B-sft-details)
NbAiLab/nb-llama-3.1-8B-sft
4afbe8f228a7c10155e6687bd337499726db0604
8.180261
llama3.1
0
8.03
true
false
false
true
1.47617
0.361578
36.157839
0.328151
5.952498
0.021903
2.190332
0.254195
0.559284
0.328729
1.757812
0.122174
2.4638
false
false
2024-11-25
2024-12-11
0
NbAiLab/nb-llama-3.1-8B-sft
Nekochu_Llama-3.1-8B-German-ORPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Nekochu/Llama-3.1-8B-German-ORPO](https://huggingface.co/Nekochu/Llama-3.1-8B-German-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nekochu__Llama-3.1-8B-German-ORPO-details)
Nekochu/Llama-3.1-8B-German-ORPO
463ea77e46fb6d69c86f23df21b0ab0a0b9e77cd
23.254052
llama3.1
1
8.03
true
false
false
false
1.888395
0.461071
46.107107
0.498258
29.419254
0.117069
11.706949
0.316275
8.836689
0.46475
16.860417
0.339345
26.593898
false
false
2024-09-13
2024-09-24
2
meta-llama/Meta-Llama-3.1-8B
Nekochu_Llama-3.1-8B-french-DPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Nekochu/Llama-3.1-8B-french-DPO](https://huggingface.co/Nekochu/Llama-3.1-8B-french-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nekochu__Llama-3.1-8B-french-DPO-details)
Nekochu/Llama-3.1-8B-french-DPO
b0c66dd2a2814a6bfb05313ffec856fd4c6c7bd7
21.701292
apache-2.0
1
8.03
true
false
false
false
1.642415
0.465642
46.564227
0.511089
30.032597
0.097432
9.743202
0.291107
5.480984
0.421563
11.561979
0.341423
26.824764
false
false
2024-08-12
2024-10-12
1
NousResearch/Meta-Llama-3.1-8B-Instruct
Nekochu_Luminia-13B-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Nekochu/Luminia-13B-v3](https://huggingface.co/Nekochu/Luminia-13B-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nekochu__Luminia-13B-v3-details)
Nekochu/Luminia-13B-v3
602563f3af32b3c6be067ad522e6f3eaff4f8627
11.635077
apache-2.0
6
13.016
true
false
false
false
2.277759
0.252318
25.231829
0.411215
17.690524
0.018127
1.812689
0.270134
2.684564
0.398333
8.891667
0.221493
13.499187
false
false
2024-03-18
2024-09-25
1
meta-llama/Llama-2-13b-chat-hf
Nekochu_Luminia-8B-RP_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Nekochu/Luminia-8B-RP](https://huggingface.co/Nekochu/Luminia-8B-RP) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nekochu__Luminia-8B-RP-details)
Nekochu/Luminia-8B-RP
619be17206729d86b898b9d1b3369a7135c1a9b9
24.618093
apache-2.0
4
8.03
true
false
false
false
1.905195
0.557417
55.741654
0.521815
31.802699
0.135952
13.595166
0.29698
6.263982
0.39976
11.070052
0.363115
29.235003
false
false
2024-09-13
2024-09-24
2
meta-llama/Meta-Llama-3.1-8B
NeverSleep_Lumimaid-v0.2-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[NeverSleep/Lumimaid-v0.2-12B](https://huggingface.co/NeverSleep/Lumimaid-v0.2-12B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NeverSleep__Lumimaid-v0.2-12B-details)
NeverSleep/Lumimaid-v0.2-12B
b04f4e8f9a0c64fbb271d1135b208c90c3aa0ad0
18.147314
cc-by-nc-4.0
94
12.248
true
false
false
false
3.128387
0.109935
10.993497
0.539561
34.409889
0.056647
5.664653
0.314597
8.612975
0.482115
21.297656
0.351147
27.905216
false
false
2024-07-25
2024-07-31
0
NeverSleep/Lumimaid-v0.2-12B
NeverSleep_Lumimaid-v0.2-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[NeverSleep/Lumimaid-v0.2-8B](https://huggingface.co/NeverSleep/Lumimaid-v0.2-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NeverSleep__Lumimaid-v0.2-8B-details)
NeverSleep/Lumimaid-v0.2-8B
4563201f29ef18c62d16e9f6fffd3931a63ccb51
24.411997
cc-by-nc-4.0
71
8.03
true
false
false
false
1.479391
0.503811
50.3811
0.523777
31.963374
0.143505
14.350453
0.311242
8.165548
0.430302
12.321094
0.363614
29.290411
false
false
2024-07-24
2024-08-09
0
NeverSleep/Lumimaid-v0.2-8B
Nexesenex_Dolphin3.0-Llama3.1-1B-abliterated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Dolphin3.0-Llama3.1-1B-abliterated](https://huggingface.co/Nexesenex/Dolphin3.0-Llama3.1-1B-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Dolphin3.0-Llama3.1-1B-abliterated-details)
Nexesenex/Dolphin3.0-Llama3.1-1B-abliterated
3ab9a1cebae25ff08ad915328466b75b5dc8f860
11.378392
llama3.1
0
1.236
true
false
false
true
0.345112
0.531188
53.118836
0.324079
6.862079
0.03852
3.851964
0.240772
0
0.323677
0.292969
0.137301
4.144504
false
false
2025-03-01
2025-03-05
1
Nexesenex/Dolphin3.0-Llama3.1-1B-abliterated (Merge)
Nexesenex_Llama_3.1_8b_DeepDive_3_Prev_v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DeepDive_3_Prev_v1.0](https://huggingface.co/Nexesenex/Llama_3.1_8b_DeepDive_3_Prev_v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DeepDive_3_Prev_v1.0-details)
Nexesenex/Llama_3.1_8b_DeepDive_3_Prev_v1.0
7ac196a06fdc9e86b08a13cde4c76a38913dd647
26.648391
llama3.1
3
8.03
true
false
false
true
0.685187
0.680914
68.091442
0.51551
31.122747
0.186556
18.655589
0.291107
5.480984
0.366583
9.45625
0.34375
27.083333
true
false
2025-02-17
2025-02-28
1
Nexesenex/Llama_3.1_8b_DeepDive_3_Prev_v1.0 (Merge)
Nexesenex_Llama_3.1_8b_DeepDive_3_R1_Prev_v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DeepDive_3_R1_Prev_v1.0](https://huggingface.co/Nexesenex/Llama_3.1_8b_DeepDive_3_R1_Prev_v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DeepDive_3_R1_Prev_v1.0-details)
Nexesenex/Llama_3.1_8b_DeepDive_3_R1_Prev_v1.0
c1a135395817932a87ae985fc7413078e52d0470
27.472784
llama3.1
3
8.03
true
false
false
true
1.383678
0.71009
71.009034
0.512036
30.732784
0.192598
19.259819
0.300336
6.711409
0.37576
10.003385
0.344082
27.120272
true
false
2025-02-17
2025-02-27
1
Nexesenex/Llama_3.1_8b_DeepDive_3_R1_Prev_v1.0 (Merge)
Nexesenex_Llama_3.1_8b_DobHerWild_R1_v1.1R_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DobHerWild_R1_v1.1R](https://huggingface.co/Nexesenex/Llama_3.1_8b_DobHerWild_R1_v1.1R) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DobHerWild_R1_v1.1R-details)
Nexesenex/Llama_3.1_8b_DobHerWild_R1_v1.1R
0f5f7b67439f16e1a82e7fdcd1f9cc771eb97e5e
29.728398
llama3.1
3
8.03
true
false
false
true
0.697469
0.759999
75.999902
0.525696
32.82575
0.231873
23.187311
0.299497
6.599553
0.385219
9.885677
0.36885
29.872193
true
false
2025-02-18
2025-02-27
1
Nexesenex/Llama_3.1_8b_DobHerWild_R1_v1.1R (Merge)
Nexesenex_Llama_3.1_8b_DoberWild_v2.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DoberWild_v2.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_DoberWild_v2.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DoberWild_v2.01-details)
Nexesenex/Llama_3.1_8b_DoberWild_v2.01
560554a7ce692b74c08aa0f69f60b9edaa67e136
30.164382
llama3.1
3
8.031
true
false
false
true
0.65688
0.799566
79.956626
0.525077
32.377759
0.200151
20.015106
0.302852
7.04698
0.401188
10.581771
0.379072
31.008053
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_DoberWild_v2.01 (Merge)
Nexesenex_Llama_3.1_8b_DoberWild_v2.02_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DoberWild_v2.02](https://huggingface.co/Nexesenex/Llama_3.1_8b_DoberWild_v2.02) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DoberWild_v2.02-details)
Nexesenex/Llama_3.1_8b_DoberWild_v2.02
fbc865067c1ef712763598fd8ee46aa6a4932d72
29.47563
llama3.1
2
8.03
true
false
false
true
0.684438
0.774637
77.463685
0.531274
33.353313
0.199396
19.939577
0.294463
5.928412
0.394583
9.45625
0.376413
30.712544
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_DoberWild_v2.02 (Merge)
Nexesenex_Llama_3.1_8b_DoberWild_v2.03_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DoberWild_v2.03](https://huggingface.co/Nexesenex/Llama_3.1_8b_DoberWild_v2.03) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DoberWild_v2.03-details)
Nexesenex/Llama_3.1_8b_DoberWild_v2.03
c90f034cc7badd745115f53064b029fce64c3c16
29.82477
llama3.1
4
8.03
true
false
false
true
0.669487
0.776435
77.643541
0.529443
33.032829
0.207704
20.770393
0.30453
7.270694
0.390583
9.989583
0.372174
30.241578
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_DoberWild_v2.03 (Merge)
Nexesenex_Llama_3.1_8b_DodoWild_v2.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DodoWild_v2.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_DodoWild_v2.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DodoWild_v2.01-details)
Nexesenex/Llama_3.1_8b_DodoWild_v2.01
5b6413555d32172bc6372075f0a31fbf7895723b
30.309649
llama3.1
2
8.031
true
false
false
true
0.69176
0.797768
79.77677
0.525276
32.110872
0.19864
19.864048
0.303691
7.158837
0.408969
12.521094
0.373836
30.426271
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_DodoWild_v2.01 (Merge)
Nexesenex_Llama_3.1_8b_DodoWild_v2.02_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DodoWild_v2.02](https://huggingface.co/Nexesenex/Llama_3.1_8b_DodoWild_v2.02) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DodoWild_v2.02-details)
Nexesenex/Llama_3.1_8b_DodoWild_v2.02
30.733559
llama3.1
2
8.03
true
false
false
true
0.686035
0.80169
80.168952
0.526174
32.319153
0.227341
22.734139
0.30453
7.270694
0.397062
11.232812
0.37608
30.675606
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_DodoWild_v2.02 (Merge)
Nexesenex_Llama_3.1_8b_DodoWild_v2.03_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DodoWild_v2.03](https://huggingface.co/Nexesenex/Llama_3.1_8b_DodoWild_v2.03) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DodoWild_v2.03-details)
Nexesenex/Llama_3.1_8b_DodoWild_v2.03
68b895fd2242abe1115c2bc4a13fdb7bb70b1811
30.60439
llama3.1
4
8.03
true
false
false
true
0.684583
0.794121
79.412071
0.530825
33.02296
0.222054
22.205438
0.307886
7.718121
0.395854
10.315104
0.378574
30.952645
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_DodoWild_v2.03 (Merge)
Nexesenex_Llama_3.1_8b_DodoWild_v2.10_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_DodoWild_v2.10](https://huggingface.co/Nexesenex/Llama_3.1_8b_DodoWild_v2.10) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DodoWild_v2.10-details)
Nexesenex/Llama_3.1_8b_DodoWild_v2.10
143314ac0b5bb2bd2bb6c05c3c167e534e291a24
30.406547
llama3.1
1
8.03
true
false
false
true
0.66328
0.805386
80.538637
0.527836
32.758078
0.19713
19.712991
0.296141
6.152125
0.415667
11.558333
0.385472
31.719119
true
false
2025-02-28
2025-03-01
1
Nexesenex/Llama_3.1_8b_DodoWild_v2.10 (Merge)
Nexesenex_Llama_3.1_8b_Dolermed_R1_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Dolermed_R1_V1.01-details)
Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.01
34f6ec6d3fa767d2dbaed76ede30d3bd4d09617c
29.312043
llama3.1
2
8.03
true
false
false
true
0.711628
0.753354
75.335443
0.531239
33.287932
0.201662
20.166163
0.305369
7.38255
0.374708
9.338542
0.373255
30.361628
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.01 (Merge)
Nexesenex_Llama_3.1_8b_Dolermed_R1_V1.03_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.03](https://huggingface.co/Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.03) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Dolermed_R1_V1.03-details)
Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.03
a63fd02d6595cb821cdb2f934d12e90a8da6016a
29.789323
llama3.1
2
8.03
true
false
false
true
0.686862
0.756402
75.64019
0.531645
33.285577
0.209215
20.92145
0.317953
9.060403
0.380042
9.605208
0.372008
30.223109
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Dolermed_R1_V1.03 (Merge)
Nexesenex_Llama_3.1_8b_Dolermed_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Dolermed_V1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Dolermed_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Dolermed_V1.01-details)
Nexesenex/Llama_3.1_8b_Dolermed_V1.01
3bbe50712a23b6d8711a1e628fcea1bc40178982
23.452683
llama3.1
4
8.031
true
false
false
true
0.747654
0.508657
50.865703
0.519362
31.705803
0.134441
13.444109
0.294463
5.928412
0.39449
10.211198
0.357048
28.560875
true
false
2025-02-27
2025-02-28
1
Nexesenex/Llama_3.1_8b_Dolermed_V1.01 (Merge)
Nexesenex_Llama_3.1_8b_Dolerstormed_V1.04_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Dolerstormed_V1.04](https://huggingface.co/Nexesenex/Llama_3.1_8b_Dolerstormed_V1.04) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Dolerstormed_V1.04-details)
Nexesenex/Llama_3.1_8b_Dolerstormed_V1.04
a80370b825f9f4fa00daecc44433035f3f48aa2a
30.113033
llama3.1
0
8.03
true
false
false
true
0.657283
0.7889
78.890012
0.519518
31.641152
0.192598
19.259819
0.322148
9.619687
0.402958
9.169792
0.38888
32.097739
true
false
2025-02-28
2025-03-01
1
Nexesenex/Llama_3.1_8b_Dolerstormed_V1.04 (Merge)
Nexesenex_Llama_3.1_8b_Hermedash_R1_V1.04_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04](https://huggingface.co/Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Hermedash_R1_V1.04-details)
Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04
8625849c98769300f69b6b7695e02482c7f4f0b3
30.27777
llama3.1
1
8.03
true
false
false
true
0.669668
0.787151
78.715142
0.519164
31.658972
0.186556
18.655589
0.322987
9.731544
0.411052
10.88151
0.388215
32.023862
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04 (Merge)
Nexesenex_Llama_3.1_8b_Hermedive_R1_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Hermedive_R1_V1.01-details)
Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.01
fce000e697ca714141a0962e5e50ab2a1d58f680
23.809187
llama3.1
3
8.03
true
false
false
true
0.755606
0.500114
50.011414
0.517086
31.129962
0.177492
17.749245
0.282718
4.362416
0.400844
12.638802
0.34267
26.963283
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.01 (Merge)
Nexesenex_Llama_3.1_8b_Hermedive_R1_V1.03_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.03](https://huggingface.co/Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.03) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Hermedive_R1_V1.03-details)
Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.03
4859975e28c66fe2461474883a22fc4ac6d7561d
26.501633
llama3.1
2
8.03
true
false
false
true
0.691499
0.664753
66.475286
0.514079
30.801238
0.185801
18.58006
0.297819
6.375839
0.361313
9.130729
0.34882
27.646646
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Hermedive_R1_V1.03 (Merge)
Nexesenex_Llama_3.1_8b_Hermedive_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Hermedive_V1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Hermedive_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Hermedive_V1.01-details)
Nexesenex/Llama_3.1_8b_Hermedive_V1.01
9037f457eff988502bca12f4f69a7a7dc0f4011e
22.788859
llama3.1
3
8.031
true
false
false
true
0.70534
0.506159
50.615921
0.49182
27.64843
0.164653
16.465257
0.28943
5.257271
0.369656
8.407031
0.355053
28.339243
true
false
2025-02-27
2025-02-28
1
Nexesenex/Llama_3.1_8b_Hermedive_V1.01 (Merge)
Nexesenex_Llama_3.1_8b_Mediver_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Mediver_V1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Mediver_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Mediver_V1.01-details)
Nexesenex/Llama_3.1_8b_Mediver_V1.01
458b223cf66a9057d048d3e1bca609c1ad952ec3
11.984272
llama3.1
0
8.031
true
false
false
true
0.747914
0.188471
18.847103
0.441483
20.441503
0.001511
0.151057
0.277685
3.691275
0.389781
6.622656
0.299368
22.152039
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_Mediver_V1.01 (Merge)
Nexesenex_Llama_3.1_8b_Medusa_v1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Medusa_v1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Medusa_v1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Medusa_v1.01-details)
Nexesenex/Llama_3.1_8b_Medusa_v1.01
c133aae6cba4c86a8aabd686d6f137c34fdf67f0
27.381683
llama3.1
3
8.031
true
false
false
true
0.650869
0.768542
76.854191
0.501773
30.029014
0.146526
14.652568
0.291946
5.592841
0.406677
9.034635
0.353142
28.126847
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_Medusa_v1.01 (Merge)
Nexesenex_Llama_3.1_8b_Smarteaz_0.2_R1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Smarteaz_0.2_R1](https://huggingface.co/Nexesenex/Llama_3.1_8b_Smarteaz_0.2_R1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Smarteaz_0.2_R1-details)
Nexesenex/Llama_3.1_8b_Smarteaz_0.2_R1
c67ac7a09e51cff541126912a29d7b2985d210ce
28.105108
llama3.1
3
8.03
true
false
false
true
0.731015
0.634553
63.455299
0.51125
30.697617
0.260574
26.057402
0.300336
6.711409
0.418802
12.316927
0.364528
29.391992
true
false
2025-02-17
2025-02-27
1
Nexesenex/Llama_3.1_8b_Smarteaz_0.2_R1 (Merge)
Nexesenex_Llama_3.1_8b_Smarteaz_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Smarteaz_V1.01](https://huggingface.co/Nexesenex/Llama_3.1_8b_Smarteaz_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Smarteaz_V1.01-details)
Nexesenex/Llama_3.1_8b_Smarteaz_V1.01
195c80b1bf8431108d0f7bb87c3e84277793f437
30.623634
llama3.1
3
8.03
true
false
false
true
0.699303
0.815128
81.51283
0.524127
32.275457
0.234139
23.413897
0.309564
7.941834
0.378927
8.199219
0.373587
30.398567
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_Smarteaz_V1.01 (Merge)
Nexesenex_Llama_3.1_8b_Stormeder_v1.04_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Stormeder_v1.04](https://huggingface.co/Nexesenex/Llama_3.1_8b_Stormeder_v1.04) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Stormeder_v1.04-details)
Nexesenex/Llama_3.1_8b_Stormeder_v1.04
4c0c354283e2fc130f5fb09c9d2ca23a0e5ea0d7
29.709952
llama3.1
1
8.03
true
false
false
true
0.654617
0.785253
78.525313
0.520709
31.780499
0.185045
18.504532
0.32047
9.395973
0.394896
8.361979
0.385223
31.691415
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Stormeder_v1.04 (Merge)
Nexesenex_Llama_3.1_8b_Typhoon_v1.03_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.1_8b_Typhoon_v1.03](https://huggingface.co/Nexesenex/Llama_3.1_8b_Typhoon_v1.03) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Typhoon_v1.03-details)
Nexesenex/Llama_3.1_8b_Typhoon_v1.03
67b0226762ae10e3600657831bb0f5e144057036
30.634802
llama3.1
1
8.03
true
false
false
true
0.686383
0.807834
80.783432
0.531397
33.32078
0.227341
22.734139
0.307047
7.606264
0.381469
7.783594
0.384225
31.5806
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Typhoon_v1.03 (Merge)
Nexesenex_Llama_3.2_1b_AquaSyn_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_AquaSyn_0.1](https://huggingface.co/Nexesenex/Llama_3.2_1b_AquaSyn_0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_AquaSyn_0.1-details)
Nexesenex/Llama_3.2_1b_AquaSyn_0.1
10be340c512c912bb0c80bb528534cf7ceab5d3c
6.988906
llama3.2
0
1.498
true
false
false
false
0.372092
0.2741
27.41005
0.328436
6.212571
0.021903
2.190332
0.248322
0
0.346031
1.920573
0.137799
4.199911
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_AquaSyn_0.1 (Merge)
Nexesenex_Llama_3.2_1b_AquaSyn_0.11_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_AquaSyn_0.11](https://huggingface.co/Nexesenex/Llama_3.2_1b_AquaSyn_0.11) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_AquaSyn_0.11-details)
Nexesenex/Llama_3.2_1b_AquaSyn_0.11
450eb37685485fe0cbed86f7bfe1bec224945676
5.867027
0
1.498
false
false
false
true
0.393395
0.243126
24.312602
0.311196
3.648692
0.023414
2.34139
0.265101
2.013423
0.33676
1.595052
0.111619
1.291002
false
false
2025-02-26
2025-02-26
1
Nexesenex/Llama_3.2_1b_AquaSyn_0.11 (Merge)
Nexesenex_Llama_3.2_1b_Dolto_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_Dolto_0.1](https://huggingface.co/Nexesenex/Llama_3.2_1b_Dolto_0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Dolto_0.1-details)
Nexesenex/Llama_3.2_1b_Dolto_0.1
36b009df0e265a4b67a581dbec03c2ea5f6a317d
11.865611
llama3.2
1
1.498
true
false
false
true
0.369569
0.543378
54.337824
0.335006
6.613057
0.037009
3.700906
0.237416
0
0.342125
2.498958
0.136386
4.042923
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_Dolto_0.1 (Merge)
Nexesenex_Llama_3.2_1b_Odyssea_V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_Odyssea_V1](https://huggingface.co/Nexesenex/Llama_3.2_1b_Odyssea_V1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Odyssea_V1-details)
Nexesenex/Llama_3.2_1b_Odyssea_V1
bd402805c92e2b707afa506950e4057226104021
5.724136
llama3.2
0
1.498
true
false
false
true
0.371238
0.255266
25.526603
0.300972
2.646703
0.01435
1.435045
0.258389
1.118568
0.339365
1.920573
0.115276
1.697326
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_Odyssea_V1 (Merge)
Nexesenex_Llama_3.2_1b_Odyssea_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_Odyssea_V1.01](https://huggingface.co/Nexesenex/Llama_3.2_1b_Odyssea_V1.01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Odyssea_V1.01-details)
Nexesenex/Llama_3.2_1b_Odyssea_V1.01
802bdf9518add0247fa3abfd0d2039c4299c36f7
5.655298
0
1.498
false
false
false
true
0.388802
0.249546
24.954565
0.304465
2.848404
0.017372
1.73716
0.255872
0.782998
0.342031
1.920573
0.115193
1.688091
false
false
2025-02-26
2025-02-26
1
Nexesenex/Llama_3.2_1b_Odyssea_V1.01 (Merge)
Nexesenex_Llama_3.2_1b_OpenTree_R1_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1](https://huggingface.co/Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_OpenTree_R1_0.1-details)
Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1
84f5694efc7f14c835b07b961b924bf74033b841
12.343256
llama3.2
1
1.498
true
false
false
true
0.366972
0.536634
53.663391
0.327952
6.20559
0.047583
4.758308
0.252517
0.33557
0.313073
1.6
0.16747
7.496676
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1 (Merge)
Nexesenex_Llama_3.2_1b_OrcaSun_V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[Nexesenex/Llama_3.2_1b_OrcaSun_V1](https://huggingface.co/Nexesenex/Llama_3.2_1b_OrcaSun_V1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_OrcaSun_V1-details)
Nexesenex/Llama_3.2_1b_OrcaSun_V1
e27d69f429dacb17ec280ea3af055f0a951a77d0
14.801538
llama3.2
0
1.498
true
false
false
true
0.364958
0.594861
59.486053
0.355031
9.790398
0.059668
5.966767
0.236577
0
0.338031
3.520573
0.190409
10.045434
true
false
2025-03-06
2025-03-06
1
Nexesenex/Llama_3.2_1b_OrcaSun_V1 (Merge)
Nexesenex_Llama_3.2_1b_RandomLego_RP_R1_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_RandomLego_RP_R1_0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_RandomLego_RP_R1_0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_RandomLego_RP_R1_0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_RandomLego_RP_R1_0.1
b9f06764ed2b70d955cb320bbeb574b85a11ba6e
12.811633
llama3.2
2
1.498
true
false
false
true
0.358408
0.554269
55.426934
0.342771
7.937728
0.056647
5.664653
0.25
0
0.324917
1.58125
0.156333
6.259235
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_RandomLego_RP_R1_0.1 (Merge)
Nexesenex_Llama_3.2_1b_SunOrca_V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_SunOrca_V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_SunOrca_V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_SunOrca_V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_SunOrca_V1
d4113f570bf95a718f508728fc59f6e99080d5a7
14.008724
llama3.2
0
1.498
true
false
false
true
0.3575
0.542954
54.295381
0.343064
7.852676
0.067221
6.722054
0.274329
3.243848
0.32625
2.114583
0.188414
9.823803
true
false
2025-03-06
2025-03-06
1
Nexesenex/Llama_3.2_1b_SunOrca_V1 (Merge)
Nexesenex_Llama_3.2_1b_Sydonia_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_Sydonia_0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_Sydonia_0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Sydonia_0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_Sydonia_0.1
3a4d831813bcbde4e0027e37a986cec773802aa5
5.524506
llama3.2
0
1.498
true
false
false
false
0.370707
0.21967
21.967047
0.312109
4.742439
0.020393
2.039275
0.228188
0
0.338188
1.906771
0.122424
2.491504
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_Sydonia_0.1 (Merge)
Nexesenex_Llama_3.2_1b_Syneridol_0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_Syneridol_0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_Syneridol_0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Syneridol_0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_Syneridol_0.2
1b09dee1571a9e973e82d1fef4e33e469dc281d6
5.412093
llama3.2
0
1.498
true
false
false
false
0.378186
0.215749
21.574866
0.313885
4.769663
0.021903
2.190332
0.234899
0
0.334281
1.41849
0.122673
2.519208
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_Syneridol_0.2 (Merge)
Nexesenex_Llama_3.2_1b_Synopsys_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_Synopsys_0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_Synopsys_0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Synopsys_0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_Synopsys_0.1
9b47c5de30189f552a093684e916f945ea8b49fe
4.959889
llama3.2
0
1.498
true
false
false
false
0.377302
0.176381
17.638089
0.316194
4.965846
0.016616
1.661631
0.239094
0
0.346094
2.928385
0.123088
2.565381
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_Synopsys_0.1 (Merge)
Nexesenex_Llama_3.2_1b_Synopsys_0.11_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_Synopsys_0.11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_Synopsys_0.11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Synopsys_0.11-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_Synopsys_0.11
c0b74cb7b633876de47a3033fe2ef23c3f9853ca
6.557109
0
1.498
false
false
false
true
0.399945
0.284217
28.421699
0.310197
3.444234
0.01284
1.283988
0.262584
1.677852
0.351333
3.15
0.112284
1.364879
false
false
2025-02-26
2025-02-26
1
Nexesenex/Llama_3.2_1b_Synopsys_0.11 (Merge)
Nexesenex_Llama_3.2_3b_Kermes_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_3b_Kermes_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_3b_Kermes_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_3b_Kermes_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_3b_Kermes_v1
0a8ba971a51571175580b19cb9696169fe837807
17.069896
llama3.2
2
3.213
true
false
false
true
0.585646
0.485176
48.5176
0.440991
21.169131
0.030967
3.096677
0.27349
3.131991
0.407021
9.310938
0.254737
17.193041
true
false
2025-02-11
2025-02-13
1
Nexesenex/Llama_3.2_3b_Kermes_v1 (Merge)
Nexesenex_Llama_3.2_3b_Kermes_v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_3b_Kermes_v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_3b_Kermes_v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_3b_Kermes_v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_3b_Kermes_v2
722ae4841fa96ad40b98d6d95931b592a8cc256e
18.494972
llama3.2
2
3.213
true
false
false
true
0.593094
0.575377
57.537667
0.445545
22.049945
0.054381
5.438066
0.265101
2.013423
0.377812
4.659896
0.273438
19.270833
true
false
2025-02-12
2025-02-12
1
Nexesenex/Llama_3.2_3b_Kermes_v2 (Merge)
Nexesenex_Llama_3.2_3b_Kermes_v2.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_3b_Kermes_v2.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_3b_Kermes_v2.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_3b_Kermes_v2.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_3b_Kermes_v2.1
e1234eecf6f6de09726f04c884eb144f0fd53c66
18.907118
llama3.2
2
3.213
true
false
false
true
0.576267
0.558391
55.839063
0.44639
22.16637
0.052115
5.21148
0.279362
3.914989
0.396354
7.510938
0.269199
18.799867
true
false
2025-02-13
2025-02-13
1
Nexesenex/Llama_3.2_3b_Kermes_v2.1 (Merge)
Nexesenex_Nemotron_W_4b_Halo_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Nemotron_W_4b_Halo_0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Nemotron_W_4b_Halo_0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Nemotron_W_4b_Halo_0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Nemotron_W_4b_Halo_0.1
762d22637e0b1523dfcd53c001cb4613312ee658
15.094246
3
4.513
false
false
false
true
0.71898
0.362728
36.272756
0.41351
18.550389
0.042296
4.229607
0.280201
4.026846
0.41651
10.763802
0.250499
16.722074
false
false
2025-02-16
2025-02-21
1
Nexesenex/Nemotron_W_4b_Halo_0.1 (Merge)
Nexesenex_Nemotron_W_4b_MagLight_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Nemotron_W_4b_MagLight_0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Nemotron_W_4b_MagLight_0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Nemotron_W_4b_MagLight_0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Nemotron_W_4b_MagLight_0.1
7cef3440b378aa345e3db26db8ee69e8ece7fd8a
16.194617
3
4.513
false
false
false
true
0.664536
0.423028
42.302757
0.423141
19.287938
0.04003
4.003021
0.283557
4.474273
0.411208
9.934375
0.254488
17.165337
false
false
2025-02-16
2025-02-21
1
Nexesenex/Nemotron_W_4b_MagLight_0.1 (Merge)
Nexesenex_Qwen_2.5_3b_Smarteaz_0.01a_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Qwen_2.5_3b_Smarteaz_0.01a" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Qwen_2.5_3b_Smarteaz_0.01a</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Qwen_2.5_3b_Smarteaz_0.01a-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Qwen_2.5_3b_Smarteaz_0.01a
c887feb9b69e73b424f39b22828f9b911da50d66
19.961623
1
3.085
false
false
false
true
0.728022
0.401195
40.119549
0.463665
24.370417
0.180514
18.05136
0.277685
3.691275
0.432042
12.871875
0.285987
20.665263
false
false
2025-02-27
2025-02-27
1
Nexesenex/Qwen_2.5_3b_Smarteaz_0.01a (Merge)
Nexesenex_pankajmathur_orca_mini_v9_6_1B-instruct-Abliterated-LPL_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/pankajmathur_orca_mini_v9_6_1B-instruct-Abliterated-LPL" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/pankajmathur_orca_mini_v9_6_1B-instruct-Abliterated-LPL</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__pankajmathur_orca_mini_v9_6_1B-instruct-Abliterated-LPL-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/pankajmathur_orca_mini_v9_6_1B-instruct-Abliterated-LPL
8c1916bcfe68de1f2fbd47de8846c9879f81115e
15.011892
llama3.2
0
1.236
true
false
false
true
0.343987
0.588991
58.899055
0.356249
9.728333
0.074773
7.477341
0.266779
2.237136
0.339552
2.810677
0.180269
8.918809
false
false
2025-03-01
2025-03-05
1
Nexesenex/pankajmathur_orca_mini_v9_6_1B-instruct-Abliterated-LPL (Merge)
Nexusflow_NexusRaven-V2-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexusflow/NexusRaven-V2-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexusflow/NexusRaven-V2-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexusflow__NexusRaven-V2-13B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexusflow/NexusRaven-V2-13B
cdab7132db4a4fd64513123374ea1451d85a7ace
8.488065
other
466
13
true
false
false
false
2.17961
0.179078
17.907818
0.394886
15.336448
0.029456
2.945619
0.260067
1.342282
0.373688
3.710937
0.187168
9.685284
false
true
2023-12-04
2024-06-12
1
codellama/CodeLlama-13b-Instruct-hf
NikolaSigmoid_AceMath-1.5B-Instruct-1epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/AceMath-1.5B-Instruct-1epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/AceMath-1.5B-Instruct-1epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__AceMath-1.5B-Instruct-1epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/AceMath-1.5B-Instruct-1epoch
166818c371eaafb212b243aecadd50b1079fa776
17.507478
0
1.791
false
false
false
false
1.314121
0.284892
28.489186
0.426285
19.829634
0.305136
30.513595
0.277685
3.691275
0.39251
7.230469
0.237616
15.290706
false
false
2025-03-05
2025-03-06
0
NikolaSigmoid/AceMath-1.5B-Instruct-1epoch
NikolaSigmoid_AceMath-1.5B-Instruct-dolphin-r1-200_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/AceMath-1.5B-Instruct-dolphin-r1-200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/AceMath-1.5B-Instruct-dolphin-r1-200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__AceMath-1.5B-Instruct-dolphin-r1-200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/AceMath-1.5B-Instruct-dolphin-r1-200
1e4805e2d12d993d0fd155af8a5ba420d7abc0b4
4.389864
apache-2.0
1
0.928
true
false
false
false
3.36306
0.180802
18.080249
0.28148
1.5863
0
0
0.255872
0.782998
0.374958
4.303125
0.114279
1.58651
false
false
2025-03-05
2025-03-06
1
nvidia/AceMath-1.5B-Instruct
NikolaSigmoid_DeepSeek-R1-Distill-Qwen-1.5B-500_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/DeepSeek-R1-Distill-Qwen-1.5B-500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/DeepSeek-R1-Distill-Qwen-1.5B-500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__DeepSeek-R1-Distill-Qwen-1.5B-500-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/DeepSeek-R1-Distill-Qwen-1.5B-500
54484019f17f6f14964cee95f7386bbaee385374
3.578856
apache-2.0
0
1.157
true
false
false
false
0.958168
0.174857
17.485716
0.26016
0.357981
0
0
0.245805
0
0.337969
2.246094
0.11245
1.383348
false
false
2025-03-04
2025-03-04
2
deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
NikolaSigmoid_acemath-200_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/acemath-200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/acemath-200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__acemath-200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/acemath-200
166818c371eaafb212b243aecadd50b1079fa776
17.507478
apache-2.0
0
1.791
true
false
false
false
1.310059
0.284892
28.489186
0.426285
19.829634
0.305136
30.513595
0.277685
3.691275
0.39251
7.230469
0.237616
15.290706
false
false
2025-03-04
2025-03-04
1
nvidia/AceMath-1.5B-Instruct
NikolaSigmoid_phi-4-14b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/phi-4-14b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/phi-4-14b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__phi-4-14b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/phi-4-14b
c6220bde10fff762dbd72c3331894aa4cade249d
29.91384
apache-2.0
0
14.704
true
false
false
false
1.351959
0.056079
5.607898
0.6695
52.500693
0.293807
29.380665
0.403523
20.469799
0.504688
23.985937
0.527842
47.538047
false
false
2025-02-26
2025-02-28
2
microsoft/phi-4
NikolaSigmoid_phi-4-1steps_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/phi-4-1steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/phi-4-1steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__phi-4-1steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/phi-4-1steps
c6220bde10fff762dbd72c3331894aa4cade249d
29.870388
apache-2.0
0
14.704
true
false
false
false
1.345177
0.052757
5.275669
0.670736
52.760921
0.298338
29.833837
0.401846
20.246085
0.502052
23.623177
0.527344
47.482639
false
false
2025-03-06
2025-03-06
2
microsoft/phi-4
NikolaSigmoid_phi-4-300steps_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
<a target="_blank" href="https://huggingface.co/NikolaSigmoid/phi-4-300steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NikolaSigmoid/phi-4-300steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NikolaSigmoid__phi-4-300steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NikolaSigmoid/phi-4-300steps
c6220bde10fff762dbd72c3331894aa4cade249d
29.96026
apache-2.0
0
14.704
true
false
false
false
2.712571
0.056079
5.607898
0.670112
52.645059
0.294562
29.456193
0.405201
20.693512
0.503354
23.719271
0.528757
47.639628
false
false
2025-03-06
2025-03-07
2
microsoft/phi-4
Nitral-AI_Captain-Eris-BMO_Violent-GRPO-v0.420_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Captain-Eris-BMO_Violent-GRPO-v0.420" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Captain-Eris-BMO_Violent-GRPO-v0.420</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Captain-Eris-BMO_Violent-GRPO-v0.420-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Captain-Eris-BMO_Violent-GRPO-v0.420
d02fedecc3401123516c837421727997b9d8a218
25.923807
other
3
12.248
true
false
false
true
1.51122
0.631281
63.128056
0.507853
30.433552
0.13142
13.141994
0.309564
7.941834
0.422802
12.05026
0.359624
28.847148
true
false
2025-02-23
2025-02-23
1
Nitral-AI/Captain-Eris-BMO_Violent-GRPO-v0.420 (Merge)
Nitral-AI_Captain-Eris_BMO-Violent-12B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Captain-Eris_BMO-Violent-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Captain-Eris_BMO-Violent-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Captain-Eris_BMO-Violent-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Captain-Eris_BMO-Violent-12B
2af7bb5641c77a215f2685b02db7427ab2831a6e
25.922969
other
2
12.248
true
false
false
true
1.459001
0.615219
61.521873
0.510437
31.041897
0.136707
13.670695
0.309564
7.941834
0.425531
12.791406
0.357131
28.570109
true
false
2025-02-23
2025-02-23
1
Nitral-AI/Captain-Eris_BMO-Violent-12B (Merge)
Nitral-AI_Captain-Eris_Violet-GRPO-v0.420_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Captain-Eris_Violet-GRPO-v0.420" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Captain-Eris_Violet-GRPO-v0.420</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Captain-Eris_Violet-GRPO-v0.420-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
2faf48adc3490fed4fabd4faf8becda866527139
25.272147
other
20
12.248
true
false
false
true
0.787861
0.62616
62.61597
0.515921
31.108574
0.108006
10.800604
0.298658
6.487696
0.427917
12.45625
0.353474
28.163785
true
false
2025-02-17
2025-02-17
1
Nitral-AI/Captain-Eris_Violet-GRPO-v0.420 (Merge)
Nitral-AI_Captain-Eris_Violet-V0.420-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Captain-Eris_Violet-V0.420-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Captain-Eris_Violet-V0.420-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Captain-Eris_Violet-V0.420-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Captain-Eris_Violet-V0.420-12B
b1a87ce62601e19fff206a16590d28f009965799
23.626621
other
34
12.248
true
false
false
false
3.031157
0.433919
43.391867
0.54781
35.326941
0.107251
10.725076
0.311242
8.165548
0.433062
13.899479
0.372257
30.250813
true
false
2024-12-16
2025-01-04
1
Nitral-AI/Captain-Eris_Violet-V0.420-12B (Merge)
Nitral-AI_Captain_BMO-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Captain_BMO-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Captain_BMO-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Captain_BMO-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Captain_BMO-12B
ba2950f1c9831c6aacd6141851e7b9724be6759a
23.210485
other
20
12.248
true
false
false
false
3.424295
0.47506
47.505951
0.528596
32.440698
0.139728
13.97281
0.319631
9.284116
0.374802
7.516927
0.356882
28.542405
false
false
2024-11-01
2025-01-04
0
Nitral-AI/Captain_BMO-12B
Nitral-AI_Hathor_Stable-v0.2-L3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Hathor_Stable-v0.2-L3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Hathor_Stable-v0.2-L3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Hathor_Stable-v0.2-L3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Hathor_Stable-v0.2-L3-8B
1c9f391c3e349f8ba51b5696290ee6db6a2b63fd
25.917957
other
61
8.03
true
false
false
true
1.616846
0.717484
71.748405
0.528582
32.826029
0.104985
10.498489
0.286913
4.9217
0.378063
5.557813
0.369598
29.955304
false
false
2024-06-09
2024-07-02
0
Nitral-AI/Hathor_Stable-v0.2-L3-8B
Nitral-AI_Hathor_Tahsin-L3-8B-v0.85_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Hathor_Tahsin-L3-8B-v0.85" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Hathor_Tahsin-L3-8B-v0.85</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Hathor_Tahsin-L3-8B-v0.85-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
755c5684c3a1dd68df409e0e32b481b707811a50
25.499608
other
19
8.03
true
false
false
true
1.209052
0.711015
71.101455
0.527904
32.713113
0.100453
10.045317
0.285235
4.697987
0.364667
4.216667
0.372008
30.223109
false
false
2024-07-09
2025-01-26
0
Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
Nitral-AI_Nera_Noctis-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nitral-AI/Nera_Noctis-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nitral-AI/Nera_Noctis-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nitral-AI__Nera_Noctis-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nitral-AI/Nera_Noctis-12B
f18471562642499508a26c7d84a5e25b0cd51897
20.661352
other
12
12.248
true
false
false
true
1.848759
0.456175
45.617517
0.519368
31.869591
0.087613
8.761329
0.263423
1.789709
0.397906
8.504948
0.346825
27.425015
false
false
2025-01-01
2025-01-26
0
Nitral-AI/Nera_Noctis-12B
Nohobby_MS-Schisandra-22B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nohobby/MS-Schisandra-22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nohobby/MS-Schisandra-22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nohobby__MS-Schisandra-22B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nohobby/MS-Schisandra-22B-v0.1
df698b7b740fb3b5193d61cd51e5e3a42c3b1e1c
30.111644
other
5
22.247
true
false
false
false
3.201107
0.633129
63.312899
0.578995
40.0114
0.22281
22.280967
0.332215
10.961969
0.392844
9.705469
0.409574
34.397163
true
false
2024-10-26
2024-10-30
1
Nohobby/MS-Schisandra-22B-v0.1 (Merge)
Nohobby_MS-Schisandra-22B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Nohobby/MS-Schisandra-22B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nohobby/MS-Schisandra-22B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nohobby__MS-Schisandra-22B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nohobby/MS-Schisandra-22B-v0.2
257b6d38d2f1c2a607c38a6a86336a241a81a455
30.281489
other
9
22.247
true
false
false
false
2.070016
0.6383
63.829971
0.584122
40.614458
0.203172
20.317221
0.33557
11.409396
0.407479
10.668229
0.413647
34.84966
true
false
2024-11-02
2024-11-02
1
Nohobby/MS-Schisandra-22B-v0.2 (Merge)
Norquinal_Alpha_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Norquinal/Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Norquinal/Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Norquinal__Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Norquinal/Alpha
e873a2f862b6511d96de50be147e5b3d73d36afd
12.081123
0
8.03
false
false
false
true
1.435562
0.280295
28.029517
0.337365
8.664581
0.057402
5.740181
0.265101
2.013423
0.363083
5.785417
0.300283
22.25362
false
false
2024-11-30
0
Removed
Norquinal_Bravo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Norquinal/Bravo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Norquinal/Bravo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Norquinal__Bravo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Norquinal/Bravo
40ae6eda335ee30028fa907ec71e501b46a27f45
13.664478
0
8.03
false
false
false
true
1.394501
0.302452
30.245194
0.355843
10.654044
0.057402
5.740181
0.281879
4.250559
0.386865
7.458073
0.312749
23.638815
false
false
2024-11-30
0
Removed
Norquinal_Charlie_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Norquinal/Charlie" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Norquinal/Charlie</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Norquinal__Charlie-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Norquinal/Charlie
e93fc3dfad9feb74d0a38f84bd42037f49482635
13.162885
0
8.03
false
false
false
true
1.292958
0.306099
30.609893
0.351529
9.860055
0.058157
5.81571
0.270973
2.796421
0.373688
6.644271
0.309259
23.25096
false
false
2024-11-30
0
Removed
Norquinal_Delta_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Norquinal/Delta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Norquinal/Delta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Norquinal__Delta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Norquinal/Delta
acfee7dcb17b597d7278415873571b979b545c8a
11.3548
0
8.03
false
false
false
true
1.306902
0.253842
25.384203
0.343478
9.097511
0.061178
6.117825
0.260906
1.454139
0.377688
4.310938
0.295878
21.764184
false
false
2024-11-30
0
Removed
Norquinal_Echo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Norquinal/Echo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Norquinal/Echo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Norquinal__Echo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Norquinal/Echo
0f7ce5af69a530e87867225d88275de8d3404ad8
13.446736
0
8.03
false
false
false
true
1.341724
0.315791
31.579099
0.353047
10.011496
0.057402
5.740181
0.279362
3.914989
0.380448
6.15599
0.309508
23.278664
false
false
2024-11-30
0
Removed