| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0–10 |
| Base Model | string | lengths 4–102 |
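The schema pairs each benchmark with a "Raw" column. A minimal sketch in plain Python (values copied from the first two data rows below) illustrating the one relationship that can be checked directly from the data — IFEval's scaled score is the raw accuracy × 100 — and how rows can be ranked by the headline "Average ⬆️" column. The other benchmarks (BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO) are visibly not a simple ×100 rescaling, so the sketch deliberately checks only IFEval.

```python
# Two leaderboard rows as plain dicts; keys follow the schema above and
# values are copied verbatim from the rows below.
rows = [
    {"fullname": "neopolita/jessi-v0.6-falcon3-7b-instruct",
     "Average": 34.548282, "IFEval Raw": 0.74019, "IFEval": 74.019047},
    {"fullname": "neopolita/loki-v0.1-virtuoso",
     "Average": 39.196962, "IFEval Raw": 0.781931, "IFEval": 78.193083},
]

# For IFEval, the scaled column is (to rounding) the raw accuracy times 100.
for r in rows:
    assert abs(r["IFEval Raw"] * 100 - r["IFEval"]) < 0.01

# Rank models by the leaderboard's headline "Average" column, descending.
best = max(rows, key=lambda r: r["Average"])
print(best["fullname"])  # → neopolita/loki-v0.1-virtuoso
```

The same ranking logic applies unchanged to the full table; only the two sample rows are hard-coded here.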
| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| neopolita_jessi-v0.6-falcon3-7b-instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [neopolita/jessi-v0.6-falcon3-7b-instruct](https://huggingface.co/neopolita/jessi-v0.6-falcon3-7b-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.6-falcon3-7b-instruct-details) | neopolita/jessi-v0.6-falcon3-7b-instruct | b0d817e37046e128741e45821f1bc248f7c9d82b | 34.548282 | other | 0 | 7.456 | true | false | false | true | 1.278938 | 0.74019 | 74.019047 | 0.550882 | 35.851322 | 0.356495 | 35.649547 | 0.300336 | 6.711409 | 0.490427 | 22.203385 | 0.395695 | 32.854979 | false | false | 2025-01-20 | 2025-01-20 | 1 | neopolita/jessi-v0.6-falcon3-7b-instruct (Merge) |
| neopolita_loki-v0.1-virtuoso_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [neopolita/loki-v0.1-virtuoso](https://huggingface.co/neopolita/loki-v0.1-virtuoso) [📑](https://huggingface.co/datasets/open-llm-leaderboard/neopolita__loki-v0.1-virtuoso-details) | neopolita/loki-v0.1-virtuoso | 4d884bbc57fd00e74772d554449bed7cfccc1c2a | 39.196962 | apache-2.0 | 1 | 14.77 | true | false | false | true | 3.458283 | 0.781931 | 78.193083 | 0.646725 | 49.452932 | 0.339124 | 33.912387 | 0.350671 | 13.422819 | 0.437531 | 14.32474 | 0.512882 | 45.875813 | false | false | 2025-01-22 | 2025-01-23 | 1 | neopolita/loki-v0.1-virtuoso (Merge) |
| netcat420_DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b](https://huggingface.co/netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b-details) | netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b | 12006d09cf7310f40da990ce9107bffcb2b708df | 3.574294 |  | 0 | 7.616 | false | false | false | true | 0.874278 | 0.115006 | 11.500596 | 0.287678 | 2.015538 | 0.001511 | 0.151057 | 0.264262 | 1.901566 | 0.372385 | 4.88151 | 0.108959 | 0.995493 | false | false | 2025-01-23 | 2025-01-23 | 1 | netcat420/DeepSeek-R1-Distill-Qwen-MFANN-Slerp-7b (Merge) |
| netcat420_DeepSeek-R1-MFANN-TIES-unretrained-7b_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b](https://huggingface.co/netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__DeepSeek-R1-MFANN-TIES-unretrained-7b-details) | netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b | 836a8aa29ce3e9d46c4f1ab2312f20cac9802649 | 5.947003 |  | 0 | 7.616 | false | false | false | true | 1.3564 | 0.258688 | 25.868806 | 0.308599 | 4.561587 | 0.012085 | 1.208459 | 0.255034 | 0.671141 | 0.352729 | 1.757812 | 0.114528 | 1.614214 | false | false | 2025-01-22 | 2025-01-22 | 1 | netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b (Merge) |
| netcat420_Llama3.1-MFANN-8b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/Llama3.1-MFANN-8b](https://huggingface.co/netcat420/Llama3.1-MFANN-8b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Llama3.1-MFANN-8b-details) | netcat420/Llama3.1-MFANN-8b | 6714fe00996d2679e9325b503ab991f4ecc0273d | 13.117063 | llama3.1 | 0 | 8.03 | true | false | false | false | 1.401012 | 0.296957 | 29.695652 | 0.428115 | 19.286684 | 0.029456 | 2.945619 | 0.287752 | 5.033557 | 0.337906 | 2.571615 | 0.272523 | 19.169252 | false | false | 2024-12-23 | 2024-12-23 | 1 | netcat420/Llama3.1-MFANN-8b (Merge) |
| netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2](https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-TIES-V2-details) | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2 | 0e649dd355ad7d562f9346c96642c24eff35338e | 19.213728 | apache-2.0 | 1 | 8.03 | true | false | false | false | 1.408226 | 0.42098 | 42.097967 | 0.492376 | 26.93837 | 0.076284 | 7.628399 | 0.29698 | 6.263982 | 0.37276 | 4.328385 | 0.352227 | 28.025266 | true | false | 2024-11-08 | 2024-11-09 | 0 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2 |
| netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3](https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-TIES-V3-details) | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 | 381cf003a5e28d2b273226364b568cc60b857b5b | 19.22203 |  | 2 | 8.03 | false | false | false | false | 1.441821 | 0.423802 | 42.380218 | 0.491402 | 26.978851 | 0.075529 | 7.55287 | 0.29698 | 6.263982 | 0.374062 | 4.491146 | 0.348986 | 27.665115 | false | false | 2024-11-25 | 2024-11-26 | 1 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 (Merge) |
| netcat420_MFANN-Llama3.1-Abliterated-SLERP-V4_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4](https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-V4-details) | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4 | af160f1cf089ccbcbf00f99b951797a1f3daeb04 | 19.399471 | apache-2.0 | 0 | 8.03 | true | false | false | false | 1.444935 | 0.416883 | 41.688276 | 0.490897 | 26.706074 | 0.067976 | 6.797583 | 0.305369 | 7.38255 | 0.382094 | 5.861719 | 0.351646 | 27.960624 | true | false | 2024-11-08 | 2024-11-09 | 0 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4 |
| netcat420_MFANN-Llama3.1-Abliterated-SLERP-V5_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5](https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-V5-details) | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 | e0502b359816fe3ecd4f7206e5230398604fdfe2 | 19.493723 |  | 2 | 8.03 | false | false | false | false | 1.411252 | 0.432895 | 43.289472 | 0.495189 | 27.367143 | 0.081571 | 8.1571 | 0.293624 | 5.816555 | 0.378125 | 5.165625 | 0.344498 | 27.166445 | false | false | 2024-11-25 | 2024-11-26 | 1 | netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 (Merge) |
| netcat420_MFANN-Llama3.1-Abliterated-Slerp-TIES_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES](https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-Slerp-TIES-details) | netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES | dbe0a3b69206c042de2b0a96fc156feeecaa49c7 | 19.248005 |  | 2 | 8.03 | false | false | false | false | 1.546629 | 0.429347 | 42.934746 | 0.496751 | 27.599829 | 0.066465 | 6.646526 | 0.291946 | 5.592841 | 0.368698 | 4.58724 | 0.353142 | 28.126847 | false | false | 2024-10-28 | 2024-10-29 | 1 | netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES (Merge) |
| netcat420_MFANN-Llama3.1-Abliterated-Slerp-V3.2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2](https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-Slerp-V3.2-details) | netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 | 56abb76e65cbf9dc49af662b09894d119d49705a | 19.053719 |  | 1 | 8.03 | false | false | false | false | 1.482838 | 0.412811 | 41.281134 | 0.497825 | 27.774394 | 0.070242 | 7.024169 | 0.287752 | 5.033557 | 0.375427 | 5.128385 | 0.352726 | 28.080674 | false | false | 2024-10-28 | 2024-10-29 | 1 | netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 (Merge) |
| netcat420_MFANN-SFT_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-SFT](https://huggingface.co/netcat420/MFANN-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-SFT-details) | netcat420/MFANN-SFT | 247f2ce5841d38cef59b73a7f8af857627d254bf | 17.932006 |  | 2 | 8.03 | false | false | false | false | 1.311601 | 0.368223 | 36.822298 | 0.485189 | 26.208533 | 0.059668 | 5.966767 | 0.316275 | 8.836689 | 0.372542 | 3.801042 | 0.33361 | 25.956708 | false | false | 2024-12-16 | 2024-12-20 | 1 | netcat420/MFANN-SFT (Merge) |
| netcat420_MFANN-abliterated-phi2-merge-unretrained_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN-abliterated-phi2-merge-unretrained](https://huggingface.co/netcat420/MFANN-abliterated-phi2-merge-unretrained) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-abliterated-phi2-merge-unretrained-details) | netcat420/MFANN-abliterated-phi2-merge-unretrained | cfc2d479871655f620dd741d8938b0e4b6df1d3e | 9.638779 |  | 0 | 2.775 | false | false | false | true | 0.895271 | 0.300504 | 30.050377 | 0.410413 | 17.590366 | 0.028701 | 2.870091 | 0.260906 | 1.454139 | 0.318344 | 0.559635 | 0.147773 | 5.308067 | false | false | 2025-01-15 | 2025-01-15 | 1 | netcat420/MFANN-abliterated-phi2-merge-unretrained (Merge) |
| netcat420_MFANN-llama3.1-Abliterated-SLERP_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-llama3.1-Abliterated-SLERP](https://huggingface.co/netcat420/MFANN-llama3.1-Abliterated-SLERP) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-Abliterated-SLERP-details) | netcat420/MFANN-llama3.1-Abliterated-SLERP | 0c7b2916727e6c28bbca2aa613b8247b66905915 | 13.881634 |  | 1 | 8.03 | false | false | false | false | 1.547158 | 0.259063 | 25.906262 | 0.45745 | 22.280625 | 0.048338 | 4.833837 | 0.27349 | 3.131991 | 0.380917 | 5.714583 | 0.292803 | 21.422503 | false | false | 2024-09-25 | 2024-10-07 | 1 | netcat420/MFANN-llama3.1-Abliterated-SLERP (Merge) |
| netcat420_MFANN-llama3.1-abliterated-SLERP-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANN-llama3.1-abliterated-SLERP-v3](https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-SLERP-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-SLERP-v3-details) | netcat420/MFANN-llama3.1-abliterated-SLERP-v3 | f90a20024060942826302c30860572c227dd4013 | 18.042262 | llama3.1 | 1 | 8.03 | true | false | false | false | 1.583139 | 0.379939 | 37.993856 | 0.493058 | 27.18727 | 0.064199 | 6.41994 | 0.291107 | 5.480984 | 0.366031 | 3.053906 | 0.353059 | 28.117612 | true | false | 2024-10-07 | 2024-10-07 | 1 | netcat420/MFANN-llama3.1-abliterated-SLERP-v3 (Merge) |
| netcat420_MFANN-llama3.1-abliterated-SLERP-v3.1_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1](https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-SLERP-v3.1-details) | netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 | 6d306eb66466cb8e1456a36f3895890a117e91e4 | 18.966233 | llama3.1 | 1 | 8.03 | true | false | false | false | 2.661015 | 0.420155 | 42.015519 | 0.492069 | 27.026316 | 0.069486 | 6.94864 | 0.292785 | 5.704698 | 0.368635 | 3.846094 | 0.354305 | 28.256132 | true | false | 2024-10-08 | 2024-10-17 | 1 | netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 (Merge) |
| netcat420_MFANN-llama3.1-abliterated-v2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [netcat420/MFANN-llama3.1-abliterated-v2](https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-v2-details) | netcat420/MFANN-llama3.1-abliterated-v2 | 3d0a5d3634726e1a63ac84bee561b346960ca1d7 | 19.771111 |  | 1 | 8.03 | false | false | false | false | 1.649047 | 0.442911 | 44.291147 | 0.494083 | 27.353618 | 0.074018 | 7.401813 | 0.292785 | 5.704698 | 0.384542 | 6.201042 | 0.349069 | 27.67435 | false | false | 2024-10-04 | 2024-10-07 | 1 | netcat420/MFANN-llama3.1-abliterated-v2 (Merge) |
| netcat420_MFANN-phigments-slerp-V2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | PhiForCausalLM | [netcat420/MFANN-phigments-slerp-V2](https://huggingface.co/netcat420/MFANN-phigments-slerp-V2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V2-details) | netcat420/MFANN-phigments-slerp-V2 | 94596dab22ab78f0d2ec00b8e33c8fa98581ad0f | 16.268708 |  | 0 | 2.78 | false | false | false | false | 0.81624 | 0.32316 | 32.316033 | 0.482728 | 26.927492 | 0.031722 | 3.172205 | 0.272651 | 3.020134 | 0.403729 | 13.099479 | 0.271692 | 19.076906 | false | false | 2024-10-23 | 2024-10-26 | 1 | netcat420/MFANN-phigments-slerp-V2 (Merge) |
| netcat420_MFANN-phigments-slerp-V3.2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | PhiForCausalLM | [netcat420/MFANN-phigments-slerp-V3.2](https://huggingface.co/netcat420/MFANN-phigments-slerp-V3.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V3.2-details) | netcat420/MFANN-phigments-slerp-V3.2 | 3fdb0794f6eb757bf2e4a6f378caed1863e9074c | 16.453302 |  | 0 | 2.78 | false | false | false | false | 0.578308 | 0.352436 | 35.243598 | 0.480855 | 26.918035 | 0.033233 | 3.323263 | 0.283557 | 4.474273 | 0.370771 | 9.813021 | 0.270529 | 18.947621 | false | false | 2025-02-06 | 2025-02-06 | 1 | netcat420/MFANN-phigments-slerp-V3.2 (Merge) |
| netcat420_MFANN-phigments-slerp-V3.3_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | PhiForCausalLM | [netcat420/MFANN-phigments-slerp-V3.3](https://huggingface.co/netcat420/MFANN-phigments-slerp-V3.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V3.3-details) | netcat420/MFANN-phigments-slerp-V3.3 | c05613efa47825622aa16e2b4f881549cdbec997 | 17.167734 |  | 0 | 2.78 | false | false | false | false | 0.591477 | 0.369097 | 36.909733 | 0.48953 | 28.170622 | 0.033233 | 3.323263 | 0.275168 | 3.355705 | 0.389219 | 11.21901 | 0.280253 | 20.028073 | false | false | 2025-02-06 | 2025-02-06 | 1 | netcat420/MFANN-phigments-slerp-V3.3 (Merge) |
| netcat420_MFANN3b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3b](https://huggingface.co/netcat420/MFANN3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3b-details) | netcat420/MFANN3b | ba03f833e89c335a5ee8f523a95892a15d22070e | 12.652448 | mit | 0 | 2.78 | true | false | false | false | 0.751985 | 0.252444 | 25.244352 | 0.443313 | 22.239211 | 0.021903 | 2.190332 | 0.291946 | 5.592841 | 0.360604 | 6.142188 | 0.230552 | 14.505762 | false | false | 2024-12-13 | 2024-12-14 | 1 | netcat420/MFANN3b (Merge) |
| netcat420_MFANN3bv0.15_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.15](https://huggingface.co/netcat420/MFANN3bv0.15) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.15-details) | netcat420/MFANN3bv0.15 | 20dbdfb9154cc2f6d43651fc8cea63a120220dc7 | 11.924555 | mit | 0 | 2.78 | true | false | false | false | 0.934276 | 0.201211 | 20.121057 | 0.453931 | 23.469347 | 0.026435 | 2.643505 | 0.251678 | 0.223714 | 0.395792 | 8.773958 | 0.246842 | 16.315751 | false | false | 2024-07-04 | 2024-07-05 | 0 | netcat420/MFANN3bv0.15 |
| netcat420_MFANN3bv0.18_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.18](https://huggingface.co/netcat420/MFANN3bv0.18) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.18-details) | netcat420/MFANN3bv0.18 | 3e792e3413217b63ea9caa0e8b8595fbeb236a69 | 12.649876 | mit | 0 | 2.78 | true | false | false | false | 0.965027 | 0.220645 | 22.064456 | 0.451437 | 23.073404 | 0.024924 | 2.492447 | 0.25755 | 1.006711 | 0.402365 | 10.595573 | 0.25 | 16.666667 | false | false | 2024-07-25 | 2024-07-25 | 0 | netcat420/MFANN3bv0.18 |
| netcat420_MFANN3bv0.19_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.19](https://huggingface.co/netcat420/MFANN3bv0.19) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.19-details) | netcat420/MFANN3bv0.19 | 073d42274686f5cb6ef6ff9f6ade24eab198e1f2 | 12.591489 |  | 0 | 2.78 | false | false | false | false | 0.972977 | 0.225815 | 22.581528 | 0.45158 | 22.907055 | 0.022659 | 2.265861 | 0.25755 | 1.006711 | 0.402396 | 9.899479 | 0.251995 | 16.888298 | false | false | 2024-08-04 | 2024-08-08 | 0 | netcat420/MFANN3bv0.19 |
| netcat420_MFANN3bv0.20_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.20](https://huggingface.co/netcat420/MFANN3bv0.20) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.20-details) | netcat420/MFANN3bv0.20 | ac8ba24559cbdb5704d77b602580d911c265fdee | 12.572005 | mit | 0 | 2.78 | true | false | false | false | 1.018836 | 0.219346 | 21.934578 | 0.449337 | 22.790711 | 0.026435 | 2.643505 | 0.259228 | 1.230425 | 0.407729 | 10.166146 | 0.25 | 16.666667 | false | false | 2024-08-29 | 2024-08-29 | 2 | netcat420/MFANN3bv0.19.12 (Merge) |
| netcat420_MFANN3bv0.21_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.21](https://huggingface.co/netcat420/MFANN3bv0.21) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.21-details) | netcat420/MFANN3bv0.21 | 8e78416dce916b69247fa03bd587369d0dade5ed | 12.008553 | mit | 0 | 2.78 | true | false | false | false | 1.948678 | 0.190919 | 19.091898 | 0.447002 | 22.583426 | 0.031722 | 3.172205 | 0.264262 | 1.901566 | 0.375948 | 9.826823 | 0.239279 | 15.475399 | false | false | 2024-09-23 | 2024-09-24 | 1 | netcat420/MFANN3bv0.21 (Merge) |
| netcat420_MFANN3bv0.22_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.22](https://huggingface.co/netcat420/MFANN3bv0.22) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.22-details) | netcat420/MFANN3bv0.22 | 20c26f267ebe62ef1da037a5b840a304cb8d740b | 12.256506 | mit | 0 | 2.78 | true | false | false | false | 0.791336 | 0.197938 | 19.793814 | 0.448511 | 22.491537 | 0.026435 | 2.643505 | 0.261745 | 1.565996 | 0.352135 | 10.183594 | 0.251745 | 16.860594 | false | false | 2024-10-25 | 2024-10-26 | 0 | netcat420/MFANN3bv0.22 |
| netcat420_MFANN3bv0.23_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.23](https://huggingface.co/netcat420/MFANN3bv0.23) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.23-details) | netcat420/MFANN3bv0.23 | 93eacd43dcb307016e22a4d9f9f8deef49cd9111 | 11.448026 |  | 0 | 2.78 | false | false | false | false | 0.776367 | 0.204808 | 20.480769 | 0.449542 | 22.696341 | 0.024924 | 2.492447 | 0.251678 | 0.223714 | 0.34274 | 7.042448 | 0.241772 | 15.752438 | false | false | 2024-11-06 | 2024-11-07 | 0 | netcat420/MFANN3bv0.23 |
| netcat420_MFANN3bv0.24_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv0.24](https://huggingface.co/netcat420/MFANN3bv0.24) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.24-details) | netcat420/MFANN3bv0.24 | 55813c2586488a2e7be5883f7e695396f5629d3e | 11.810284 | mit | 0 | 2.78 | true | false | false | false | 0.769073 | 0.220045 | 22.004504 | 0.440735 | 21.545385 | 0.027946 | 2.794562 | 0.258389 | 1.118568 | 0.352073 | 8.375781 | 0.235206 | 15.022902 | false | false | 2024-11-21 | 2024-11-22 | 0 | netcat420/MFANN3bv0.24 |
| netcat420_MFANN3bv1.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv1.1](https://huggingface.co/netcat420/MFANN3bv1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.1-details) | netcat420/MFANN3bv1.1 | 2089ba193df157575eae482f0df4907fd3ea14ae | 6.659217 |  | 0 | 2.775 | false | false | false | true | 0.77632 | 0.250695 | 25.069482 | 0.339709 | 8.391709 | 0.020393 | 2.039275 | 0.266779 | 2.237136 | 0.322313 | 0.455729 | 0.115858 | 1.761968 | false | false | 2025-01-03 | 2025-01-03 | 0 | netcat420/MFANN3bv1.1 |
| netcat420_MFANN3bv1.2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv1.2](https://huggingface.co/netcat420/MFANN3bv1.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.2-details) | netcat420/MFANN3bv1.2 | 0643ded63b35beb43caf2d2c0bd8003fdb81b0ec | 8.060475 | mit | 0 | 2.775 | true | false | false | true | 0.799226 | 0.268605 | 26.860508 | 0.365993 | 11.121792 | 0.026435 | 2.643505 | 0.263423 | 1.789709 | 0.315552 | 0.94401 | 0.14503 | 5.003324 | false | false | 2025-01-21 | 2025-01-22 | 1 | netcat420/MFANN3bv1.2 (Merge) |
| netcat420_MFANN3bv1.3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv1.3](https://huggingface.co/netcat420/MFANN3bv1.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.3-details) | netcat420/MFANN3bv1.3 | dab137e547fa2e9f23dbf74ec602acfc6131e5a0 | 11.533255 |  | 0 | 2.78 | false | false | false | false | 0.824037 | 0.254667 | 25.466651 | 0.445631 | 22.637009 | 0.021148 | 2.114804 | 0.25755 | 1.006711 | 0.329875 | 3.801042 | 0.22756 | 14.173316 | false | false | 2025-01-31 | 2025-02-01 | 0 | netcat420/MFANN3bv1.3 |
| netcat420_MFANN3bv1.4_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [netcat420/MFANN3bv1.4](https://huggingface.co/netcat420/MFANN3bv1.4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.4-details) | netcat420/MFANN3bv1.4 | bed48b1506d7f8866f7d23c89faee4ff76690f5a | 16.4976 | mit | 0 | 2.78 | true | false | false | false | 0.658917 | 0.352436 | 35.243598 | 0.480855 | 26.918035 | 0.037009 | 3.700906 | 0.282718 | 4.362416 | 0.370771 | 9.813021 | 0.270529 | 18.947621 | false | false | 2025-02-06 | 2025-02-06 | 1 | netcat420/MFANN3bv1.4 (Merge) |
| netcat420_MFANNv0.19_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANNv0.19](https://huggingface.co/netcat420/MFANNv0.19) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.19-details) | netcat420/MFANNv0.19 | af26a25549b7ad291766c479bebda58f15fbff42 | 14.389066 | llama3.1 | 0 | 8.03 | true | false | false | false | 1.914158 | 0.305674 | 30.56745 | 0.473138 | 24.924106 | 0.041541 | 4.154079 | 0.307047 | 7.606264 | 0.352698 | 2.720573 | 0.247257 | 16.361924 | false | false | 2024-07-27 | 2024-07-27 | 0 | netcat420/MFANNv0.19 |
| netcat420_MFANNv0.20_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANNv0.20](https://huggingface.co/netcat420/MFANNv0.20) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.20-details) | netcat420/MFANNv0.20 | e612e57c933870b8990ac2bc217c434f3ffc84bd | 16.461657 | llama3.1 | 0 | 8.03 | true | false | false | false | 1.735768 | 0.347865 | 34.786478 | 0.457443 | 22.401697 | 0.049849 | 4.984894 | 0.290268 | 5.369128 | 0.387396 | 6.757813 | 0.320229 | 24.469932 | false | false | 2024-08-07 | 2024-08-08 | 0 | netcat420/MFANNv0.20 |
| netcat420_MFANNv0.21_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANNv0.21](https://huggingface.co/netcat420/MFANNv0.21) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.21-details) | netcat420/MFANNv0.21 | 8c71d0eb419f54c489fa1ddf55d4bd18a1fb27d8 | 15.886167 | llama3 | 1 | 8.03 | true | false | false | false | 1.758821 | 0.32331 | 32.330993 | 0.457637 | 22.058432 | 0.057402 | 5.740181 | 0.278523 | 3.803132 | 0.399333 | 8.816667 | 0.303108 | 22.567598 | false | false | 2024-08-31 | 2024-09-02 | 2 | netcat420/MFANNv0.20.12 (Merge) |
| netcat420_MFANNv0.22.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANNv0.22.1](https://huggingface.co/netcat420/MFANNv0.22.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.22.1-details) | netcat420/MFANNv0.22.1 | 98108142480b802a3e1bb27e3d47075a4ea3a4f1 | 15.667377 | llama3.1 | 1 | 8.03 | true | false | false | false | 1.681057 | 0.308947 | 30.894693 | 0.466089 | 23.602793 | 0.053625 | 5.362538 | 0.276007 | 3.467562 | 0.375302 | 4.646094 | 0.334275 | 26.030585 | false | false | 2024-10-04 | 2024-10-05 | 1 | netcat420/MFANNv0.22.1 (Merge) |
| netcat420_MFANNv0.23_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [netcat420/MFANNv0.23](https://huggingface.co/netcat420/MFANNv0.23) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.23-details) | netcat420/MFANNv0.23 | cf7fb44a8c858602d7fcba58adcbd514c7e08ba4 | 16.652656 | llama3.1 | 1 | 8.03 | true | false | false | false | 1.620759 | 0.312744 | 31.274352 | 0.48981 | 27.042345 | 0.049849 | 4.984894 | 0.284396 | 4.58613 | 0.376792 | 5.498958 | 0.338763 | 26.529255 | false | false | 2024-10-27 | 2024-10-29 | 1 | netcat420/MFANNv0.23 (Merge) |
netcat420_MFANNv0.24_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.24
57ce382fede1adce68bdb95a386255fa363077d7
16.398374
llama3.1
1
8.03
true
false
false
false
1.487805
0.316241
31.624091
0.479027
25.351725
0.061178
6.117825
0.284396
4.58613
0.375396
4.624479
0.334774
26.085993
false
false
2024-11-07
2024-11-09
1
netcat420/MFANNv0.24 (Merge)
netcat420_MFANNv0.25_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.25
cff1e1772fc7f4f3e68ad53d8589df3f52556e38
16.596965
llama3.1
2
8.03
true
false
false
false
1.424577
0.346666
34.666574
0.479407
25.409784
0.058157
5.81571
0.280201
4.026846
0.368792
3.632292
0.334275
26.030585
false
false
2024-11-25
2024-11-26
1
netcat420/MFANNv0.25 (Merge)
netcat420_Qwen2.5-7B-nerd-uncensored-v0.9-MFANN_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-7B-nerd-uncensored-v0.9-MFANN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN
cc114e017a8d69c0940fe3bdde0f2e1cafeb1078
28.031122
apache-2.0
2
7.616
true
false
false
true
1.389124
0.587841
58.784137
0.523666
32.266986
0.337613
33.761329
0.28104
4.138702
0.392573
6.971615
0.390376
32.263963
false
false
2025-01-02
2025-01-02
0
netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN
netcat420_Qwen2.5-7b-MFANN-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-7b-MFANN-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-7b-MFANN-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-7b-MFANN-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-7b-MFANN-slerp
7760e13b9b6c654ce6c5509f865e4e54f8a00ef6
27.709223
mit
2
7.616
true
false
false
true
1.295558
0.653212
65.321237
0.508873
30.361031
0.287009
28.700906
0.295302
6.040268
0.407302
8.979427
0.341672
26.852467
false
false
2025-01-25
2025-01-25
0
netcat420/Qwen2.5-7b-MFANN-slerp
netcat420_Qwen2.5-7b-nerd-uncensored-MFANN-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-7b-nerd-uncensored-MFANN-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp
c70d4e9569bcde272d74d80e742f7a46ec6d37fc
4.248121
0
7.616
false
false
false
true
1.597224
0.156447
15.644712
0.292011
1.755723
0
0
0.260067
1.342282
0.379177
5.630469
0.11004
1.115544
false
false
2025-01-25
2025-01-25
1
netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp (Merge)
netcat420_Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN
f2fdf326b731948216853ada912b94cd0bc71fb9
25.396742
apache-2.0
2
7.616
true
false
false
true
1.310378
0.574227
57.422749
0.507145
29.98075
0.256798
25.679758
0.292785
5.704698
0.405844
9.630469
0.315658
23.962027
false
false
2024-12-31
2025-01-01
0
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN
netcat420_Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained
f3e8300e7948b878564f1f4de1d98fc03cc18e32
28.058883
1
7.616
false
false
false
true
1.296863
0.648641
64.864116
0.506557
29.939053
0.299094
29.909366
0.298658
6.487696
0.415208
10.134375
0.343168
27.018691
false
false
2025-01-18
2025-01-19
1
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained (Merge)
netcat420_Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b
5d4016402d161c445c5a5982e5783c97bd37d3b2
8.630088
1
7.616
false
false
false
true
1.405099
0.267556
26.755564
0.378902
13.455601
0.018127
1.812689
0.232383
0
0.352792
2.232292
0.167719
7.524379
false
false
2025-01-23
2025-01-23
1
netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b (Merge)
netcat420_Qwen2.5-MFANN-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-MFANN-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-MFANN-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-MFANN-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-MFANN-7b
30757ed2483d24b161febb79bb8f6485bba6cb20
26.18533
1
7.616
false
false
false
true
1.36962
0.609723
60.972331
0.505435
30.29029
0.278701
27.870091
0.286074
4.809843
0.402063
8.357813
0.323305
24.811613
false
false
2025-01-24
2025-01-24
0
netcat420/Qwen2.5-MFANN-7b
netcat420_qwen2.5-MFANN-7b-SLERP-V1.2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/qwen2.5-MFANN-7b-SLERP-V1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/qwen2.5-MFANN-7b-SLERP-V1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__qwen2.5-MFANN-7b-SLERP-V1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/qwen2.5-MFANN-7b-SLERP-V1.2
85cb895126edc0161c332515811fc26aacfb29ba
28.515189
apache-2.0
0
7.616
true
false
false
true
0.675173
0.660606
66.060608
0.511103
30.830881
0.287009
28.700906
0.29698
6.263982
0.425938
12.142188
0.343833
27.092568
true
false
2025-02-09
2025-02-09
0
netcat420/qwen2.5-MFANN-7b-SLERP-V1.2
netcat420_qwen2.5-MFANN-7b-SLERPv1.1_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/qwen2.5-MFANN-7b-SLERPv1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/qwen2.5-MFANN-7b-SLERPv1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__qwen2.5-MFANN-7b-SLERPv1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/qwen2.5-MFANN-7b-SLERPv1.1
62c8ce3c441cc8c0f7aa89189d008689c39935f9
27.982248
0
7.616
false
false
false
true
1.945655
0.655485
65.548522
0.507476
29.976912
0.296828
29.682779
0.290268
5.369128
0.412635
10.11276
0.34483
27.203384
false
false
2025-02-03
2025-02-03
1
netcat420/qwen2.5-MFANN-7b-SLERPv1.1 (Merge)
netcat420_qwen2.5-MFANN-7b-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/qwen2.5-MFANN-7b-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/qwen2.5-MFANN-7b-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__qwen2.5-MFANN-7b-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/qwen2.5-MFANN-7b-v1.1
7f57ae2d56726223d9ed464abc4f70881eacb701
26.134594
0
7.616
false
false
false
true
0.645924
0.608849
60.884897
0.496664
29.471725
0.282477
28.247734
0.276007
3.467562
0.411396
9.757813
0.324801
24.977837
false
false
2025-02-08
2025-02-09
0
netcat420/qwen2.5-MFANN-7b-v1.1
netease-youdao_Confucius-o1-14B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netease-youdao/Confucius-o1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netease-youdao/Confucius-o1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netease-youdao__Confucius-o1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netease-youdao/Confucius-o1-14B
e6c64e53adbcbbdff9e2114b4f61bd4f2aa1602c
38.527501
apache-2.0
37
14.77
true
false
false
true
3.783277
0.63785
63.784979
0.629977
47.345233
0.431269
43.126888
0.364933
15.324385
0.433813
14.193229
0.526513
47.390293
false
false
2025-01-20
2025-01-27
2
Qwen/Qwen2.5-14B
newsbang_Homer-7B-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-7B-v0.1
c953cc313ef5e5029efd057c0d3809a3b8d1cf9f
33.058438
apache-2.0
0
7.616
true
false
false
false
1.381468
0.610872
61.087249
0.560139
37.309227
0.385952
38.595166
0.324664
9.955257
0.435698
12.795573
0.447473
38.608156
false
false
2024-11-14
2024-11-14
0
newsbang/Homer-7B-v0.1
newsbang_Homer-7B-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-7B-v0.2
50b4ca941657ed362f5660aed8274a59a6b3fe2d
33.013958
0
7.616
false
false
false
true
1.349197
0.749383
74.938275
0.551733
36.403486
0.247734
24.773414
0.332215
10.961969
0.42975
13.11875
0.440991
37.887855
false
false
2024-11-15
0
Removed
newsbang_Homer-v0.3-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.3-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.3-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.3-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v0.3-Qwen2.5-7B
4fa38c6c590d8e9bbf2075b2fa9cc37e75cde5d4
31.314789
0
7.616
false
false
false
true
1.171207
0.515401
51.540136
0.548059
36.413677
0.308912
30.891239
0.333893
11.185682
0.474365
19.46224
0.445562
38.395759
false
false
2024-11-18
0
Removed
newsbang_Homer-v0.4-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.4-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.4-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.4-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v0.4-Qwen2.5-7B
e5b73b06e63de7f77845463f8a11c93e82befd15
33.944013
0
7.616
false
false
false
true
1.279441
0.799941
79.994082
0.55331
36.603703
0.277946
27.794562
0.315436
8.724832
0.431083
13.185417
0.436253
37.36148
false
false
2024-11-18
0
Removed
newsbang_Homer-v0.5-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.5-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.5-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.5-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> has been flagged! <a target="_blank" href="https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard/discussions/1022" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">See discussion #1022</a>
newsbang/Homer-v0.5-Qwen2.5-7B
9dc7090b2226f9a2217f593518f734e3246001f9
34.763845
0
7.616
false
false
true
true
1.345168
0.788076
78.807564
0.554018
36.678089
0.372356
37.23565
0.302852
7.04698
0.419302
11.379427
0.436918
37.435358
true
false
2024-11-20
0
Removed
newsbang_Homer-v1.0-Qwen2.5-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v1.0-Qwen2.5-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v1.0-Qwen2.5-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v1.0-Qwen2.5-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v1.0-Qwen2.5-72B
c7f3c5c131c046626f8d33eb615c1a0aba19998b
47.464376
apache-2.0
6
72.706
true
false
false
false
29.548858
0.762772
76.277167
0.73098
62.274065
0.490181
49.018127
0.416107
22.147651
0.467729
17.899479
0.614528
57.16977
false
false
2024-12-16
2024-12-16
0
newsbang/Homer-v1.0-Qwen2.5-72B
newsbang_Homer-v1.0-Qwen2.5-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v1.0-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v1.0-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v1.0-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v1.0-Qwen2.5-7B
4795825dff1b68dd2cc02b3bd39598a161c09c66
32.623576
apache-2.0
2
7.616
true
false
false
false
1.278504
0.639274
63.927379
0.565525
37.810847
0.332326
33.232628
0.322148
9.619687
0.427823
11.877865
0.453457
39.27305
false
false
2024-12-04
2024-12-04
0
newsbang/Homer-v1.0-Qwen2.5-7B
nguyentd_FinancialAdvice-Qwen2.5-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nguyentd/FinancialAdvice-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nguyentd/FinancialAdvice-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nguyentd__FinancialAdvice-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nguyentd/FinancialAdvice-Qwen2.5-7B
5c3421d5a980d0b2365b0d704ead30c9e534a019
21.287932
apache-2.0
1
7.616
true
false
false
false
1.30889
0.449606
44.960593
0.473093
25.630436
0.114804
11.480363
0.294463
5.928412
0.40249
9.144531
0.375249
30.583259
false
false
2024-10-21
2024-11-18
1
nguyentd/FinancialAdvice-Qwen2.5-7B (Merge)
ngxson_MiniThinky-1B-Llama-3.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ngxson/MiniThinky-1B-Llama-3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ngxson/MiniThinky-1B-Llama-3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ngxson__MiniThinky-1B-Llama-3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ngxson/MiniThinky-1B-Llama-3.2
a5e5adf4f7e63f7127a72def90ba3a627bae36bf
6.937116
4
1.236
false
false
false
true
1.122329
0.277148
27.714797
0.314227
4.347795
0.057402
5.740181
0.239094
0
0.343365
2.18724
0.114694
1.632683
false
false
2025-01-06
2025-01-07
1
ngxson/MiniThinky-1B-Llama-3.2 (Merge)
ngxson_MiniThinky-v2-1B-Llama-3.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ngxson/MiniThinky-v2-1B-Llama-3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ngxson/MiniThinky-v2-1B-Llama-3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ngxson__MiniThinky-v2-1B-Llama-3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ngxson/MiniThinky-v2-1B-Llama-3.2
0eb811aca13439292d4151456577a527a2982c46
6.550677
38
1.236
false
false
false
true
1.104686
0.296307
29.630713
0.320511
4.893769
0.028701
2.870091
0.239933
0
0.335615
0.61849
0.111619
1.291002
false
false
2025-01-08
2025-01-09
1
ngxson/MiniThinky-v2-1B-Llama-3.2 (Merge)
nhyha_N3N_Delirium-v1_1030_0227_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_Delirium-v1_1030_0227" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_Delirium-v1_1030_0227</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_Delirium-v1_1030_0227-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_Delirium-v1_1030_0227
41eabc719bd611e2bd0094b0842df84916a57a46
33.094479
apache-2.0
0
10.159
true
false
false
true
4.294387
0.802289
80.228904
0.589069
40.77504
0.210725
21.072508
0.337248
11.63311
0.409812
9.859896
0.414977
34.997414
false
false
2024-10-30
2024-11-04
2
unsloth/gemma-2-9b-it
nhyha_N3N_Llama-3.1-8B-Instruct_1028_0216_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_Llama-3.1-8B-Instruct_1028_0216-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216
d0715a631898112c9c3b729d0334588a2ff636d8
23.479352
apache-2.0
0
8.03
true
false
false
false
1.491432
0.479606
47.960633
0.505374
28.980464
0.170695
17.069486
0.306208
7.494407
0.405031
10.06224
0.36378
29.30888
false
false
2024-10-28
2024-11-04
2
meta-llama/Meta-Llama-3.1-8B
nhyha_N3N_gemma-2-9b-it_20241029_1532_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_gemma-2-9b-it_20241029_1532" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_gemma-2-9b-it_20241029_1532</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_gemma-2-9b-it_20241029_1532-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_gemma-2-9b-it_20241029_1532
6cfc55a717961ef206978b577bd74df97efe1455
32.14813
gemma
2
10.159
true
false
false
false
4.788088
0.675194
67.519404
0.586312
40.986668
0.212236
21.223565
0.340604
12.080537
0.459354
16.385938
0.412234
34.692671
false
false
2024-10-29
2024-11-04
1
unsloth/gemma-2-9b-it
nhyha_N3N_gemma-2-9b-it_20241110_2026_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_gemma-2-9b-it_20241110_2026" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_gemma-2-9b-it_20241110_2026</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_gemma-2-9b-it_20241110_2026-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_gemma-2-9b-it_20241110_2026
2d4c24278ed9d8b42a4035da16a5aea745797441
29.119584
gemma
0
10.159
true
false
false
true
5.0811
0.628283
62.828296
0.586715
40.944106
0.160876
16.087613
0.336409
11.521253
0.407302
9.779427
0.402011
33.556811
false
false
2024-11-12
2024-11-12
1
unsloth/gemma-2-9b-it
nhyha_merge_Qwen2.5-7B-Instruct_20241023_0314_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__merge_Qwen2.5-7B-Instruct_20241023_0314-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314
4d93f65c1f870556f05c77a1ef4f26819d49daf7
31.44955
apache-2.0
0
7.616
true
false
false
false
1.395452
0.569457
56.945682
0.555853
36.365185
0.35423
35.422961
0.321309
9.50783
0.425062
11.099479
0.454205
39.356161
false
false
2024-10-23
2024-11-04
3
Qwen/Qwen2.5-7B
nidum_Nidum-Limitless-Gemma-2B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/nidum/Nidum-Limitless-Gemma-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nidum/Nidum-Limitless-Gemma-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nidum__Nidum-Limitless-Gemma-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nidum/Nidum-Limitless-Gemma-2B
e209e3513d2b34c0e6c433ede26e17604c25cb1a
6.166008
apache-2.0
4
2.506
true
false
false
true
0.793627
0.242351
24.235141
0.30788
3.45106
0.013595
1.359517
0.264262
1.901566
0.374031
4.120573
0.117354
1.928191
false
false
2024-08-02
2024-08-07
0
nidum/Nidum-Limitless-Gemma-2B
nisten_franqwenstein-35b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/franqwenstein-35b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/franqwenstein-35b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__franqwenstein-35b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/franqwenstein-35b
7180aa73e82945a1d2ae0eb304508e21d57e4c27
36.571332
mit
8
34.714
true
false
false
false
10.035539
0.379863
37.986321
0.664658
52.227468
0.340634
34.063444
0.403523
20.469799
0.494021
22.119271
0.573055
52.561687
false
false
2024-10-03
2024-10-03
1
nisten/franqwenstein-35b (Merge)
nisten_franqwenstein-35b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/franqwenstein-35b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/franqwenstein-35b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__franqwenstein-35b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/franqwenstein-35b
901351a987d664a1cd7f483115a167d3ae5694ec
34.451117
mit
8
34.714
true
false
false
true
6.328604
0.391354
39.135383
0.659113
51.680277
0.304381
30.438066
0.35906
14.541387
0.468104
19.679688
0.561087
51.2319
false
false
2024-10-03
2024-10-03
1
nisten/franqwenstein-35b (Merge)
nisten_tqwendo-36b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/tqwendo-36b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/tqwendo-36b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__tqwendo-36b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/tqwendo-36b
c50f38e8421785af4b8596f81e0098a6585b4f05
37.04172
mit
9
35.69
true
false
false
false
18.300783
0.677767
67.776721
0.643183
49.414936
0.415408
41.540785
0.331376
10.850112
0.442958
15.103125
0.438082
37.564642
false
false
2024-12-21
2024-12-21
1
nisten/tqwendo-36b (Merge)
nlpguy_Lion-Lamarck-v.1.0.8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Lion-Lamarck-v.1.0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Lion-Lamarck-v.1.0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Lion-Lamarck-v.1.0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Lion-Lamarck-v.1.0.8
3f1c2632893f4a7d22ab50a1b87ebee9f054086f
35.883025
2
14.766
false
false
false
true
4.057619
0.450905
45.090471
0.586893
40.848441
0.554381
55.438066
0.358221
14.42953
0.467271
19.008854
0.464345
40.482787
false
false
2025-01-27
2025-01-27
1
nlpguy/Lion-Lamarck-v.1.0.8 (Merge)
nlpguy_Lion-Lamarck-v.1.0.9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Lion-Lamarck-v.1.0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Lion-Lamarck-v.1.0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Lion-Lamarck-v.1.0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Lion-Lamarck-v.1.0.9
2bb3b70d5eab9fbc39a10452c601fe36d00b9fca
36.365578
1
14.766
false
false
false
false
3.906622
0.340895
34.089549
0.591824
40.468848
0.564199
56.41994
0.390101
18.680089
0.529958
27.378125
0.470412
41.156915
false
false
2025-01-28
2025-01-28
1
nlpguy/Lion-Lamarck-v.1.0.9 (Merge)
nlpguy_Lion-Lamarck-v.1.1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Lion-Lamarck-v.1.1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Lion-Lamarck-v.1.1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Lion-Lamarck-v.1.1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Lion-Lamarck-v.1.1.0
dedb68b932cb4bbf50d80150419fdca664ba63e5
37.031613
1
14.766
false
false
false
false
3.887952
0.365775
36.577503
0.596246
41.166304
0.575529
57.55287
0.392617
19.01566
0.532531
27.533073
0.463098
40.344267
false
false
2025-01-29
2025-01-30
1
nlpguy/Lion-Lamarck-v.1.1.0 (Merge)
nlpguy_Miisce-one_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Miisce-one" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Miisce-one</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Miisce-one-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Miisce-one
c32e07ad0f4d9dcc0f0806c7b6a6dc013e0a5cfe
39.837021
2
14.766
false
false
false
false
2.018793
0.606576
60.657611
0.650456
49.711682
0.416918
41.691843
0.385906
18.120805
0.48199
19.815365
0.541223
49.024823
false
false
2025-02-10
2025-02-10
1
nlpguy/Miisce-one (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v1
9e6d747cbb81e1f25915a0f42802cbeb85b61c3e
10.990223
other
0
12.451
true
false
false
false
5.868612
0.16484
16.48404
0.4468
22.06891
0.01435
1.435045
0.280201
4.026846
0.380354
4.844271
0.25374
17.082225
true
false
2024-09-29
2024-09-29
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v1 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v2
4ac077e496705687fdcbe51f3b915be42e91bf79
8.345444
other
1
12.451
true
false
false
false
5.849237
0.157272
15.727159
0.394967
14.382673
0.01284
1.283988
0.27349
3.131991
0.379083
5.252083
0.192653
10.29477
true
false
2024-09-29
2024-09-29
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v2 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v3
6703b09d3d78cc020448ee93c53dc727312bcbaf
5.202259
other
1
12.451
true
false
false
false
9.086727
0.14121
14.120977
0.305245
3.398266
0.011329
1.132931
0.259228
1.230425
0.409844
9.430469
0.117104
1.900488
true
false
2024-10-04
2024-10-04
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v3 (Merge)
nlpguy_StableProse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/StableProse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/StableProse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__StableProse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/StableProse
4937dc747684705e4b87df27b47eab5429f3a9c1
16.623905
1
12.248
false
false
false
false
3.588726
0.197239
19.723888
0.511656
30.180203
0.064955
6.495468
0.302852
7.04698
0.406708
8.871875
0.346825
27.425015
false
false
2024-08-16
2024-08-17
1
nlpguy/StableProse (Merge)
nlpguy_StarFusion-alpha1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/StarFusion-alpha1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/StarFusion-alpha1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__StarFusion-alpha1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/StarFusion-alpha1
dccad965a710d7bee001b6387c8307e7c320291e
20.828324
apache-2.0
1
7.242
true
false
false
true
1.756997
0.566009
56.60093
0.442869
21.933182
0.071752
7.175227
0.295302
6.040268
0.408104
8.879688
0.319066
24.340647
true
false
2024-04-13
2024-06-26
1
nlpguy/StarFusion-alpha1 (Merge)
noname0202_Llama-3.2-4x3B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/Llama-3.2-4x3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/Llama-3.2-4x3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__Llama-3.2-4x3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/Llama-3.2-4x3B-Instruct
b7db5c4ec1138be364127e0482adabc8355d0943
24.010127
0
9.949
false
false
false
true
2.384502
0.706718
70.671817
0.464731
24.689909
0.15861
15.861027
0.272651
3.020134
0.367396
4.424479
0.328541
25.393395
false
false
2025-01-26
0
Removed
noname0202_gemma-2-2b-it-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/gemma-2-2b-it-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/gemma-2-2b-it-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__gemma-2-2b-it-ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/gemma-2-2b-it-ties
7ab51f4991186f6850d826e4ddc44a053de05f2f
10.063823
0
2.614
false
false
false
true
2.405395
0.126571
12.657083
0.420574
18.139569
0.024169
2.416918
0.270134
2.684564
0.392885
7.14401
0.256067
17.340795
false
false
2025-01-29
0
Removed
noname0202_gemma-2-9b-sft-jp-en-zh-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/gemma-2-9b-sft-jp-en-zh-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/gemma-2-9b-sft-jp-en-zh-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__gemma-2-9b-sft-jp-en-zh-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/gemma-2-9b-sft-jp-en-zh-v1
06ca7f00ce3ddece15cb50a1292ce0912e19af4e
16.851966
0
9.242
false
false
false
true
1.831472
0.298805
29.880495
0.451929
22.00024
0.089124
8.912387
0.307047
7.606264
0.40801
9.101302
0.3125
23.611111
false
false
2025-01-05
0
Removed
noname0202_gemma-2-9b-sft-jp-en-zh-v2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/gemma-2-9b-sft-jp-en-zh-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/gemma-2-9b-sft-jp-en-zh-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__gemma-2-9b-sft-jp-en-zh-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/gemma-2-9b-sft-jp-en-zh-v2
b32a4038b8617c7620ee7761609d926ddda8c1fe
19.13082
0
9.242
false
false
false
true
1.625545
0.399347
39.934707
0.451504
22.658059
0.10423
10.422961
0.287752
5.033557
0.361156
7.011198
0.36752
29.724439
false
false
2025-01-05
0
Removed
noname0202_llama-math-1b-r16-0to512tokens-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r16-0to512tokens-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r16-0to512tokens-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r16-0to512tokens-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r16-0to512tokens-test
df274b741781f3f3ecce2ef86883863aaeb71c58
13.804072
0
1.236
false
false
false
true
0.733068
0.546975
54.697536
0.348842
8.389239
0.081571
8.1571
0.266779
2.237136
0.314313
1.255729
0.172789
8.087692
false
false
2025-01-25
0
Removed
noname0202_llama-math-1b-r32-0to512tokens-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r32-0to512tokens-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r32-0to512tokens-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r32-0to512tokens-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r32-0to512tokens-test
200ae9a8db697345c6471e12a225f2e7adb953c1
14.371257
0
1.236
false
false
false
true
0.800929
0.568258
56.825778
0.349518
8.1919
0.090634
9.063444
0.265101
2.013423
0.320948
1.685156
0.176031
8.447843
false
false
2025-01-24
0
Removed
noname0202_llama-math-1b-r32-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r32-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r32-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r32-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r32-test
3b2cd2f41ed1a9894dd1bdd275f45081d5c6caf1
14.418127
0
1.236
false
false
false
true
0.746796
0.581922
58.192152
0.348596
8.498755
0.072508
7.250755
0.261745
1.565996
0.315646
2.322396
0.178108
8.678709
false
false
2025-01-24
0
Removed
noname0202_llama-math-1b-r8-512tokens-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r8-512tokens-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r8-512tokens-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r8-512tokens-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r8-512tokens-test
ba83c9aca75dd38a66df90eb2dd1cb56db6d3c9a
14.630752
0
1.236
false
false
false
true
0.731721
0.579199
57.919875
0.349576
8.396798
0.081571
8.1571
0.268456
2.46085
0.316948
2.485156
0.175283
8.364731
false
false
2025-01-24
0
Removed
notbdq_Qwen2.5-14B-Instruct-1M-GRPO-Reasoning_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/notbdq__Qwen2.5-14B-Instruct-1M-GRPO-Reasoning-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning
8322cabcc11053dca1f6fac6f3ffac4781ec9641
41.559027
4
14.77
false
false
false
true
3.391752
0.841356
84.135649
0.619822
45.658281
0.530211
53.021148
0.343121
12.416107
0.418
11.35
0.484957
42.772976
false
false
2025-02-01
2025-02-05
0
notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning
nothingiisreal_L3.1-8B-Celeste-V1.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/L3.1-8B-Celeste-V1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/L3.1-8B-Celeste-V1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__L3.1-8B-Celeste-V1.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/L3.1-8B-Celeste-V1.5
e7ea0e3d2727c8cf66c0481ffa251f28cb85429f
26.172146
llama3.1
39
8.03
true
false
false
true
1.414328
0.732672
73.267153
0.50118
28.887967
0.146526
14.652568
0.284396
4.58613
0.374865
5.591406
0.370429
30.047651
false
false
2024-07-27
2024-12-04
0
nothingiisreal/L3.1-8B-Celeste-V1.5
nothingiisreal_MN-12B-Starcannon-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/MN-12B-Starcannon-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__MN-12B-Starcannon-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/MN-12B-Starcannon-v2
f2ff756e8c32d9107d4f6a3c18c730e3fe0cae88
18.18145
apache-2.0
7
12.248
true
false
false
true
3.445326
0.392527
39.252738
0.50045
28.424783
0.059668
5.966767
0.278523
3.803132
0.397812
7.993229
0.312832
23.64805
true
false
2024-08-13
2024-09-03
1
nothingiisreal/MN-12B-Starcannon-v2 (Merge)
nothingiisreal_MN-12B-Starcannon-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/MN-12B-Starcannon-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__MN-12B-Starcannon-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/MN-12B-Starcannon-v3
169480b62121c4f070e93a05158545c679712644
19.144471
13
12.248
false
false
false
true
3.491342
0.380738
38.073755
0.517055
30.873002
0.077795
7.779456
0.27349
3.131991
0.404635
9.846094
0.326463
25.16253
false
false
2024-08-13
2024-09-03
1
nothingiisreal/MN-12B-Starcannon-v3 (Merge)
nvidia_AceInstruct-1.5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceInstruct-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceInstruct-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceInstruct-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceInstruct-1.5B
1e3d02075fcf988407b436eb5c10a407be86c71f
18.115865
cc-by-nc-4.0
18
1.777
true
false
false
true
1.706408
0.394776
39.477586
0.393196
15.468561
0.312689
31.268882
0.271812
2.908277
0.346
2.083333
0.257397
17.488549
false
true
2025-01-15
2025-01-24
0
nvidia/AceInstruct-1.5B
nvidia_AceInstruct-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceInstruct-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceInstruct-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceInstruct-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceInstruct-72B
3c5c88ea8ab5d5067e23a482cc26014a0d23e848
40.405022
cc-by-nc-4.0
14
72.706
true
false
false
true
80.786695
0.711889
71.18889
0.613904
44.20382
0.626133
62.613293
0.321309
9.50783
0.420604
11.875521
0.487367
43.04078
false
true
2025-01-15
2025-01-24
0
nvidia/AceInstruct-72B
nvidia_AceInstruct-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceInstruct-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceInstruct-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceInstruct-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceInstruct-7B
3bbb14f63afd2dc890c7932bfffb4f6dc3bfa1e8
33.056543
cc-by-nc-4.0
19
7.616
true
false
false
true
1.850145
0.542229
54.222906
0.550118
36.574814
0.529456
52.945619
0.307047
7.606264
0.4255
11.6875
0.417719
35.302157
false
true
2025-01-15
2025-01-24
0
nvidia/AceInstruct-7B
nvidia_AceMath-1.5B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-1.5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-1.5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-1.5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-1.5B-Instruct
166818c371eaafb212b243aecadd50b1079fa776
20.18986
cc-by-nc-4.0
7
1.777
true
false
false
true
1.663243
0.321237
32.123654
0.40243
16.76251
0.528701
52.870091
0.274329
3.243848
0.360698
4.320573
0.206366
11.818484
false
true
2025-01-13
2025-01-24
0
nvidia/AceMath-1.5B-Instruct
nvidia_AceMath-72B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-72B-Instruct
9bab369176cddd6cbc38b2002ffbef9a3152aade
36.655604
cc-by-nc-4.0
14
72.706
true
false
false
true
88.011236
0.494993
49.499328
0.640216
48.687772
0.714502
71.450151
0.270973
2.796421
0.406156
9.602865
0.441074
37.897089
false
true
2025-01-14
2025-01-24
0
nvidia/AceMath-72B-Instruct
nvidia_AceMath-72B-RM_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForSequenceClassification
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-72B-RM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-72B-RM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-72B-RM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-72B-RM
bb8cb2e7bd45c1d74894d87a95249d5dd5c19bf4
3.428827
cc-by-nc-4.0
8
71.461
true
false
false
true
150.000724
0.14126
14.125964
0.271743
1.403502
0
0
0.23406
0
0.335146
3.059896
0.117852
1.983599
false
true
2025-01-14
2025-01-24
0
nvidia/AceMath-72B-RM
nvidia_AceMath-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-7B-Instruct
f29b4bd5ad5e4fc7bfb52343dca2dd07e948f964
30.32749
cc-by-nc-4.0
21
7.616
true
false
false
true
1.977308
0.453178
45.317757
0.499385
29.993826
0.633686
63.36858
0.291946
5.592841
0.419271
11.208854
0.338348
26.483082
false
true
2025-01-13
2025-01-24
0
nvidia/AceMath-7B-Instruct
nvidia_AceMath-7B-RM_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForSequenceClassification
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-7B-RM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-7B-RM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-7B-RM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-7B-RM
2a7b81019f94d1a78eec298f7cf5c677ff958f5a
3.22439
cc-by-nc-4.0
6
7.071
true
false
false
true
1.354279
0.149378
14.937809
0.242269
0.251527
0
0
0.245805
0
0.358
2.616667
0.113863
1.540337
false
true
2025-01-14
2025-01-24
0
nvidia/AceMath-7B-RM
nvidia_Hymba-1.5B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
HymbaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Hymba-1.5B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Hymba-1.5B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Hymba-1.5B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Hymba-1.5B-Base
85e5b833d75f26170c7684ba83140f1bf9fedf37
8.035282
other
139
1.523
true
false
false
false
18.215829
0.229512
22.951214
0.325648
7.689941
0.013595
1.359517
0.255872
0.782998
0.356635
5.179427
0.192237
10.248596
false
true
2024-10-09
2024-12-06
0
nvidia/Hymba-1.5B-Base