Dataset schema (column name, dtype, and value range or number of distinct classes):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | −1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
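Each benchmark appears as a Raw / scaled pair, and the pairs in the rows below are consistent with the leaderboard's baseline normalization: the random-guess baseline maps to 0 and a perfect score to 100. This is a minimal sketch; the baselines here are inferred from the data itself (0.25 for 4-way GPQA, 0.10 for 10-way MMLU-PRO, 0 for IFEval and MATH Lvl 5), and BBH and MUSR combine per-subtask baselines, so no single constant reproduces those two columns.

```python
def normalize(raw: float, baseline: float) -> float:
    """Map a raw accuracy in [baseline, 1] onto a 0-100 scale,
    clamping below-baseline scores to 0."""
    return 100.0 * max(raw - baseline, 0.0) / (1.0 - baseline)

# Values from the Norquinal/Foxtrot row below:
print(round(normalize(0.286913, 0.25), 4))  # GPQA raw -> scaled 4.9217
print(round(normalize(0.30502, 0.10), 2))   # MMLU-PRO raw -> scaled 22.78
print(round(normalize(0.058157, 0.0), 4))   # MATH Lvl 5 raw -> scaled 5.8157
```

The same function reproduces the GPQA and MMLU-PRO columns for the other rows as well, up to rounding of the stored raw values.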
All rows have Weight type = Original; eval_name (omitted below) is the model name with "/" replaced by "_" plus a precision suffix. Type legend: 💬 chat models (RLHF, DPO, IFT, ...) · 🔶 fine-tuned on domain-specific datasets · 🟩 continuously pretrained · 🤝 base merges and moerges · 🟢 pretrained. Each Model cell originally linked to `https://huggingface.co/<Model>` and to its per-model details dataset `open-llm-leaderboard/<org>__<name>-details`. Empty cells were empty in the source; the final row is truncated in the source.

| Model | Prec. | T | Architecture | Model sha | Avg ⬆️ | License | ❤️ | #Params (B) | On hub | MoE | Flagged | Chat Tmpl | CO₂ (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official | Uploaded | Submitted | Gen | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Norquinal/Foxtrot | float16 | 💬 | LlamaForCausalLM | b102325f40a526f4a020c1e12322a1e7aeebb988 | 13.353624 | | 0 | 8.03 | false | false | false | true | 1.273005 | 0.301153 | 30.115316 | 0.355803 | 10.170273 | 0.058157 | 5.81571 | 0.286913 | 4.9217 | 0.380417 | 6.31875 | 0.30502 | 22.779994 | false | false | 2024-11-30 | | 0 | Removed |
| Norquinal/Golf | float16 | 💬 | LlamaForCausalLM | 5b1a3ad20fd4f7915fcc8eaf5e04d4c5f996e70b | 13.926818 | | 0 | 8.03 | false | false | false | true | 1.373042 | 0.35336 | 35.33602 | 0.353326 | 9.898589 | 0.053625 | 5.362538 | 0.290268 | 5.369128 | 0.338 | 4.75 | 0.305602 | 22.844637 | false | false | 2024-11-30 | | 0 | Removed |
| Norquinal/Hotel | float16 | 💬 | LlamaForCausalLM | e359329defd5ebbd70c4df759f231d7f9a87364a | 13.283272 | | 0 | 8.03 | false | false | false | true | 1.408775 | 0.321511 | 32.151137 | 0.367857 | 11.514937 | 0.05287 | 5.287009 | 0.279362 | 3.914989 | 0.328823 | 2.869531 | 0.315658 | 23.962027 | false | false | 2024-11-30 | | 0 | Removed |
| NotASI/FineTome-Llama3.2-1B-0929 | float16 | 🔶 | LlamaForCausalLM | 61c8742238d0cfe68a0a3f61326b84cd6624ad02 | 9.953181 | llama3.2 | 1 | 1.236 | true | false | false | true | 0.930817 | 0.399072 | 39.907224 | 0.324627 | 5.741405 | 0.036254 | 3.625378 | 0.272651 | 3.020134 | 0.34876 | 2.661719 | 0.142869 | 4.763224 | false | false | 2024-09-29 | 2024-10-04 | 2 | meta-llama/Llama-3.2-1B-Instruct |
| NotASI/FineTome-Llama3.2-3B-1002 | float16 | 🔶 | LlamaForCausalLM | 7c8497a24a381e3bfd77bc92e5685442768790d0 | 16.7624 | llama3.2 | 1 | 3 | true | false | false | true | 2.130901 | 0.54745 | 54.744966 | 0.431947 | 19.520061 | 0.062689 | 6.268882 | 0.250839 | 0.111857 | 0.36851 | 3.963802 | 0.243684 | 15.964835 | false | false | 2024-10-04 | 2024-10-05 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| NotASI/FineTome-v1.5-Llama3.2-1B-1007 | float16 | 🔶 | LlamaForCausalLM | 5e329d987e9f74dd2703a4fefa56ab8c72b5702b | 9.24257 | llama3.2 | 1 | 1.236 | true | false | false | true | 0.948463 | 0.392378 | 39.237778 | 0.324057 | 5.801725 | 0.031722 | 3.172205 | 0.25 | 0 | 0.347458 | 2.498958 | 0.142703 | 4.744755 | true | false | 2024-10-07 | 2024-10-07 | 1 | NotASI/FineTome-v1.5-Llama3.2-1B-1007 (Merge) |
| NotASI/FineTome-v1.5-Llama3.2-3B-1007 | float16 | 🔶 | LlamaForCausalLM | 6c6e71fbcff6c00d04a3fd69084af20bf2a943c8 | 17.113696 | llama3.2 | 1 | 3.213 | true | false | false | true | 1.450757 | 0.550772 | 55.077195 | 0.431237 | 19.457219 | 0.064199 | 6.41994 | 0.261745 | 1.565996 | 0.364542 | 4.067708 | 0.244847 | 16.094119 | true | false | 2024-10-07 | 2024-10-07 | 1 | NotASI/FineTome-v1.5-Llama3.2-3B-1007 (Merge) |
| NousResearch/DeepHermes-3-Mistral-24B-Preview | bfloat16 | 🔶 | MistralForCausalLM | 48072dc6c0594a3198eb862c13613c4ab1119009 | 31.987014 | apache-2.0 | 76 | 23.572 | true | false | false | true | 1.459075 | 0.453578 | 45.357762 | 0.64882 | 48.963404 | 0.257553 | 25.755287 | 0.369966 | 15.995526 | 0.450333 | 15.958333 | 0.459026 | 39.89177 | false | true | 2025-03-02 | 2025-03-13 | 1 | mistralai/Mistral-Small-24B-Base-2501 |
| NousResearch/Hermes-2-Pro-Llama-3-8B | float16 | 💬 | LlamaForCausalLM | bc265d1781299ed2045214289c927c207439a729 | 22.069976 | llama3 | 419 | 8.031 | true | false | false | true | 1.499966 | 0.536184 | 53.618399 | 0.507113 | 30.667993 | 0.083837 | 8.383686 | 0.292785 | 5.704698 | 0.42624 | 11.246615 | 0.305186 | 22.798463 | false | true | 2024-04-30 | 2024-06-13 | 1 | NousResearch/Meta-Llama-3-8B |
| NousResearch/Hermes-2-Pro-Mistral-7B | bfloat16 | 💬 | MistralForCausalLM | 09317b1d8da639b5d9af77c06aa17cde0f0f91c0 | 21.840577 | apache-2.0 | 491 | 7.242 | true | false | false | true | 0.945595 | 0.566834 | 56.683378 | 0.499544 | 29.427579 | 0.060423 | 6.042296 | 0.27349 | 3.131991 | 0.437594 | 14.132552 | 0.294631 | 21.625665 | false | true | 2024-03-11 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
| NousResearch/Hermes-2-Theta-Llama-3-8B | bfloat16 | 🔶 | LlamaForCausalLM | 885173e97ab8572b444f7db1290d5d0386e26816 | 24.788376 | apache-2.0 | 201 | 8.03 | true | false | false | true | 1.487845 | 0.651788 | 65.178837 | 0.520667 | 32.046074 | 0.096677 | 9.667674 | 0.303691 | 7.158837 | 0.394896 | 8.361979 | 0.336852 | 26.316859 | false | true | 2024-05-05 | 2024-07-11 | 2 | NousResearch/Meta-Llama-3-8B |
| NousResearch/Hermes-3-Llama-3.1-70B | bfloat16 | 💬 | LlamaForCausalLM | 093242c69a91f8d9d5b8094c380b88772f9bd7f8 | 38.514771 | llama3 | 107 | 70.554 | true | false | false | true | 22.415782 | 0.766144 | 76.614383 | 0.675578 | 53.765409 | 0.20997 | 20.996979 | 0.361577 | 14.876957 | 0.494896 | 23.428646 | 0.472656 | 41.40625 | false | true | 2024-07-29 | 2024-08-28 | 1 | meta-llama/Meta-Llama-3.1-70B |
| NousResearch/Hermes-3-Llama-3.1-8B | bfloat16 | 💬 | LlamaForCausalLM | aabb745a717e133b74dcae23195d2635cf5f38cc | 23.490877 | llama3 | 303 | 8.03 | true | false | false | true | 0.905808 | 0.617017 | 61.701729 | 0.517745 | 30.724097 | 0.047583 | 4.758308 | 0.297819 | 6.375839 | 0.436938 | 13.617187 | 0.313913 | 23.7681 | false | true | 2024-07-28 | 2024-08-28 | 1 | meta-llama/Meta-Llama-3.1-8B |
| NousResearch/Hermes-3-Llama-3.2-3B | bfloat16 | 💬 | LlamaForCausalLM | f6a109fe836b13b6905f8c16a7388f2f557c3974 | 15.242119 | llama3 | 148 | 3.213 | true | false | false | true | 1.652384 | 0.382486 | 38.248625 | 0.435199 | 20.187188 | 0.039275 | 3.927492 | 0.275168 | 3.355705 | 0.403021 | 8.577604 | 0.254405 | 17.156102 | false | true | 2024-12-03 | 2024-12-20 | 1 | Removed |
| NousResearch/Nous-Hermes-2-Mistral-7B-DPO | bfloat16 | 💬 | MistralForCausalLM | ebec0a691037d38955727d6949798429a63929dd | 21.100587 | apache-2.0 | 175 | 7.242 | true | false | false | true | 0.949198 | 0.576251 | 57.625101 | 0.485265 | 27.792546 | 0.047583 | 4.758308 | 0.292785 | 5.704698 | 0.399979 | 8.330729 | 0.301529 | 22.392139 | false | true | 2024-02-18 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
| NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO | bfloat16 | 💬 | MixtralForCausalLM | 286ae6737d048ad1d965c2e830864df02db50f2f | 27.35319 | apache-2.0 | 429 | 46.703 | true | true | false | true | 15.653111 | 0.58969 | 58.96898 | 0.553885 | 37.107784 | 0.122356 | 12.23565 | 0.321309 | 9.50783 | 0.459542 | 16.676042 | 0.366606 | 29.622858 | false | true | 2024-01-11 | 2024-07-27 | 1 | mistralai/Mixtral-8x7B-v0.1 |
| NousResearch/Nous-Hermes-2-Mixtral-8x7B-SFT | bfloat16 | 💬 | MixtralForCausalLM | 4c06af2684730f75a6874b95e8bf6058105d9612 | 21.841011 | apache-2.0 | 56 | 46.703 | true | true | false | true | 20.77588 | 0.573078 | 57.307832 | 0.505787 | 30.594313 | 0.021148 | 2.114804 | 0.302013 | 6.935123 | 0.421375 | 11.138542 | 0.306599 | 22.955452 | false | true | 2023-12-26 | 2024-06-12 | 1 | mistralai/Mixtral-8x7B-v0.1 |
| NousResearch/Nous-Hermes-2-SOLAR-10.7B | bfloat16 | 💬 | LlamaForCausalLM | 14c1fbe2f71acdcd58247b30d5439bd572d52386 | 23.412543 | apache-2.0 | 205 | 10.732 | true | false | false | true | 1.286888 | 0.527866 | 52.786606 | 0.541429 | 34.990895 | 0.057402 | 5.740181 | 0.293624 | 5.816555 | 0.437281 | 13.826823 | 0.345828 | 27.314199 | false | true | 2024-01-01 | 2024-06-12 | 1 | upstage/SOLAR-10.7B-v1.0 |
| NousResearch/Nous-Hermes-llama-2-7b | bfloat16 | 🔶 | LlamaForCausalLM | b7c3ec54b754175e006ef75696a2ba3802697078 | 9.316716 | mit | 70 | 6.738 | true | false | false | false | 5.116114 | 0.172908 | 17.290788 | 0.382394 | 13.78942 | 0.009063 | 0.906344 | 0.263423 | 1.789709 | 0.425719 | 11.68151 | 0.193983 | 10.442524 | false | true | 2023-07-25 | 2024-06-12 | 0 | NousResearch/Nous-Hermes-llama-2-7b |
| NousResearch/Yarn-Llama-2-13b-128k | bfloat16 | 🟩 | LlamaForCausalLM | 4e3e87a067f64f8814c83dd5e3bad92dcf8a2391 | 8.494147 | | 112 | 13 | true | false | false | false | 103.871567 | 0.165464 | 16.54643 | 0.382682 | 13.505319 | 0.017372 | 1.73716 | 0.258389 | 1.118568 | 0.34575 | 3.385417 | 0.232048 | 14.671986 | false | true | 2023-08-30 | 2024-06-13 | 0 | NousResearch/Yarn-Llama-2-13b-128k |
| NousResearch/Yarn-Llama-2-7b-128k | bfloat16 | 🟩 | LlamaForCausalLM | e1ceedbbf2ed28b88086794441a6c05606d15437 | 6.814801 | | 39 | 7 | true | false | false | false | 1.679478 | 0.148478 | 14.847826 | 0.324803 | 6.144692 | 0.015106 | 1.510574 | 0.260067 | 1.342282 | 0.396698 | 8.253906 | 0.179106 | 8.789524 | false | true | 2023-08-31 | 2024-06-13 | 0 | NousResearch/Yarn-Llama-2-7b-128k |
| NousResearch/Yarn-Llama-2-7b-64k | bfloat16 | 🟩 | LlamaForCausalLM | 08491431ac3b50add7443f5d4c02850801d877be | 7.222883 | | 23 | 7 | true | false | false | false | 1.660803 | 0.169986 | 16.998564 | 0.332628 | 7.044055 | 0.015861 | 1.586103 | 0.264262 | 1.901566 | 0.393875 | 6.934375 | 0.179854 | 8.872636 | false | true | 2023-08-30 | 2024-06-13 | 0 | NousResearch/Yarn-Llama-2-7b-64k |
| NousResearch/Yarn-Mistral-7b-128k | bfloat16 | 🟩 | MistralForCausalLM | d09f1f8ed437d61c1aff94c1beabee554843dcdd | 13.268755 | apache-2.0 | 573 | 7 | true | false | false | false | 1.100522 | 0.193367 | 19.336693 | 0.431447 | 20.633112 | 0.031722 | 3.172205 | 0.298658 | 6.487696 | 0.407052 | 8.948177 | 0.289312 | 21.034648 | false | true | 2023-10-31 | 2024-06-12 | 0 | NousResearch/Yarn-Mistral-7b-128k |
| NousResearch/Yarn-Mistral-7b-64k | bfloat16 | 🟩 | MistralForCausalLM | 0273c624561fcecc8e8f4030492a9307aa60f945 | 13.540458 | apache-2.0 | 51 | 7 | true | false | false | false | 1.08232 | 0.207955 | 20.795489 | 0.429319 | 20.2302 | 0.037009 | 3.700906 | 0.290268 | 5.369128 | 0.412385 | 9.88151 | 0.29139 | 21.265514 | false | true | 2023-10-31 | 2024-06-12 | 0 | NousResearch/Yarn-Mistral-7b-64k |
| NousResearch/Yarn-Solar-10b-32k | bfloat16 | 🟩 | LlamaForCausalLM | ec3158b5276ac6644ddbdb36ccf6f9a106c98ede | 15.721261 | apache-2.0 | 10 | 10 | true | false | false | false | 2.030196 | 0.194216 | 19.421579 | 0.498686 | 28.994824 | 0.030211 | 3.021148 | 0.302852 | 7.04698 | 0.414646 | 10.597396 | 0.327211 | 25.245641 | false | true | 2024-01-17 | 2024-06-12 | 0 | NousResearch/Yarn-Solar-10b-32k |
| NousResearch/Yarn-Solar-10b-64k | bfloat16 | 🟩 | LlamaForCausalLM | 703818628a5e8ef637e48e8dbeb3662aa0497aff | 15.16205 | apache-2.0 | 15 | 10 | true | false | false | false | 1.527506 | 0.198887 | 19.888673 | 0.492199 | 28.395714 | 0.028701 | 2.870091 | 0.302013 | 6.935123 | 0.401438 | 9.013021 | 0.314827 | 23.869681 | false | true | 2024-01-17 | 2024-06-12 | 0 | NousResearch/Yarn-Solar-10b-64k |
| Novaciano/ASTAROTH-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | fe31207a8150fed8d9c68cf21ab7f0d62efb4b01 | 14.173861 | | 0 | 1.498 | false | false | false | true | 0.348855 | 0.561288 | 56.128849 | 0.354296 | 9.493518 | 0.073263 | 7.326284 | 0.255872 | 0.782998 | 0.314219 | 1.210677 | 0.190908 | 10.100842 | false | false | 2025-03-09 | 2025-03-10 | 1 | Novaciano/ASTAROTH-3.2-1B (Merge) |
| Novaciano/BLAST_PROCESSING-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | 2d1b240529812f2fff4d9a42b845b9c4031a1624 | 11.950699 | | 0 | 1.498 | false | false | false | false | 0.352838 | 0.392178 | 39.217831 | 0.346032 | 9.362854 | 0.074773 | 7.477341 | 0.26594 | 2.12528 | 0.335146 | 3.059896 | 0.194149 | 10.460993 | false | false | 2025-03-06 | 2025-03-09 | 1 | Novaciano/BLAST_PROCESSING-3.2-1B (Merge) |
| Novaciano/Cerberus-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | 45223673222648d0247058e8839826bf0e7596fd | 13.731435 | | 0 | 1.236 | false | false | false | true | 0.355357 | 0.501688 | 50.168774 | 0.416494 | 16.974159 | 0.058157 | 5.81571 | 0.258389 | 1.118568 | 0.328885 | 0.94401 | 0.166307 | 7.367391 | false | false | 2025-03-09 | 2025-03-10 | 1 | Novaciano/Cerberus-3.2-1B (Merge) |
| Novaciano/Cultist-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | 59c2c68c6a47d529be97641e451e31ba90b1bc31 | 12.939674 | | 0 | 1.498 | false | false | false | true | 0.737058 | 0.52949 | 52.948953 | 0.339931 | 7.52004 | 0.058912 | 5.891239 | 0.260906 | 1.454139 | 0.33301 | 1.892969 | 0.171376 | 7.930703 | false | false | 2025-03-09 | 2025-03-10 | 1 | Novaciano/Cultist-3.2-1B (Merge) |
| Novaciano/FuseChat-3.2-1B-GRPO_Creative_RP | bfloat16 | 🤝 | LlamaForCausalLM | be2ceb27b58c77ec4578ffc9de784a16be7a6e4c | 13.847576 | | 0 | 1.236 | false | false | false | true | 0.344135 | 0.559815 | 55.981463 | 0.348782 | 8.942707 | 0.08006 | 8.006042 | 0.255872 | 0.782998 | 0.332885 | 1.210677 | 0.173454 | 8.161569 | false | false | 2025-03-03 | 2025-03-10 | 1 | Novaciano/FuseChat-3.2-1B-GRPO_Creative_RP (Merge) |
| Novaciano/Fusetrix-3.2-1B-GRPO_RP_Creative | bfloat16 | 🤝 | LlamaForCausalLM | 0a68a513f48c7d34467e4309108eabb676d48498 | 13.986209 | | 0 | 1.236 | false | false | false | true | 0.331542 | 0.536634 | 53.663391 | 0.34346 | 8.77211 | 0.114804 | 11.480363 | 0.25 | 0 | 0.320917 | 1.58125 | 0.175781 | 8.420139 | false | false | 2025-03-03 | 2025-03-10 | 1 | Novaciano/Fusetrix-3.2-1B-GRPO_RP_Creative (Merge) |
| Novaciano/Fusetrix-Dolphin-3.2-1B-GRPO_Creative_RP | bfloat16 | 🤝 | LlamaForCausalLM | 925f3903592424e6570aab3ea820fdddd8c0f553 | 14.279019 | | 0 | 1.236 | false | false | false | true | 0.325822 | 0.534286 | 53.42857 | 0.350239 | 9.680799 | 0.104985 | 10.498489 | 0.268456 | 2.46085 | 0.318313 | 0.455729 | 0.182347 | 9.149675 | false | false | 2025-03-03 | 2025-03-10 | 1 | Novaciano/Fusetrix-Dolphin-3.2-1B-GRPO_Creative_RP (Merge) |
| Novaciano/HarmfulProject-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | 30791fe7304bfdb84995baf252e141a9d81fc496 | 10.68688 | | 1 | 1.498 | false | false | false | false | 0.367758 | 0.387382 | 38.738215 | 0.32745 | 6.512805 | 0.047583 | 4.758308 | 0.266779 | 2.237136 | 0.341875 | 2.734375 | 0.182264 | 9.14044 | false | false | 2025-03-08 | 2025-03-09 | 1 | Novaciano/HarmfulProject-3.2-1B (Merge) |
| Novaciano/LEWD-Mental-Cultist-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | 5ff3d4ebc04b8e9ea8f739ba2ce744ed7077c16f | 12.977294 | | 0 | 1.498 | false | false | false | true | 0.36366 | 0.530864 | 53.086366 | 0.351272 | 8.636854 | 0.05287 | 5.287009 | 0.256711 | 0.894855 | 0.322281 | 1.41849 | 0.176862 | 8.540189 | false | false | 2025-03-09 | 2025-03-10 | 1 | Novaciano/LEWD-Mental-Cultist-3.2-1B (Merge) |
| Novaciano/La_Mejor_Mezcla-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | ee2ad8591fd4b90a7d995decdfd097f7ed2e2a06 | 14.056697 | apache-2.0 | 0 | 1.498 | true | false | false | true | 0.32456 | 0.550997 | 55.099691 | 0.348794 | 9.413059 | 0.089879 | 8.987915 | 0.25755 | 1.006711 | 0.319615 | 0.61849 | 0.182929 | 9.214317 | true | false | 2025-03-03 | 2025-03-10 | 1 | Novaciano/La_Mejor_Mezcla-3.2-1B (Merge) |
| Novaciano/Sigil-Of-Satan-3.2-1B | bfloat16 | 🤝 | LlamaForCausalLM | 021456428c58707f505db03ee05777b2edbb8652 | 13.69228 | | 0 | 1.498 | false | false | false | true | 0.3714 | 0.549423 | 54.942331 | 0.354586 | 9.400066 | 0.054381 | 5.438066 | 0.260906 | 1.454139 | 0.327615 | 1.41849 | 0.185505 | 9.500591 | false | false | 2025-03-08 | 2025-03-10 | 1 | Novaciano/Sigil-Of-Satan-3.2-1B (Merge) |
| NucleusAI/nucleus-22B-token-500B | bfloat16 | 🟢 | LlamaForCausalLM | 49bb1a47c0d32b4bfa6630a4eff04a857adcd4ca | 1.633416 | mit | 25 | 21.828 | true | false | false | false | 1.189635 | 0.025654 | 2.565415 | 0.29198 | 1.887999 | 0 | 0 | 0.25 | 0 | 0.351052 | 3.548177 | 0.11619 | 1.798907 | false | false | 2023-10-06 | 2024-06-26 | 0 | NucleusAI/nucleus-22B-token-500B |
| NyxKrage/Microsoft_Phi-4 | bfloat16 | 💬 | Phi3ForCausalLM | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
NyxKrage/Microsoft_Phi-4
d6e415636fc3435ec1cf543db77cf228b6ce6bdd
30.068601
other
56
14.66
true
false
false
false
1.788963
0.058527
5.852693
0.669056
52.427848
0.299094
29.909366
0.40604
20.805369
0.503354
23.785938
0.528674
47.630393
false
false
2024-12-13
2024-12-21
0
NyxKrage/Microsoft_Phi-4
OEvortex_Emotional-llama-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OEvortex/Emotional-llama-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OEvortex/Emotional-llama-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__Emotional-llama-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OEvortex/Emotional-llama-8B
7d16f2e5354dd8f62ce46e47580bfafbc9d4eabd
17.789126
llama3
8
8.03
true
false
false
false
1.363628
0.351637
35.163699
0.483857
26.454054
0.081571
8.1571
0.294463
5.928412
0.365875
2.867708
0.353474
28.163785
false
false
2024-05-05
2024-12-29
0
OEvortex/Emotional-llama-8B
OEvortex_HelpingAI-15B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OEvortex/HelpingAI-15B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OEvortex/HelpingAI-15B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI-15B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OEvortex/HelpingAI-15B
fcc5d4eeee08c07680a2560a302de3eaa5d6f550
4.515496
other
13
15.323
true
false
false
true
2.454475
0.203009
20.300913
0.293601
1.815381
0
0
0.25755
1.006711
0.361875
2.734375
0.11112
1.235594
false
false
2024-07-11
2024-07-13
0
OEvortex/HelpingAI-15B
OEvortex_HelpingAI-3B-reloaded_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OEvortex/HelpingAI-3B-reloaded" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OEvortex/HelpingAI-3B-reloaded</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI-3B-reloaded-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OEvortex/HelpingAI-3B-reloaded
aaee653fea06ba322e7a9ed15530db605cc3b382
14.768421
other
2
2.81
true
false
false
true
1.123253
0.464668
46.466819
0.412851
16.98574
0.013595
1.359517
0.263423
1.789709
0.352448
4.289323
0.259475
17.719415
false
false
2024-10-31
2024-10-31
0
OEvortex/HelpingAI-3B-reloaded
OEvortex_HelpingAI2-9B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OEvortex/HelpingAI2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OEvortex/HelpingAI2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OEvortex/HelpingAI2-9B
b45a18cf41d0d438d71d79687e098ec60dd0aec1
17.606927
other
23
8.903
true
false
false
true
2.081303
0.441312
44.131238
0.484462
27.073242
0.058912
5.891239
0.258389
1.118568
0.371083
6.31875
0.289977
21.108525
false
false
2024-08-16
2024-10-11
0
OEvortex/HelpingAI2-9B
OEvortex_HelpingAI2.5-10B_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OEvortex/HelpingAI2.5-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OEvortex/HelpingAI2.5-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI2.5-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OEvortex/HelpingAI2.5-10B
25ac750b886c7e42521c769e6c2cd2b1143cfbcc
13.711774
other
4
10.211
true
false
false
true
1.878787
0.327656
32.765617
0.449566
21.135366
0.020393
2.039275
0.269295
2.572707
0.373813
6.259896
0.25748
17.497784
false
false
2024-11-17
2024-11-19
0
OEvortex/HelpingAI2.5-10B
OliveiraJLT_Sagui-7B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OliveiraJLT/Sagui-7B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OliveiraJLT/Sagui-7B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OliveiraJLT__Sagui-7B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OliveiraJLT/Sagui-7B-Instruct-v0.1
e3032ba89a6df12b801ab3be2a29b59068aa048d
8.579407
other
0
6.738
true
false
false
true
1.647092
0.289163
28.916275
0.311068
5.043572
0.015106
1.510574
0.24245
0
0.419052
10.614844
0.148521
5.391179
false
false
2024-07-17
2024-07-18
1
maritaca-ai/sabia-7b
Omkar1102_code-yi_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Omkar1102/code-yi" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Omkar1102/code-yi</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Omkar1102__code-yi-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Omkar1102/code-yi
7e875c1d64029d1f8db6813bd2b715cb5406b745
4.921767
0
2.084
false
false
false
false
0.44252
0.214775
21.477458
0.276006
1.844158
0
0
0.250839
0.111857
0.380229
4.695313
0.112616
1.401817
false
false
2024-11-16
0
Removed
Omkar1102_code-yi_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Omkar1102/code-yi" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Omkar1102/code-yi</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Omkar1102__code-yi-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Omkar1102/code-yi
7e875c1d64029d1f8db6813bd2b715cb5406b745
5.1703
0
2.084
false
false
false
false
1.281359
0.225441
22.544072
0.275003
1.581399
0
0
0.25755
1.006711
0.376198
4.52474
0.112284
1.364879
false
false
2024-11-16
0
Removed
OmnicromsBrain_NeuralStar_FusionWriter_4x7b_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/OmnicromsBrain/NeuralStar_FusionWriter_4x7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OmnicromsBrain/NeuralStar_FusionWriter_4x7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OmnicromsBrain__NeuralStar_FusionWriter_4x7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OmnicromsBrain/NeuralStar_FusionWriter_4x7b
fbe296d2c76acbb792cdd22e14d1c8bb13723839
20.071645
apache-2.0
7
24.154
true
true
false
true
2.75293
0.596384
59.638426
0.477624
26.03844
0.049094
4.909366
0.278523
3.803132
0.401875
8.201042
0.260555
17.839465
true
false
2024-06-07
2024-07-01
1
OmnicromsBrain/NeuralStar_FusionWriter_4x7b (Merge)
OnlyCheeini_greesychat-turbo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OnlyCheeini/greesychat-turbo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OnlyCheeini/greesychat-turbo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OnlyCheeini__greesychat-turbo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OnlyCheeini/greesychat-turbo
6c050859a63f8a677c52aef226fd64705fdf2fa9
1.802069
apache-2.0
0
8.03
true
false
false
true
0.510857
0.023256
2.325607
0.309213
4.01837
0
0
0.260067
1.342282
0.331427
1.595052
0.11378
1.531102
false
false
2024-08-26
2024-12-18
2
Removed
Open-Orca_Mistral-7B-OpenOrca_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Open-Orca/Mistral-7B-OpenOrca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Open-Orca__Mistral-7B-OpenOrca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Open-Orca/Mistral-7B-OpenOrca
4a37328cef00f524d3791b1c0cc559a3cc6af14d
17.721651
apache-2.0
681
7
true
false
false
true
1.06716
0.497766
49.776593
0.476817
25.840025
0.035498
3.549849
0.271812
2.908277
0.385781
5.889323
0.265293
18.365839
false
true
2023-09-29
2024-06-12
0
Open-Orca/Mistral-7B-OpenOrca
OpenAssistant_oasst-sft-1-pythia-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/OpenAssistant/oasst-sft-1-pythia-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenAssistant/oasst-sft-1-pythia-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenAssistant__oasst-sft-1-pythia-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenAssistant/oasst-sft-1-pythia-12b
293df535fe7711a5726987fc2f17dfc87de452a1
3.68183
apache-2.0
278
12
true
false
false
false
1.776114
0.105539
10.553886
0.314663
4.778509
0.015106
1.510574
0.25755
1.006711
0.332698
2.98724
0.111287
1.254063
false
true
2023-03-09
2024-06-12
0
OpenAssistant/oasst-sft-1-pythia-12b
OpenBuddy_openbuddy-falcon3-10b-v24.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-falcon3-10b-v24.2-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-falcon3-10b-v24.2-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-falcon3-10b-v24.2-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-falcon3-10b-v24.2-131k
6fd3ac4042bd7d05336182f24b3b3380a064756e
27.273413
0
10.34
false
false
false
true
1.640639
0.508632
50.863154
0.600373
42.726361
0.212991
21.299094
0.299497
6.599553
0.418646
10.664063
0.383394
31.488254
false
false
2025-01-01
2025-01-02
1
tiiuae/Falcon3-10B-Base
OpenBuddy_openbuddy-llama3-70b-v21.2-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3-70b-v21.2-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3-70b-v21.2-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3-70b-v21.2-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3-70b-v21.2-32k
e79a2f16c052fc76eeafb5b51d16261b2b981d0f
35.553458
other
2
70.554
true
false
false
true
26.071872
0.701048
70.104766
0.650744
49.969366
0.203172
20.317221
0.342282
12.304251
0.457969
18.046094
0.483211
42.579048
false
false
2024-06-12
2024-09-05
0
OpenBuddy/openbuddy-llama3-70b-v21.2-32k
OpenBuddy_openbuddy-llama3-8b-v21.1-8k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3-8b-v21.1-8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3-8b-v21.1-8k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3-8b-v21.1-8k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3-8b-v21.1-8k
658508bce03ccd61cea9657e0357bd4cd10503ba
20.162938
other
30
8.03
true
false
false
true
1.638844
0.556967
55.696663
0.47875
26.115045
0.043051
4.305136
0.270973
2.796421
0.398771
10.346354
0.295462
21.718011
false
false
2024-04-20
2024-08-03
0
OpenBuddy/openbuddy-llama3-8b-v21.1-8k
OpenBuddy_openbuddy-llama3-8b-v21.2-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3-8b-v21.2-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3-8b-v21.2-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3-8b-v21.2-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3-8b-v21.2-32k
f3ea2dec2533a3dd97df32db2376b17875cafda2
22.069479
other
0
8.03
true
false
false
true
1.699779
0.61919
61.919041
0.485622
27.252335
0.07855
7.854985
0.279362
3.914989
0.377875
5.934375
0.32987
25.54115
false
false
2024-06-18
2024-06-26
0
OpenBuddy/openbuddy-llama3-8b-v21.2-32k
OpenBuddy_openbuddy-llama3.1-70b-v22.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.1-70b-v22.1-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k
43ed945180174d79a8f6c68509161c249c884dfa
41.249473
other
1
70.554
true
false
false
true
24.383247
0.733271
73.327105
0.669849
51.940776
0.395015
39.501511
0.375
16.666667
0.462958
18.236458
0.530419
47.82432
false
false
2024-08-21
2024-08-24
0
OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k
OpenBuddy_openbuddy-llama3.1-8b-v22.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.1-8b-v22.2-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k
0d9d85c7a5e4292e07c346147de56bd3991d525c
24.418173
other
2
8.03
true
false
false
true
1.582126
0.665727
66.572694
0.500652
29.057538
0.114804
11.480363
0.279362
3.914989
0.408104
9.813021
0.331034
25.670434
false
false
2024-07-28
2024-07-29
0
OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k
OpenBuddy_openbuddy-llama3.1-8b-v22.3-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.1-8b-v22.3-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k
0097358fa1a450251b7ea1a03a5effdfded6c461
23.317954
other
2
8.03
true
false
false
true
1.668446
0.599707
59.970656
0.506591
30.319511
0.120846
12.084592
0.279362
3.914989
0.401469
8.316927
0.327709
25.301049
false
false
2024-08-16
2024-08-24
0
OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k
OpenBuddy_openbuddy-llama3.2-1b-v23.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.2-1b-v23.1-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k
71b61e0e02e55553902f0051074d2ae965413cdb
9.350034
llama3.2
3
1.498
true
false
false
true
0.892285
0.359005
35.900522
0.326656
6.04362
0.024924
2.492447
0.258389
1.118568
0.334219
1.210677
0.184009
9.334368
false
false
2024-10-07
2024-10-09
0
OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k
OpenBuddy_openbuddy-llama3.2-3b-v23.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.2-3b-v23.2-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k
7cd2baa3d9bb99e970d711fb7afe786753bc25ea
13.797654
llama3.2
0
3.607
true
false
false
true
1.391144
0.431945
43.194502
0.407266
16.588826
0.026435
2.643505
0.276007
3.467562
0.326313
0.455729
0.247922
16.435801
false
false
2024-10-14
2024-10-15
0
OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k
OpenBuddy_openbuddy-llama3.3-70b-v24.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-llama3.3-70b-v24.1-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-llama3.3-70b-v24.1-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.3-70b-v24.1-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-llama3.3-70b-v24.1-131k
b4585368e97d413b400503ca9ee2b8e4a8988614
45.736795
llama3.3
1
70.554
true
false
false
true
67.525547
0.812081
81.208083
0.685804
54.146648
0.441088
44.108761
0.434564
24.608501
0.486927
22.265885
0.532746
48.08289
false
false
2024-12-08
2024-12-08
0
OpenBuddy/openbuddy-llama3.3-70b-v24.1-131k
OpenBuddy_openbuddy-mixtral-7bx8-v18.1-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k
98596b6731058cc9cca85f3b8ac9077342cb60ae
22.329809
apache-2.0
14
46.741
true
true
false
true
9.737751
0.549348
54.934795
0.465618
24.535443
0.108006
10.800604
0.30453
7.270694
0.383052
5.28151
0.380402
31.155807
false
false
2024-02-12
2024-06-26
0
OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k
OpenBuddy_openbuddy-nemotron-70b-v23.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-nemotron-70b-v23.1-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-nemotron-70b-v23.1-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-nemotron-70b-v23.1-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-nemotron-70b-v23.1-131k
d8cb98fb9281a84eb0df8216bae60beaf5181921
39.785051
llama3.1
4
70.554
true
false
false
true
36.442433
0.755528
75.552756
0.674947
53.188049
0.320997
32.099698
0.363255
15.100671
0.45375
16.385417
0.517453
46.383717
false
false
2024-10-20
2024-10-23
3
meta-llama/Meta-Llama-3.1-70B
OpenBuddy_openbuddy-nemotron-70b-v23.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-nemotron-70b-v23.2-131k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-nemotron-70b-v23.2-131k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-nemotron-70b-v23.2-131k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-nemotron-70b-v23.2-131k
7a39fd93b078189c6892344c2f01059320543e2f
39.234122
llama3.1
2
70.554
true
false
false
true
24.789816
0.722655
72.265478
0.670481
52.265662
0.31571
31.570997
0.359899
14.653244
0.469594
18.865885
0.512051
45.783466
false
false
2024-10-24
2024-10-24
3
meta-llama/Meta-Llama-3.1-70B
OpenBuddy_openbuddy-qwen2.5llamaify-14b-v23.1-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwen2.5llamaify-14b-v23.1-200k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k
001e14063e2702a9b2284dc6ec889d2586dc839b
32.528299
apache-2.0
1
14.77
true
false
false
true
2.9135
0.630881
63.088051
0.60132
43.276499
0.253776
25.377644
0.333054
11.073826
0.424042
11.538542
0.467337
40.815233
false
false
2024-09-23
2024-09-23
0
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k
OpenBuddy_openbuddy-qwen2.5llamaify-14b-v23.3-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwen2.5llamaify-14b-v23.3-200k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k
0cef6f7719c1eb3bc1ebba133508c2c6d67e635c
32.297915
apache-2.0
5
14.77
true
false
false
true
2.995898
0.613145
61.314534
0.608086
44.18394
0.231118
23.111782
0.327181
10.290828
0.434583
12.722917
0.479471
42.16349
false
false
2024-10-02
2024-10-11
0
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k
OpenBuddy_openbuddy-qwen2.5llamaify-7b-v23.1-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-qwen2.5llamaify-7b-v23.1-200k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-qwen2.5llamaify-7b-v23.1-200k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwen2.5llamaify-7b-v23.1-200k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-qwen2.5llamaify-7b-v23.1-200k
91521abfec2a00f4853f6cb4dd620177617ca572
27.863255
apache-2.0
3
7.615
true
false
false
true
2.763734
0.567258
56.725821
0.550938
36.398128
0.188822
18.882175
0.314597
8.612975
0.436323
13.807031
0.394781
32.753398
false
false
2024-10-04
2024-10-10
2
Qwen/Qwen2.5-7B
OpenBuddy_openbuddy-qwq-32b-v24.1-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-qwq-32b-v24.1-200k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-qwq-32b-v24.1-200k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwq-32b-v24.1-200k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-qwq-32b-v24.1-200k
1e9dfa19793d79c53999e2f22794d2c310180c7e
39.962325
apache-2.0
3
32.764
true
false
false
true
6.906929
0.593661
59.366148
0.67985
54.469176
0.373867
37.386707
0.380872
17.449664
0.484875
21.209375
0.549036
49.892878
false
false
2024-12-21
2024-12-22
3
Qwen/Qwen2.5-32B
OpenBuddy_openbuddy-qwq-32b-v24.2-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-qwq-32b-v24.2-200k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-qwq-32b-v24.2-200k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwq-32b-v24.2-200k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-qwq-32b-v24.2-200k
ee9f46f325a4b68d3b06d1dcabc2d81f42df6682
39.560112
apache-2.0
3
32.764
true
false
false
true
6.04694
0.596984
59.698378
0.677154
54.163496
0.377644
37.76435
0.376678
16.89038
0.471792
19.440625
0.544631
49.403443
false
false
2024-12-29
2024-12-30
3
Qwen/Qwen2.5-32B
OpenBuddy_openbuddy-yi1.5-34b-v21.3-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-yi1.5-34b-v21.3-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k
966be6ad502cdd50a9af94d5f003aec040cdb0b5
30.924704
apache-2.0
0
34.407
true
false
false
true
6.076653
0.542004
54.20041
0.616257
45.637093
0.178248
17.824773
0.348993
13.199105
0.443948
14.69349
0.45994
39.993351
false
false
2024-06-05
2024-08-30
1
OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k (Merge)
OpenBuddy_openbuddy-zero-14b-v22.3-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-zero-14b-v22.3-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-zero-14b-v22.3-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-zero-14b-v22.3-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-zero-14b-v22.3-32k
d9a0b6bc02f283e154c9ad6db43a5a97eed97f5b
19.406071
other
1
14.022
true
false
false
true
3.377538
0.375292
37.5292
0.485976
26.289507
0.093656
9.365559
0.307047
7.606264
0.416604
11.342188
0.318733
24.303709
false
false
2024-07-16
2024-07-29
0
OpenBuddy/openbuddy-zero-14b-v22.3-32k
OpenBuddy_openbuddy-zero-3b-v21.2-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-zero-3b-v21.2-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-zero-3b-v21.2-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-zero-3b-v21.2-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-zero-3b-v21.2-32k
74e1d168c5e917219d668d1483f6355dd0464a31
11.713302
other
2
4.769
true
false
false
true
1.757238
0.380238
38.023777
0.393479
15.293406
0.018882
1.888218
0.260067
1.342282
0.356635
2.246094
0.203374
11.486037
false
false
2024-06-02
2024-06-26
0
OpenBuddy/openbuddy-zero-3b-v21.2-32k
OpenBuddy_openbuddy-zero-56b-v21.2-32k_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenBuddy/openbuddy-zero-56b-v21.2-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenBuddy/openbuddy-zero-56b-v21.2-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-zero-56b-v21.2-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenBuddy/openbuddy-zero-56b-v21.2-32k
c7a1a4a6e798f75d1d3219ab9ff9f2692e29f7d5
28.53602
other
0
56.707
true
false
false
true
15.147492
0.505709
50.57093
0.612835
44.796542
0.162387
16.238671
0.317953
9.060403
0.430521
12.781771
0.43991
37.767804
false
false
2024-06-10
2024-06-26
0
OpenBuddy/openbuddy-zero-56b-v21.2-32k
OpenGenerativeAI_Bifrost_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenGenerativeAI/Bifrost" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenGenerativeAI/Bifrost</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenGenerativeAI__Bifrost-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenGenerativeAI/Bifrost
7525b3cd69b258aaac8897aa2bff8e9de89f5767
37.147464
apache-2.0
0
14.66
true
false
false
true
1.710812
0.634752
63.475246
0.684927
55.097009
0.254532
25.453172
0.368289
15.771812
0.45976
16.870052
0.515957
46.217494
false
false
2025-03-06
2025-03-07
1
OpenGenerativeAI/Bifrost (Merge)
OpenGenerativeAI_Bifrost-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenGenerativeAI/Bifrost-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenGenerativeAI/Bifrost-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenGenerativeAI__Bifrost-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenGenerativeAI/Bifrost-14B
2f63272826f4a218a00e6a84d1bd1acb023ae613
37.399635
apache-2.0
0
14.66
true
false
false
true
0.863863
0.66153
66.15303
0.68449
55.088066
0.23565
23.564955
0.379195
17.225951
0.462396
17.099479
0.507397
45.266327
false
false
2025-03-07
2025-03-07
1
OpenGenerativeAI/Bifrost-14B (Merge)
OpenLLM-France_Lucie-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenLLM-France/Lucie-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenLLM-France/Lucie-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenLLM-France__Lucie-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenLLM-France/Lucie-7B
39b6dd46c3d39b2b2a61523093d1cf6c8a24730f
8.611732
apache-2.0
19
6.707
true
false
false
false
0.954892
0.249645
24.964539
0.349247
9.913943
0.01435
1.435045
0.272651
3.020134
0.392323
6.807031
0.149767
5.529699
false
false
2024-10-10
2025-01-14
0
OpenLLM-France/Lucie-7B
OpenLLM-France_Lucie-7B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenLLM-France/Lucie-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenLLM-France/Lucie-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenLLM-France__Lucie-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenLLM-France/Lucie-7B-Instruct
4632991a54e8713b234302164ff171d909a121f9
8.364897
0
6.707
false
false
false
true
0.979895
0.279646
27.964578
0.325404
7.261384
0.016616
1.661631
0.279362
3.914989
0.366219
3.210677
0.155585
6.176123
false
false
2025-01-14
0
Removed
OpenLLM-France_Lucie-7B-Instruct-human-data_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenLLM-France/Lucie-7B-Instruct-human-data" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenLLM-France/Lucie-7B-Instruct-human-data</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenLLM-France__Lucie-7B-Instruct-human-data-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenLLM-France/Lucie-7B-Instruct-human-data
4842e3bd24e1037658aff7ca3dbc6b6973bb38f4
8.574867
apache-2.0
6
6.707
true
false
false
true
0.975152
0.294608
29.460831
0.328425
7.629771
0.021903
2.190332
0.275168
3.355705
0.372854
4.040104
0.142952
4.772459
false
false
2025-01-07
2025-01-14
1
OpenLLM-France/Lucie-7B-Instruct-human-data (Merge)
OpenLLM-France_Lucie-7B-Instruct-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenLLM-France/Lucie-7B-Instruct-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenLLM-France/Lucie-7B-Instruct-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenLLM-France__Lucie-7B-Instruct-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenLLM-France/Lucie-7B-Instruct-v1.1
204e1880f28b63db2687d174e297548a10e719cb
10.999353
apache-2.0
8
6.707
true
false
false
false
0.487936
0.303876
30.387594
0.381588
14.739314
0.031722
3.172205
0.281879
4.250559
0.375021
3.844271
0.18642
9.602172
false
false
2025-02-13
2025-02-25
1
OpenLLM-France/Lucie-7B-Instruct-v1.1 (Merge)
OpenLeecher_llama3-8b-lima_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenLeecher/llama3-8b-lima" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenLeecher/llama3-8b-lima</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenLeecher__llama3-8b-lima-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenLeecher/llama3-8b-lima
237a2bcb240eecd9355a091f839e42ba3d31bda5
15.025432
0
8.03
false
false
false
true
1.917859
0.437066
43.706587
0.429583
19.573065
0.050604
5.060423
0.238255
0
0.371271
3.742187
0.262633
18.070331
false
false
2024-10-01
0
Removed
OpenScholar_Llama-3.1_OpenScholar-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/OpenScholar/Llama-3.1_OpenScholar-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenScholar/Llama-3.1_OpenScholar-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenScholar__Llama-3.1_OpenScholar-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenScholar/Llama-3.1_OpenScholar-8B
e26aeb22af568bd8d01ffde86ebbd13c3cf4fcc5
25.961332
apache-2.0
57
8
true
false
false
true
1.265213
0.606401
60.640102
0.520774
32.403921
0.165408
16.540785
0.281879
4.250559
0.42751
11.838802
0.370844
30.093824
false
false
2024-11-15
2024-12-03
1
OpenScholar/Llama-3.1_OpenScholar-8B (Merge)
Orenguteng_Llama-3.1-8B-Lexi-Uncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Orenguteng/Llama-3.1-8B-Lexi-Uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Orenguteng__Llama-3.1-8B-Lexi-Uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Orenguteng/Llama-3.1-8B-Lexi-Uncensored
56ac439ab4c7826871493ffbe2d49f2100a98e97
27.175116
llama3.1
46
8.03
true
false
false
true
1.71347
0.777684
77.768432
0.505726
29.242543
0.1571
15.70997
0.271812
2.908277
0.387115
6.422656
0.378989
30.998818
false
false
2024-07-26
2024-07-29
0
Orenguteng/Llama-3.1-8B-Lexi-Uncensored
Orenguteng_Llama-3.1-8B-Lexi-Uncensored-V2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Orenguteng__Llama-3.1-8B-Lexi-Uncensored-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
2340f8fbcd2452125a798686ca90b882a08fb0d9
28.390881
llama3.1
180
8.03
true
false
false
true
1.739373
0.779158
77.915819
0.508401
29.687033
0.19713
19.712991
0.282718
4.362416
0.384292
7.769792
0.378075
30.897237
false
false
2024-08-09
2024-08-28
0
Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
Orion-zhen_Qwen2.5-7B-Instruct-Uncensored_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Orion-zhen/Qwen2.5-7B-Instruct-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Orion-zhen/Qwen2.5-7B-Instruct-Uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Orion-zhen__Qwen2.5-7B-Instruct-Uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Orion-zhen/Qwen2.5-7B-Instruct-Uncensored
33c24657b4394fc430ad90b5d413e5985ce8e292
35.718815
gpl-3.0
18
7.616
true
false
false
true
2.233624
0.720432
72.043179
0.547392
35.832453
0.477341
47.734139
0.302852
7.04698
0.436135
13.583594
0.442653
38.072547
false
false
2024-09-26
2024-10-19
1
Orion-zhen/Qwen2.5-7B-Instruct-Uncensored (Merge)
Orion-zhen_phi-4-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Orion-zhen/phi-4-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Orion-zhen/phi-4-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Orion-zhen__phi-4-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Orion-zhen/phi-4-abliterated
90e3bfb1a9507d931c19faa5c2084d3f8d0bfb77
29.979077
gpl-3.0
28
14.66
true
false
false
false
1.788702
0.057603
5.760272
0.669824
52.457129
0.302115
30.21148
0.404362
20.581655
0.500625
23.178125
0.529172
47.685801
false
false
2024-12-17
2024-12-20
1
Orion-zhen/phi-4-abliterated (Merge)
P0x0_Astra-v1-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/P0x0/Astra-v1-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">P0x0/Astra-v1-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/P0x0__Astra-v1-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
P0x0/Astra-v1-12B
c706e253f8d8fa838b505cbec0e1a6aeec545abc
19.73724
apache-2.0
2
12.248
true
false
false
false
3.211347
0.280594
28.059438
0.521451
31.809907
0.113293
11.329305
0.313758
8.501119
0.405188
11.381771
0.346077
27.341903
false
false
2024-09-21
2024-09-23
1
mistralai/Mistral-Nemo-Base-2407
PJMixers_LLaMa-3-CursedStock-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers/LLaMa-3-CursedStock-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers/LLaMa-3-CursedStock-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers__LLaMa-3-CursedStock-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers/LLaMa-3-CursedStock-v2.0-8B
d47cc29df363f71ffaf6cd21ac4bdeefa27359db
24.166134
llama3
12
8.03
true
false
false
true
2.805384
0.633079
63.307912
0.527116
32.563612
0.094411
9.441088
0.274329
3.243848
0.385625
8.036458
0.355635
28.403886
true
false
2024-06-26
2024-06-27
1
PJMixers/LLaMa-3-CursedStock-v2.0-8B (Merge)
PJMixers-Dev_L3.2-Instruct-Thinking-v0.1-1B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/L3.2-Instruct-Thinking-v0.1-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/L3.2-Instruct-Thinking-v0.1-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__L3.2-Instruct-Thinking-v0.1-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/L3.2-Instruct-Thinking-v0.1-1B
da482263c3258481e235117b58977f01bd9f9e25
10.902648
0
1.236
false
false
false
true
0.741031
0.46277
46.276989
0.330181
6.386637
0.054381
5.438066
0.25755
1.006711
0.326219
0.94401
0.148271
5.363475
false
false
2025-01-11
0
Removed
PJMixers-Dev_LLaMa-3.1-Instruct-Interleaved-Zeroed-13B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.1-Instruct-Interleaved-Zeroed-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.1-Instruct-Interleaved-Zeroed-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.1-Instruct-Interleaved-Zeroed-13B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.1-Instruct-Interleaved-Zeroed-13B
c5b8d7fa43a013e434630a7f89f3bf15ac19606f
28.894562
llama3.1
1
13.047
true
false
false
true
2.592731
0.787102
78.710156
0.507327
29.892756
0.200151
20.015106
0.291946
5.592841
0.38699
8.407031
0.376745
30.749483
true
false
2024-12-18
2024-12-18
1
PJMixers-Dev/LLaMa-3.1-Instruct-Interleaved-Zeroed-13B (Merge)
PJMixers-Dev_LLaMa-3.1-RomboTiesTest-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.1-RomboTiesTest-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.1-RomboTiesTest-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.1-RomboTiesTest-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.1-RomboTiesTest-8B
87ff7bb5f3399c4a4c021dbcbd5f2b5f52eedab2
28.818375
0
4.015
false
false
false
true
1.436904
0.78253
78.253035
0.507327
29.892756
0.200151
20.015106
0.291946
5.592841
0.38699
8.407031
0.376745
30.749483
false
false
2024-12-20
0
Removed
PJMixers-Dev_LLaMa-3.1-RomboTiesTest2-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.1-RomboTiesTest2-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.1-RomboTiesTest2-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.1-RomboTiesTest2-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.1-RomboTiesTest2-8B
5bc997080bae6df93298edb5a82b1391dc972047
28.818375
0
4.015
false
false
false
true
1.441047
0.78253
78.253035
0.507327
29.892756
0.200151
20.015106
0.291946
5.592841
0.38699
8.407031
0.376745
30.749483
false
false
2024-12-21
0
Removed
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B
1286f51489b06fe67fa36d57aa87331fa37e698b
22.70174
llama3.2
0
3.213
true
false
false
true
1.427389
0.693054
69.305443
0.455617
23.808307
0.121601
12.160121
0.274329
3.243848
0.370031
4.053906
0.312749
23.638815
false
false
2024-10-12
2024-10-12
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B
4c348a8dfc1be0b4985e0ed2882329515a60c19d
21.772674
llama3.2
1
3.213
true
false
false
true
1.419798
0.629157
62.91573
0.45815
23.34124
0.129909
12.990937
0.272651
3.020134
0.365875
4.867708
0.311503
23.500296
false
false
2024-10-14
2024-10-14
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B
17b245cfcffcc6aadc90989bf08d9625455064e1
21.826267
llama3.2
1
3.213
true
false
false
true
1.358084
0.65039
65.038985
0.451079
22.288715
0.126133
12.613293
0.271812
2.908277
0.368729
4.691146
0.310755
23.417184
false
false
2024-10-28
2024-10-28
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMixBread-v0.1-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMixBread-v0.1-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B
19faf7463cab41a2492cad26fc54b2fce3a05caf
19.737296
llama3.2
0
3.213
true
false
false
true
1.404855
0.504086
50.408583
0.448316
22.759588
0.130665
13.066465
0.282718
4.362416
0.351552
4.677344
0.308344
23.149379
true
false
2024-10-12
2024-10-12
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B (Merge)
PJMixers-Dev_Qwen2.5-RomboTiesTest-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/Qwen2.5-RomboTiesTest-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/Qwen2.5-RomboTiesTest-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__Qwen2.5-RomboTiesTest-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/Qwen2.5-RomboTiesTest-7B
61e798861cae00ff1108708fc89ed18bccaf1170
35.288854
0
3.808
false
false
false
true
1.342766
0.755802
75.580238
0.539867
34.931457
0.496224
49.622356
0.297819
6.375839
0.403365
8.720573
0.428524
36.50266
false
false
2024-12-21
0
Removed
Parissa3_test-model_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Parissa3/test-model" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Parissa3/test-model</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Parissa3__test-model-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Parissa3/test-model
7021138dac98d930f1ce0ebe186583c0813d6f48
20.745918
0
7.242
false
false
false
false
0.946706
0.388256
38.825649
0.519392
32.839032
0.064955
6.495468
0.294463
5.928412
0.468531
17.533073
0.305685
22.853871
false
false
2024-11-16
2024-11-16
1
Parissa3/test-model (Merge)
Pinkstack_PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pinkstack__PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B
2f964ba6516a9f1e2cbec4c3decde734c340a739
23.928769
apache-2.0
2
3.086
true
false
false
false
1.560798
0.508482
50.848194
0.471057
26.330926
0.169184
16.918429
0.29698
6.263982
0.447854
15.315104
0.351064
27.895981
false
false
2024-12-17
2025-01-16
1
Pinkstack/PARM-V1.5-base-QwQ-Qwen-2.5-o1-3B (Merge)
Pinkstack_SuperThoughts-CoT-14B-16k-o1-QwQ_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pinkstack__SuperThoughts-CoT-14B-16k-o1-QwQ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ
06261f48e94e86201b527854e76ddee3c65054f4
31.369512
mit
7
14.66
true
false
false
false
1.863034
0.051458
5.145791
0.671999
52.848497
0.41994
41.993958
0.392617
19.01566
0.491354
21.785938
0.526845
47.427231
false
false
2025-01-14
2025-01-16
1
Pinkstack/SuperThoughts-CoT-14B-16k-o1-QwQ (Merge)
Pinkstack_Superthoughts-lite-1.8B-experimental-o1_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Pinkstack/Superthoughts-lite-1.8B-experimental-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pinkstack/Superthoughts-lite-1.8B-experimental-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pinkstack__Superthoughts-lite-1.8B-experimental-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pinkstack/Superthoughts-lite-1.8B-experimental-o1
dc8f7e9d19f3d4bee7cfd81cc94c320204673dee
5.104091
apache-2.0
1
1.812
true
false
false
false
0.618301
0.037519
3.751934
0.343474
9.132473
0.031722
3.172205
0.275168
3.355705
0.335396
1.757812
0.18509
9.454418
false
false
2025-01-31
2025-01-31
2
HuggingFaceTB/SmolLM2-1.7B-Instruct (Merge)