| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| awnr_Mistral-7B-v0.1-signtensors-1-over-2_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [awnr/Mistral-7B-v0.1-signtensors-1-over-2](https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-1-over-2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-1-over-2-details) | awnr/Mistral-7B-v0.1-signtensors-1-over-2 | 9575327242f8539eac59b6d788beccf54a6f9414 | 14.257194 | apache-2.0 | 2 | 7 | true | false | false | false | 2.567058 | 0.217922 | 21.792178 | 0.442288 | 22.400153 | 0.02719 | 2.719033 | 0.307047 | 7.606264 | 0.400604 | 8.808854 | 0.29995 | 22.216681 | false | false | 2024-06-27 | 2024-07-30 | 0 | awnr/Mistral-7B-v0.1-signtensors-1-over-2 |
| awnr_Mistral-7B-v0.1-signtensors-1-over-4_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [awnr/Mistral-7B-v0.1-signtensors-1-over-4](https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-1-over-4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-1-over-4-details) | awnr/Mistral-7B-v0.1-signtensors-1-over-4 | b288ab9d8adfd2963a44a7935bb47649f55bcbee | 8.709433 | apache-2.0 | 1 | 7 | true | false | false | false | 1.28233 | 0.213301 | 21.330071 | 0.350709 | 9.227694 | 0.022659 | 2.265861 | 0.270134 | 2.684564 | 0.346031 | 2.18724 | 0.231051 | 14.56117 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-1-over-4 |
| awnr_Mistral-7B-v0.1-signtensors-3-over-8_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [awnr/Mistral-7B-v0.1-signtensors-3-over-8](https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-3-over-8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-3-over-8-details) | awnr/Mistral-7B-v0.1-signtensors-3-over-8 | fa368f705ace05da2fef25c030fe740cf1fef176 | 13.725352 | apache-2.0 | 1 | 7 | true | false | false | false | 1.270991 | 0.239429 | 23.942916 | 0.429994 | 20.435231 | 0.027946 | 2.794562 | 0.303691 | 7.158837 | 0.38175 | 5.785417 | 0.300116 | 22.235151 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-3-over-8 |
| awnr_Mistral-7B-v0.1-signtensors-5-over-16_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [awnr/Mistral-7B-v0.1-signtensors-5-over-16](https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-5-over-16) [📑](https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-5-over-16-details) | awnr/Mistral-7B-v0.1-signtensors-5-over-16 | 5ea13b3d0723237889e1512bc70dae72f71884d1 | 12.158648 | apache-2.0 | 1 | 7 | true | false | false | false | 0.649735 | 0.211827 | 21.182684 | 0.412415 | 17.543031 | 0.021903 | 2.190332 | 0.28104 | 4.138702 | 0.368604 | 6.142188 | 0.295795 | 21.75495 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-5-over-16 |
| awnr_Mistral-7B-v0.1-signtensors-7-over-16_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [awnr/Mistral-7B-v0.1-signtensors-7-over-16](https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-7-over-16) [📑](https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-7-over-16-details) | awnr/Mistral-7B-v0.1-signtensors-7-over-16 | 0e1f2cb0a81c38fc6c567d9c007883ab62fae266 | 14.146 | apache-2.0 | 1 | 7 | true | false | false | false | 1.296433 | 0.229363 | 22.936254 | 0.431582 | 21.040437 | 0.032477 | 3.247734 | 0.303691 | 7.158837 | 0.395208 | 7.934375 | 0.303025 | 22.558363 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-7-over-16 |
| aws-prototyping_MegaBeam-Mistral-7B-512k_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [aws-prototyping/MegaBeam-Mistral-7B-512k](https://huggingface.co/aws-prototyping/MegaBeam-Mistral-7B-512k) [📑](https://huggingface.co/datasets/open-llm-leaderboard/aws-prototyping__MegaBeam-Mistral-7B-512k-details) | aws-prototyping/MegaBeam-Mistral-7B-512k | 3e3b8c4b933650eed81ede7c4395df943d2a0796 | 17.59507 | apache-2.0 | 44 | 7 | true | false | false | true | 0.647188 | 0.597259 | 59.725861 | 0.366234 | 12.361178 | 0.029456 | 2.945619 | 0.282718 | 4.362416 | 0.399365 | 8.520573 | 0.258893 | 17.654772 | false | false | 2024-07-30 | 2024-10-07 | 0 | aws-prototyping/MegaBeam-Mistral-7B-512k |
| axolotl-ai-co_romulus-mistral-nemo-12b-simpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [axolotl-ai-co/romulus-mistral-nemo-12b-simpo](https://huggingface.co/axolotl-ai-co/romulus-mistral-nemo-12b-simpo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/axolotl-ai-co__romulus-mistral-nemo-12b-simpo-details) | axolotl-ai-co/romulus-mistral-nemo-12b-simpo | 15fd3ffa46c1ea51aa5d26a1da24214e324d7cf2 | 23.426338 | apache-2.0 | 15 | 12 | true | false | false | true | 2.715152 | 0.607925 | 60.792475 | 0.539506 | 34.642401 | 0.009063 | 0.906344 | 0.278523 | 3.803132 | 0.423302 | 12.979427 | 0.346908 | 27.434249 | false | false | 2024-07-24 | 2024-09-21 | 1 | Removed |
| beomi_gemma-mling-7b_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | GemmaForCausalLM | [beomi/gemma-mling-7b](https://huggingface.co/beomi/gemma-mling-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/beomi__gemma-mling-7b-details) | beomi/gemma-mling-7b | 3f442e28bd50db6c438ce2a15b3a003532babba0 | 11.253704 | other | 14 | 8 | true | false | false | false | 1.643506 | 0.202909 | 20.290939 | 0.406759 | 17.631391 | 0.046073 | 4.607251 | 0.25 | 0 | 0.375854 | 6.848437 | 0.263298 | 18.144208 | false | false | 2024-04-15 | 2024-07-17 | 0 | beomi/gemma-mling-7b |
| beowolx_CodeNinja-1.0-OpenChat-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/beowolx__CodeNinja-1.0-OpenChat-7B-details) | beowolx/CodeNinja-1.0-OpenChat-7B | 9934c04c767e6ae0f792712a060f02915391d4ec | 20.347389 | mit | 105 | 7 | true | false | false | true | 0.636073 | 0.544677 | 54.467701 | 0.444134 | 21.713423 | 0.060423 | 6.042296 | 0.294463 | 5.928412 | 0.424323 | 11.540365 | 0.301529 | 22.392139 | false | false | 2023-12-20 | 2024-07-30 | 0 | beowolx/CodeNinja-1.0-OpenChat-7B |
| berkeley-nest_Starling-LM-7B-alpha_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [berkeley-nest/Starling-LM-7B-alpha](https://huggingface.co/berkeley-nest/Starling-LM-7B-alpha) [📑](https://huggingface.co/datasets/open-llm-leaderboard/berkeley-nest__Starling-LM-7B-alpha-details) | berkeley-nest/Starling-LM-7B-alpha | 1dddf3b95bc1391f6307299eb1c162c194bde9bd | 20.826773 | apache-2.0 | 555 | 7 | true | false | false | true | 0.551629 | 0.548049 | 54.804918 | 0.444007 | 21.954028 | 0.083082 | 8.308157 | 0.29698 | 6.263982 | 0.41201 | 9.501302 | 0.317154 | 24.128251 | false | true | 2023-11-25 | 2024-06-12 | 0 | berkeley-nest/Starling-LM-7B-alpha |
| bigcode_starcoder2-15b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | Starcoder2ForCausalLM | [bigcode/starcoder2-15b](https://huggingface.co/bigcode/starcoder2-15b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigcode__starcoder2-15b-details) | bigcode/starcoder2-15b | 46d44742909c03ac8cee08eb03fdebce02e193ec | 12.551764 | bigcode-openrail-m | 570 | 15 | true | false | false | false | 35.044548 | 0.278022 | 27.802231 | 0.444796 | 20.373541 | 0.060423 | 6.042296 | 0.27349 | 3.131991 | 0.350094 | 2.928385 | 0.235289 | 15.032137 | false | true | 2024-02-20 | 2024-06-09 | 0 | bigcode/starcoder2-15b |
| bigcode_starcoder2-3b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | Starcoder2ForCausalLM | [bigcode/starcoder2-3b](https://huggingface.co/bigcode/starcoder2-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigcode__starcoder2-3b-details) | bigcode/starcoder2-3b | 733247c55e3f73af49ce8e9c7949bf14af205928 | 6.53656 | bigcode-openrail-m | 152 | 3 | true | false | false | false | 0.446629 | 0.203708 | 20.370838 | 0.350871 | 8.909299 | 0.01435 | 1.435045 | 0.244128 | 0 | 0.343458 | 1.432292 | 0.163647 | 7.071882 | false | true | 2023-11-29 | 2024-06-09 | 0 | bigcode/starcoder2-3b |
| bigcode_starcoder2-7b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | Starcoder2ForCausalLM | [bigcode/starcoder2-7b](https://huggingface.co/bigcode/starcoder2-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigcode__starcoder2-7b-details) | bigcode/starcoder2-7b | a3d33687b51284b528abeb17830776ffd24892a9 | 8.255674 | bigcode-openrail-m | 161 | 7 | true | false | false | false | 0.506401 | 0.220919 | 22.091938 | 0.366099 | 11.39511 | 0.028701 | 2.870091 | 0.251678 | 0.223714 | 0.379333 | 5.816667 | 0.164229 | 7.136525 | false | true | 2024-02-20 | 2024-06-09 | 0 | bigcode/starcoder2-7b |
| bigscience_bloom-1b1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | BloomForCausalLM | [bigscience/bloom-1b1](https://huggingface.co/bigscience/bloom-1b1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-1b1-details) | bigscience/bloom-1b1 | eb3dd7399312f5f94fd13f41d2f318117d3eb1e4 | 3.962215 | bigscience-bloom-rail-1.0 | 62 | 1 | true | false | false | false | 0.717021 | 0.137338 | 13.733782 | 0.310728 | 4.042705 | 0.001511 | 0.151057 | 0.259228 | 1.230425 | 0.37 | 3.416667 | 0.110788 | 1.198655 | false | true | 2022-05-19 | 2024-06-13 | 0 | bigscience/bloom-1b1 |
| bigscience_bloom-1b7_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | BloomForCausalLM | [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-1b7-details) | bigscience/bloom-1b7 | cc72a88036c2fb937d65efeacc57a0c2ef5d6fe5 | 3.971226 | bigscience-bloom-rail-1.0 | 118 | 1 | true | false | false | false | 0.81836 | 0.10439 | 10.438969 | 0.314055 | 4.397453 | 0.000755 | 0.075529 | 0.258389 | 1.118568 | 0.388573 | 6.838281 | 0.108627 | 0.958555 | false | true | 2022-05-19 | 2024-06-13 | 0 | bigscience/bloom-1b7 |
| bigscience_bloom-3b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | BloomForCausalLM | [bigscience/bloom-3b](https://huggingface.co/bigscience/bloom-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-3b-details) | bigscience/bloom-3b | 52bc5b43010b4844513826b8be3f78c7344c37d7 | 4.262013 | bigscience-bloom-rail-1.0 | 88 | 3 | true | false | false | false | 0.996056 | 0.127096 | 12.709611 | 0.306292 | 3.420098 | 0.000755 | 0.075529 | 0.239933 | 0 | 0.398063 | 7.891146 | 0.113281 | 1.475694 | false | true | 2022-05-19 | 2024-06-13 | 0 | bigscience/bloom-3b |
| bigscience_bloom-560m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | BloomForCausalLM | [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-560m-details) | bigscience/bloom-560m | ac2ae5fab2ce3f9f40dc79b5ca9f637430d24971 | 3.456891 | bigscience-bloom-rail-1.0 | 347 | 0 | true | false | false | false | 0.762716 | 0.062024 | 6.202432 | 0.302595 | 2.885364 | 0.000755 | 0.075529 | 0.261745 | 1.565996 | 0.403083 | 8.185417 | 0.116439 | 1.826611 | false | true | 2022-05-19 | 2024-06-13 | 0 | bigscience/bloom-560m |
| bigscience_bloom-7b1_float16 | float16 | 🟢 pretrained | 🟢 | Original | BloomForCausalLM | [bigscience/bloom-7b1](https://huggingface.co/bigscience/bloom-7b1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-7b1-details) | bigscience/bloom-7b1 | 6232703e399354503377bf59dfbb8397fd569e4a | 3.707393 | bigscience-bloom-rail-1.0 | 199 | 7 | true | false | false | false | 1.005775 | 0.132217 | 13.221696 | 0.311372 | 4.038809 | 0 | 0 | 0.264262 | 1.901566 | 0.348698 | 1.920573 | 0.110455 | 1.161717 | false | true | 2022-05-19 | 2024-06-13 | 0 | bigscience/bloom-7b1 |
| bosonai_Higgs-Llama-3-70B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | [bosonai/Higgs-Llama-3-70B](https://huggingface.co/bosonai/Higgs-Llama-3-70B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bosonai__Higgs-Llama-3-70B-details) | bosonai/Higgs-Llama-3-70B | b2c7540768046dfdae7a0cb846a7da6c41d826b1 | 32.216234 | other | 215 | 70 | true | false | false | true | 13.726847 | 0.556068 | 55.60679 | 0.625766 | 45.897406 | 0.173716 | 17.371601 | 0.366611 | 15.548098 | 0.447083 | 15.51875 | 0.490193 | 43.354758 | false | false | 2024-06-05 | 2024-08-30 | 1 | meta-llama/Meta-Llama-3-70B |
| brgx53_3Bgeneral-ECE-PRYMMAL-Martial_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | [brgx53/3Bgeneral-ECE-PRYMMAL-Martial](https://huggingface.co/brgx53/3Bgeneral-ECE-PRYMMAL-Martial) [📑](https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Bgeneral-ECE-PRYMMAL-Martial-details) | brgx53/3Bgeneral-ECE-PRYMMAL-Martial | 78ee3bde02df349ee7161f9c2a5b36161c294009 | 23.167894 | apache-2.0 | 0 | 3 | true | false | false | false | 0.653652 | 0.328931 | 32.893057 | 0.545801 | 36.673582 | 0.124622 | 12.462236 | 0.324664 | 9.955257 | 0.437281 | 14.426823 | 0.393368 | 32.59641 | true | false | 2024-10-23 | 2024-10-23 | 1 | brgx53/3Bgeneral-ECE-PRYMMAL-Martial (Merge) |
| brgx53_3Bgeneralv2-ECE-PRYMMAL-Martial_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial](https://huggingface.co/brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial) [📑](https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Bgeneralv2-ECE-PRYMMAL-Martial-details) | brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial | 8525f801c47b2bce2ca4dad360ce71b2cb6b370b | 30.777462 | apache-2.0 | 0 | 3 | true | false | false | false | 1.341162 | 0.567708 | 56.770813 | 0.56072 | 37.250633 | 0.307402 | 30.740181 | 0.311242 | 8.165548 | 0.435635 | 12.78776 | 0.450549 | 38.949837 | true | false | 2024-11-08 | 2024-11-08 | 1 | brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial (Merge) |
| brgx53_3Blareneg-ECE-PRYMMAL-Martial_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | [brgx53/3Blareneg-ECE-PRYMMAL-Martial](https://huggingface.co/brgx53/3Blareneg-ECE-PRYMMAL-Martial) [📑](https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Blareneg-ECE-PRYMMAL-Martial-details) | brgx53/3Blareneg-ECE-PRYMMAL-Martial | abac4757125a66a427fb82751bf171dabaea3458 | 21.33386 | apache-2.0 | 0 | 3 | true | false | false | false | 0.807534 | 0.287639 | 28.763902 | 0.535846 | 35.452586 | 0.035498 | 3.549849 | 0.334732 | 11.297539 | 0.442896 | 15.428646 | 0.401596 | 33.510638 | true | false | 2024-10-23 | 2024-10-23 | 1 | brgx53/3Blareneg-ECE-PRYMMAL-Martial (Merge) |
| brgx53_3Blarenegv2-ECE-PRYMMAL-Martial_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [brgx53/3Blarenegv2-ECE-PRYMMAL-Martial](https://huggingface.co/brgx53/3Blarenegv2-ECE-PRYMMAL-Martial) [📑](https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Blarenegv2-ECE-PRYMMAL-Martial-details) | brgx53/3Blarenegv2-ECE-PRYMMAL-Martial | 304038fc2b2527e31c738f9091206253a0d40f6c | 30.752066 | apache-2.0 | 0 | 7 | true | false | false | false | 0.686896 | 0.566184 | 56.618439 | 0.56072 | 37.250633 | 0.307402 | 30.740181 | 0.311242 | 8.165548 | 0.435635 | 12.78776 | 0.450549 | 38.949837 | true | false | 2024-11-08 | 2024-11-08 | 1 | brgx53/3Blarenegv2-ECE-PRYMMAL-Martial (Merge) |
| bunnycore_Best-Mix-Llama-3.1-8B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [bunnycore/Best-Mix-Llama-3.1-8B](https://huggingface.co/bunnycore/Best-Mix-Llama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Best-Mix-Llama-3.1-8B-details) | bunnycore/Best-Mix-Llama-3.1-8B | 4bde0e60ac20d6944b1fbdfb3456efea8ba59ae9 | 8.624959 | apache-2.0 | 0 | 8 | true | false | false | false | 0.905003 | 0.206706 | 20.670598 | 0.343178 | 7.255276 | 0.14426 | 14.425982 | 0.265101 | 2.013423 | 0.292854 | 1.106771 | 0.156499 | 6.277704 | true | false | 2024-10-10 | 2024-10-10 | 0 | bunnycore/Best-Mix-Llama-3.1-8B |
| bunnycore_CyberCore-Qwen-2.1-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/CyberCore-Qwen-2.1-7B](https://huggingface.co/bunnycore/CyberCore-Qwen-2.1-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__CyberCore-Qwen-2.1-7B-details) | bunnycore/CyberCore-Qwen-2.1-7B | 98e69ba1cd70444b90178e1253e904d1892593c8 | 27.24554 | | 2 | 7 | false | false | false | true | 0.672684 | 0.576576 | 57.657571 | 0.557209 | 36.966533 | 0.134441 | 13.444109 | 0.307886 | 7.718121 | 0.41449 | 9.411198 | 0.444481 | 38.275709 | false | false | 2024-11-21 | 2024-11-23 | 1 | bunnycore/CyberCore-Qwen-2.1-7B (Merge) |
| bunnycore_HyperLlama-3.1-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [bunnycore/HyperLlama-3.1-8B](https://huggingface.co/bunnycore/HyperLlama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__HyperLlama-3.1-8B-details) | bunnycore/HyperLlama-3.1-8B | 659b18ffaee2c1e8dbe8a9a56a44502325d71696 | 28.398623 | apache-2.0 | 4 | 8 | true | false | false | true | 0.894545 | 0.788301 | 78.83006 | 0.510339 | 29.806656 | 0.179758 | 17.975831 | 0.286913 | 4.9217 | 0.382927 | 7.932552 | 0.378324 | 30.924941 | true | false | 2024-09-04 | 2024-09-05 | 0 | bunnycore/HyperLlama-3.1-8B |
| bunnycore_Llama-3.1-8B-TitanFusion-Mix_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [bunnycore/Llama-3.1-8B-TitanFusion-Mix](https://huggingface.co/bunnycore/Llama-3.1-8B-TitanFusion-Mix) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.1-8B-TitanFusion-Mix-details) | bunnycore/Llama-3.1-8B-TitanFusion-Mix | 9eb89de7df048276ccbc4405ce4f005f9185f82e | 24.961895 | | 2 | 8 | false | false | false | false | 0.933085 | 0.492495 | 49.249547 | 0.575596 | 39.535483 | 0.125378 | 12.537764 | 0.295302 | 6.040268 | 0.431698 | 12.46224 | 0.369515 | 29.94607 | false | false | 2024-09-23 | 2024-09-23 | 1 | bunnycore/Llama-3.1-8B-TitanFusion-Mix (Merge) |
| bunnycore_Llama-3.1-8B-TitanFusion-v3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [bunnycore/Llama-3.1-8B-TitanFusion-v3](https://huggingface.co/bunnycore/Llama-3.1-8B-TitanFusion-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.1-8B-TitanFusion-v3-details) | bunnycore/Llama-3.1-8B-TitanFusion-v3 | ea8269ac3b2e9c0dc855a9089251ebdb273ada16 | 24.231721 | | 2 | 8 | false | false | false | false | 0.887824 | 0.480955 | 48.095498 | 0.526211 | 32.072941 | 0.142749 | 14.274924 | 0.308725 | 7.829978 | 0.430208 | 11.942708 | 0.380568 | 31.174276 | false | false | 2024-09-22 | 2024-09-22 | 1 | bunnycore/Llama-3.1-8B-TitanFusion-v3 (Merge) |
| bunnycore_Llama-3.2-3B-All-Mix_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [bunnycore/Llama-3.2-3B-All-Mix](https://huggingface.co/bunnycore/Llama-3.2-3B-All-Mix) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-All-Mix-details) | bunnycore/Llama-3.2-3B-All-Mix | adacdd571c4073990ecf05a23277793e9e5f0410 | 22.416478 | | 1 | 3 | false | false | false | true | 0.740309 | 0.722605 | 72.260491 | 0.450834 | 22.516311 | 0.11858 | 11.858006 | 0.262584 | 1.677852 | 0.328698 | 2.18724 | 0.315991 | 23.998966 | false | false | 2024-10-20 | 2024-10-20 | 1 | bunnycore/Llama-3.2-3B-All-Mix (Merge) |
| bunnycore_Llama-3.2-3B-Booval_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [bunnycore/Llama-3.2-3B-Booval](https://huggingface.co/bunnycore/Llama-3.2-3B-Booval) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Booval-details) | bunnycore/Llama-3.2-3B-Booval | d7f3449f89fa86d8e2c411aa4ca10ad552a62803 | 21.3011 | | 2 | 3 | false | false | false | true | 0.655241 | 0.666926 | 66.692598 | 0.451439 | 22.515991 | 0.111027 | 11.102719 | 0.266779 | 2.237136 | 0.339427 | 2.395052 | 0.305768 | 22.863106 | false | false | 2024-10-27 | 2024-10-28 | 1 | bunnycore/Llama-3.2-3B-Booval (Merge) |
| bunnycore_Llama-3.2-3B-Long-Think_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [bunnycore/Llama-3.2-3B-Long-Think](https://huggingface.co/bunnycore/Llama-3.2-3B-Long-Think) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Long-Think-details) | bunnycore/Llama-3.2-3B-Long-Think | a8522bfc03657b41b0541b164a98ddff302a6fd2 | 19.549052 | | 1 | 3 | false | false | false | true | 1.381029 | 0.54735 | 54.734992 | 0.461039 | 24.226803 | 0.129154 | 12.915408 | 0.260906 | 1.454139 | 0.339552 | 1.210677 | 0.304771 | 22.75229 | false | false | 2024-10-24 | 2024-10-24 | 1 | bunnycore/Llama-3.2-3B-Long-Think (Merge) |
| bunnycore_Llama-3.2-3B-Mix-Skill_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [bunnycore/Llama-3.2-3B-Mix-Skill](https://huggingface.co/bunnycore/Llama-3.2-3B-Mix-Skill) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Mix-Skill-details) | bunnycore/Llama-3.2-3B-Mix-Skill | d07d6e733aaeaf48cb6616228d00104b05b52afd | 21.399687 | | 1 | 3 | false | false | false | true | 0.685067 | 0.640423 | 64.042297 | 0.458184 | 23.784247 | 0.126888 | 12.688822 | 0.261745 | 1.565996 | 0.339615 | 2.751823 | 0.312084 | 23.564938 | false | false | 2024-10-24 | 2024-10-24 | 1 | bunnycore/Llama-3.2-3B-Mix-Skill (Merge) |
| bunnycore_Llama-3.2-3B-ProdigyPlus_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [bunnycore/Llama-3.2-3B-ProdigyPlus](https://huggingface.co/bunnycore/Llama-3.2-3B-ProdigyPlus) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-ProdigyPlus-details) | bunnycore/Llama-3.2-3B-ProdigyPlus | 799f7669701ecf27f4c3e29998dd839b4d54c408 | 16.079069 | | 2 | 3 | false | false | false | true | 0.71394 | 0.40152 | 40.152019 | 0.439228 | 20.622989 | 0.098943 | 9.89426 | 0.268456 | 2.46085 | 0.358 | 3.15 | 0.281749 | 20.194297 | false | false | 2024-10-25 | 2024-10-25 | 1 | bunnycore/Llama-3.2-3B-ProdigyPlus (Merge) |
| bunnycore_Llama-3.2-3B-ProdigyPlusPlus_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [bunnycore/Llama-3.2-3B-ProdigyPlusPlus](https://huggingface.co/bunnycore/Llama-3.2-3B-ProdigyPlusPlus) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-ProdigyPlusPlus-details) | bunnycore/Llama-3.2-3B-ProdigyPlusPlus | 512865708a7ec9754997fb404b1ffc0752b099d7 | 6.443826 | | 0 | 3 | false | false | false | true | 0.671576 | 0.164516 | 16.451571 | 0.368993 | 11.561978 | 0.029456 | 2.945619 | 0.253356 | 0.447427 | 0.354125 | 1.698958 | 0.150017 | 5.557402 | false | false | 2024-10-28 | 2024-10-28 | 1 | bunnycore/Llama-3.2-3B-ProdigyPlusPlus (Merge) |
| bunnycore_Phi-3.5-mini-TitanFusion-0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | [bunnycore/Phi-3.5-mini-TitanFusion-0.1](https://huggingface.co/bunnycore/Phi-3.5-mini-TitanFusion-0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-3.5-mini-TitanFusion-0.1-details) | bunnycore/Phi-3.5-mini-TitanFusion-0.1 | 72939b8b75e23b22b1758bb05a842e5834f75d96 | 25.392388 | | 0 | 3 | false | false | false | true | 0.796885 | 0.522795 | 52.279507 | 0.537373 | 35.446219 | 0.067976 | 6.797583 | 0.331376 | 10.850112 | 0.445313 | 15.797396 | 0.380652 | 31.183511 | false | false | 2024-10-13 | 2024-10-13 | 1 | bunnycore/Phi-3.5-mini-TitanFusion-0.1 (Merge) |
| bunnycore_Qandora-2.5-7B-Creative_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/Qandora-2.5-7B-Creative](https://huggingface.co/bunnycore/Qandora-2.5-7B-Creative) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qandora-2.5-7B-Creative-details) | bunnycore/Qandora-2.5-7B-Creative | fdb174364d4a4f323ed1cb01219ac4d87708219d | 30.931131 | | 1 | 7 | false | false | false | true | 0.709795 | 0.680315 | 68.03149 | 0.554176 | 36.424652 | 0.23565 | 23.564955 | 0.310403 | 8.053691 | 0.421188 | 10.848438 | 0.447972 | 38.663564 | false | false | 2024-11-20 | 2024-11-20 | 1 | bunnycore/Qandora-2.5-7B-Creative (Merge) |
| bunnycore_QandoraExp-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/QandoraExp-7B](https://huggingface.co/bunnycore/QandoraExp-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QandoraExp-7B-details) | bunnycore/QandoraExp-7B | 74906d5518c7feb7df9b168763dabc1b0167942f | 28.51072 | | 1 | 7 | false | false | false | true | 0.670528 | 0.750906 | 75.090648 | 0.547796 | 35.924742 | 0.009063 | 0.906344 | 0.310403 | 8.053691 | 0.431208 | 13.201042 | 0.440991 | 37.887855 | false | false | 2024-11-11 | 2024-11-11 | 1 | bunnycore/QandoraExp-7B (Merge) |
| bunnycore_QandoraExp-7B-Persona_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/QandoraExp-7B-Persona](https://huggingface.co/bunnycore/QandoraExp-7B-Persona) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QandoraExp-7B-Persona-details) | bunnycore/QandoraExp-7B-Persona | 21bd6c2e270358b70f9af98bcccd6ec9c2cfce88 | 29.188506 | | 2 | 7 | false | false | false | true | 0.687971 | 0.624686 | 62.468583 | 0.555834 | 36.832709 | 0.160121 | 16.012085 | 0.314597 | 8.612975 | 0.437156 | 13.344531 | 0.440741 | 37.860151 | false | false | 2024-11-12 | 2024-11-12 | 1 | bunnycore/QandoraExp-7B-Persona (Merge) |
| bunnycore_QandoraExp-7B-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/QandoraExp-7B-v2](https://huggingface.co/bunnycore/QandoraExp-7B-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QandoraExp-7B-v2-details) | bunnycore/QandoraExp-7B-v2 | 017594240f9b3c4262e23de6d550453a1a3d5540 | 23.274654 | | 1 | 7 | false | false | false | true | 0.693649 | 0.560689 | 56.068897 | 0.544486 | 34.944967 | 0 | 0 | 0.302852 | 7.04698 | 0.404542 | 9.267708 | 0.390874 | 32.319371 | false | false | 2024-11-12 | 2024-11-12 | 1 | bunnycore/QandoraExp-7B-v2 (Merge) |
| bunnycore_Qwen2.5-3B-RP-Mix_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [bunnycore/Qwen2.5-3B-RP-Mix](https://huggingface.co/bunnycore/Qwen2.5-3B-RP-Mix) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-RP-Mix-details) | bunnycore/Qwen2.5-3B-RP-Mix | 0e8f3b56f9270fdcdd4badfd7b925dc8fc4902c7 | 23.377815 | | 2 | 3 | false | false | false | true | 0.919701 | 0.572054 | 57.205437 | 0.489438 | 28.305923 | 0.087613 | 8.761329 | 0.27349 | 3.131991 | 0.428448 | 12.55599 | 0.372756 | 30.30622 | false | false | 2024-10-22 | 2024-10-22 | 1 | bunnycore/Qwen2.5-3B-RP-Mix (Merge) |
| bunnycore_Qwen2.5-7B-CyberRombos_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/Qwen2.5-7B-CyberRombos](https://huggingface.co/bunnycore/Qwen2.5-7B-CyberRombos) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-CyberRombos-details) | bunnycore/Qwen2.5-7B-CyberRombos | dfd4d30fc6956ffecb9fb3c59fad51875552f7f9 | 27.692747 | | 2 | 7 | false | false | false | true | 0.710105 | 0.751831 | 75.18307 | 0.546496 | 35.884025 | 0.000755 | 0.075529 | 0.30453 | 7.270694 | 0.412542 | 10.067708 | 0.439079 | 37.675458 | false | false | 2024-11-04 | 2024-11-05 | 1 | bunnycore/Qwen2.5-7B-CyberRombos (Merge) |
| bunnycore_Qwen2.5-7B-Instruct-Fusion_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/Qwen2.5-7B-Instruct-Fusion](https://huggingface.co/bunnycore/Qwen2.5-7B-Instruct-Fusion) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-Instruct-Fusion-details) | bunnycore/Qwen2.5-7B-Instruct-Fusion | 6313c0b3de799ab48720c3b828e322a77cf8d023 | 30.747252 | | 3 | 7 | false | false | false | true | 0.66295 | 0.696202 | 69.620163 | 0.54919 | 36.179859 | 0.199396 | 19.939577 | 0.30453 | 7.270694 | 0.429719 | 12.948177 | 0.446725 | 38.525044 | false | false | 2024-10-31 | 2024-11-02 | 1 | bunnycore/Qwen2.5-7B-Instruct-Fusion (Merge) |
| bunnycore_QwenMosaic-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [bunnycore/QwenMosaic-7B](https://huggingface.co/bunnycore/QwenMosaic-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QwenMosaic-7B-details) | bunnycore/QwenMosaic-7B | 1eab0bbe701195ba26f60a284f74e3c6dfe5c139 | 25.308428 | | 1 | 7 | false | false | false | true | 0.750287 | 0.581922 | 58.192152 | 0.556413 | 36.75052 | 0.084592 | 8.459215 | 0.260906 | 1.454139 | 0.416385 | 10.214844 | 0.431017 | 36.779699 | false | false | 2024-12-01 | 2024-12-02 | 1 | bunnycore/QwenMosaic-7B (Merge) |
| bunnycore_SmolLM2-1.7-Persona_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [bunnycore/SmolLM2-1.7-Persona](https://huggingface.co/bunnycore/SmolLM2-1.7-Persona) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__SmolLM2-1.7-Persona-details) | bunnycore/SmolLM2-1.7-Persona | ebeaa6f284c044bd54e3e66cc5458d974d92523e | 14.25041 | apache-2.0 | 0 | 1 | true | false | false | true | 0.331332 | 0.546525 | 54.652544 | 0.362321 | 11.203753 | 0.04003 | 4.003021 | 0.263423 | 1.789709 | 0.334125 | 3.032292 | 0.19739 | 10.821144 | true | false | 2024-11-15 | 2024-11-15 | 0 | bunnycore/SmolLM2-1.7-Persona |
| bunnycore_SmolLM2-1.7B-roleplay-lora_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | [bunnycore/SmolLM2-1.7B-roleplay-lora](https://huggingface.co/bunnycore/SmolLM2-1.7B-roleplay-lora) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__SmolLM2-1.7B-roleplay-lora-details) | bunnycore/SmolLM2-1.7B-roleplay-lora | bbab860a4ffdd8e48f600192947ad3504bb0a944 | 14.252474 | apache-2.0 | 0 | 3 | true | false | false | true | 0.700026 | 0.538208 | 53.820751 | 0.361034 | 10.907238 | 0.039275 | 3.927492 | 0.275168 | 3.355705 | 0.339458 | 2.765625 | 0.196642 | 10.738032 | false | false | 2024-11-15 | 2024-11-15 | 3 | HuggingFaceTB/SmolLM2-1.7B-Instruct (Merge) |
| bunnycore_Tulu-3.1-8B-SuperNova_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [bunnycore/Tulu-3.1-8B-SuperNova](https://huggingface.co/bunnycore/Tulu-3.1-8B-SuperNova) [📑](https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Tulu-3.1-8B-SuperNova-details) | bunnycore/Tulu-3.1-8B-SuperNova | bbbfb910ca8d8f7ae35ecaf4824ad68713bf8d86 | 30.941023 | | 3 | 8 | false | false | false | true | 0.693924 | 0.819375 | 81.937481 | 0.525412 | 32.499171 | 0.243202 | 24.320242 | 0.302013 | 6.935123 | 0.3935 | 8.6875 | 0.3814 | 31.266622 | false | false | 2024-11-22 | 2024-11-23 | 1 | bunnycore/Tulu-3.1-8B-SuperNova (Merge) |
| byroneverson_Mistral-Small-Instruct-2409-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [byroneverson/Mistral-Small-Instruct-2409-abliterated](https://huggingface.co/byroneverson/Mistral-Small-Instruct-2409-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/byroneverson__Mistral-Small-Instruct-2409-abliterated-details) | byroneverson/Mistral-Small-Instruct-2409-abliterated | 5e24aaef2a37f9cb69f70ae9fe714f9d9599fd6e | 27.169391 | other | 10 | 22 | true | false | false | true | 1.402286 | 0.697076 | 69.707598 | 0.523786 | 31.2557 | 0.149547 | 14.954683 | 0.333054 | 11.073826 | 0.369719 | 3.548177 | 0.392287 | 32.476359 | false | false | 2024-09-23 | 2024-10-13 | 1 | mistralai/Mistral-Small-Instruct-2409 |
| byroneverson_Yi-1.5-9B-Chat-16K-abliterated_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [byroneverson/Yi-1.5-9B-Chat-16K-abliterated](https://huggingface.co/byroneverson/Yi-1.5-9B-Chat-16K-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/byroneverson__Yi-1.5-9B-Chat-16K-abliterated-details) | byroneverson/Yi-1.5-9B-Chat-16K-abliterated | 84a6eaa723633bbefc7cfac9c64bf0e0a4d39065 | 26.532728 | apache-2.0 | 4 | 8 | true | false | false | true | 1.090103 | 0.552845 | 55.284534 | 0.528205 | 32.843259 | 0.116314 | 11.63142 | 0.312919 | 8.389262 | 0.473438 | 19.679687 | 0.382314 | 31.368203 | false | false | 2024-09-03 | 2024-09-03 | 1 | 01-ai/Yi-1.5-9B-Chat-16K |
| byroneverson_Yi-1.5-9B-Chat-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [byroneverson/Yi-1.5-9B-Chat-abliterated](https://huggingface.co/byroneverson/Yi-1.5-9B-Chat-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/byroneverson__Yi-1.5-9B-Chat-abliterated-details) | byroneverson/Yi-1.5-9B-Chat-abliterated | 4e26c200cdf2dc50dd50cdd9fe5b74887e9fa94a | 25.300721 | apache-2.0 | 2 | 8 | true | false | false | true | 0.844572 | 0.572329 | 57.23292 | 0.540122 | 34.352187 | 0.108006 | 10.800604 | 0.291946 | 5.592841 | 0.438865 | 13.658073 | 0.371509 | 30.167701 | false | false | 2024-09-04 | 2024-09-17 | 1 | 01-ai/Yi-1.5-9B-Chat |
| c10x_Q-Pluse_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [c10x/Q-Pluse](https://huggingface.co/c10x/Q-Pluse) [📑](https://huggingface.co/datasets/open-llm-leaderboard/c10x__Q-Pluse-details) | c10x/Q-Pluse | | 3.634371 | | 0 | 7 | false | false | false | true | 1.311691 | 0.112283 | 11.228319 | 0.287511 | 1.947945 | 0 | 0 | 0.246644 | 0 | 0.393812 | 7.126563 | 0.113531 | 1.503398 | false | false | 2024-10-10 | | 0 | Removed |
| c10x_longthinker_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [c10x/longthinker](https://huggingface.co/c10x/longthinker) [📑](https://huggingface.co/datasets/open-llm-leaderboard/c10x__longthinker-details) | c10x/longthinker | e1bb4a2c2782ab52be7a8fa2e5905f08b7cfd464 | 19.686074 | | 0 | 8 | false | false | false | true | 0.944774 | 0.360879 | 36.087913 | 0.492749 | 28.424737 | 0.169184 | 16.918429 | 0.264262 | 1.901566 | 0.390958 | 6.703125 | 0.352726 | 28.080674 | false | false | 2024-10-10 | 2024-10-10 | 1 | c10x/longthinker (Merge) |
| carsenk_flippa-v6_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | [carsenk/flippa-v6](https://huggingface.co/carsenk/flippa-v6) [📑](https://huggingface.co/datasets/open-llm-leaderboard/carsenk__flippa-v6-details) | carsenk/flippa-v6 | 5206a32e0bd3067aef1ce90f5528ade7d866253f | 20.738603 | llama3.1 | 1 | 16 | true | false | false | false | 1.0648 | 0.343943 | 34.394296 | 0.504697 | 29.993501 | 0.138218 | 13.821752 | 0.292785 | 5.704698 | 0.408875 | 10.876042 | 0.366772 | 29.641327 | false | false | 2024-08-24 | 2024-08-24 | 2 | meta-llama/Meta-Llama-3.1-8B |
| carsenk_phi3.5_mini_exp_825_uncensored_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [carsenk/phi3.5_mini_exp_825_uncensored](https://huggingface.co/carsenk/phi3.5_mini_exp_825_uncensored) [📑](https://huggingface.co/datasets/open-llm-leaderboard/carsenk__phi3.5_mini_exp_825_uncensored-details) | carsenk/phi3.5_mini_exp_825_uncensored | 6b208dc3df02e0d5ef0c3fe5899f9f31618f2e94 | 3.466875 | apache-2.0 | 2 | 3 | true | false | false | true | 0.487818 | 0.136414 | 13.64136 | 0.296473 | 1.827813 | 0 | 0 | 0.249161 | 0 | 0.364417 | 3.385417 | 0.11752 | 1.946661 | false | false | 2024-08-29 | 2024-08-29 | 2 | microsoft/Phi-3.5-mini-instruct |
| cat-searcher_gemma-2-9b-it-sppo-iter-1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [cat-searcher/gemma-2-9b-it-sppo-iter-1](https://huggingface.co/cat-searcher/gemma-2-9b-it-sppo-iter-1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/cat-searcher__gemma-2-9b-it-sppo-iter-1-details) | cat-searcher/gemma-2-9b-it-sppo-iter-1 | b29a3a5cef93ee044e2297fcb40bd2976415e900 | 20.553948 | | 0 | 9 | false | false | false | true | 2.768039 | 0.301477 | 30.147675 | 0.597187 | 41.676308 | 0 | 0 | 0.344799 | 12.639821 | 0.392667 | 7.15 | 0.385389 | 31.709885 | false | false | 2024-08-09 | | 0 | Removed |
| cat-searcher_gemma-2-9b-it-sppo-iter-1-evol-1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [cat-searcher/gemma-2-9b-it-sppo-iter-1-evol-1](https://huggingface.co/cat-searcher/gemma-2-9b-it-sppo-iter-1-evol-1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/cat-searcher__gemma-2-9b-it-sppo-iter-1-evol-1-details) | cat-searcher/gemma-2-9b-it-sppo-iter-1-evol-1 | c2d7b76786151aecfa5972a2a3e937feb2d2c48b | 20.103006 | | 0 | 9 | false | false | false | true | 2.787812 | 0.294183 | 29.418277 | 0.593937 | 41.10464 | 0 | 0 | 0.340604 | 12.080537 | 0.392573 | 6.904948 | 0.379987 | 31.109634 | false | false | 2024-08-09 | | 0 | Removed |
| cgato_TheSalt-L3-8b-v0.3.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [cgato/TheSalt-L3-8b-v0.3.2](https://huggingface.co/cgato/TheSalt-L3-8b-v0.3.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/cgato__TheSalt-L3-8b-v0.3.2-details) | cgato/TheSalt-L3-8b-v0.3.2 | 5cf08e2bf9590ebcd14ba021e113def28c65afa2 | 7.248832 | cc-by-nc-4.0 | 1 | 8 | true | false | false | true | 0.940294 | 0.270503 | 27.050338 | 0.296797 | 2.612714 | 0.03852 | 3.851964 | 0.26594 | 2.12528 | 0.389625 | 6.303125 | 0.113946 | 1.549572 | false | false | 2024-06-18 | 2024-06-26 | 0 | cgato/TheSalt-L3-8b-v0.3.2 |
| chargoddard_prometheus-2-llama-3-8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [chargoddard/prometheus-2-llama-3-8b](https://huggingface.co/chargoddard/prometheus-2-llama-3-8b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/chargoddard__prometheus-2-llama-3-8b-details) | chargoddard/prometheus-2-llama-3-8b | 90a728ac98e5b4169f88ae4945e357cf45477568 | 19.281097 | apache-2.0 | 2 | 8 | true | false | false | true | 0.945115 | 0.52889 | 52.889001 | 0.493114 | 27.803839 | 0.08006 | 8.006042 | 0.272651 | 3.020134 | 0.339583 | 0.78125 | 0.308677 | 23.186318 | true | false | 2024-05-26 | 2024-06-26 | 1 | chargoddard/prometheus-2-llama-3-8b (Merge) |
| chujiezheng_Llama-3-Instruct-8B-SimPO-ExPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO](https://huggingface.co/chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/chujiezheng__Llama-3-Instruct-8B-SimPO-ExPO-details) | chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO | 3fcaa9fe99691659eb197487e9a343f601bf63f2 | 21.972344 | llama3 | 16 | 8 | true | false | false | true | 0.720481 | 0.643371 | 64.33707 | 0.476452 | 25.868282 | 0.005287 | 0.528701 | 0.286913 | 4.9217 | 0.39201 | 9.501302 | 0.340093 | 26.677009 | false | false | 2024-05-26 | 2024-06-26 | 0 | chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO |
| chujiezheng_Mistral7B-PairRM-SPPO-ExPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [chujiezheng/Mistral7B-PairRM-SPPO-ExPO](https://huggingface.co/chujiezheng/Mistral7B-PairRM-SPPO-ExPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/chujiezheng__Mistral7B-PairRM-SPPO-ExPO-details) | chujiezheng/Mistral7B-PairRM-SPPO-ExPO | d3e8342a63e5ae096f450f2467a92168db12768c | 13.491268 | apache-2.0 | 0 | 7 | true | false | false | true | 0.509034 | 0.367349 | 36.734863 | 0.388219 | 13.678636 | 0.010574 | 1.057402 | 0.276846 | 3.579418 | 0.405531 | 8.658073 | 0.255153 | 17.239214 | false | false | 2024-05-04 | 2024-09-21 | 0 | chujiezheng/Mistral7B-PairRM-SPPO-ExPO |
| cloudyu_Llama-3-70Bx2-MOE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | [cloudyu/Llama-3-70Bx2-MOE](https://huggingface.co/cloudyu/Llama-3-70Bx2-MOE) [📑](https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Llama-3-70Bx2-MOE-details) | cloudyu/Llama-3-70Bx2-MOE | b8bd85e8db8e4ec352b93441c92e0ae1334bf5a7 | 35.666465 | llama3 | 1 | 126 | true | true | false | false | 21.539555 | 0.548249 | 54.824865 | 0.663623 | 51.422138 | 0.217523 | 21.752266 | 0.393456 | 19.127517 | 0.481188 | 20.848437 | 0.514212 | 46.023567 | false | false | 2024-05-20 | 2024-06-27 | 0 | cloudyu/Llama-3-70Bx2-MOE |
| cloudyu_Mixtral_34Bx2_MoE_60B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [cloudyu/Mixtral_34Bx2_MoE_60B](https://huggingface.co/cloudyu/Mixtral_34Bx2_MoE_60B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Mixtral_34Bx2_MoE_60B-details) | cloudyu/Mixtral_34Bx2_MoE_60B | d01642769ccc782e1db1fc26cb25097aecb98e23 | 27.598581 | apache-2.0 | 111 | 60 | true | true | false | false | 7.332588 | 0.453777 | 45.377709 | 0.58697 | 41.209129 | 0.076284 | 7.628399 | 0.338087 | 11.744966 | 0.462521 | 17.781771 | 0.476646 | 41.849512 | false | false | 2024-01-05 | 2024-08-22 | 0 | cloudyu/Mixtral_34Bx2_MoE_60B |
cloudyu_Yi-34Bx2-MoE-60B-DPO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Yi-34Bx2-MoE-60B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Yi-34Bx2-MoE-60B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cloudyu/Yi-34Bx2-MoE-60B-DPO
|
5c2d31042229ee06246064100b781dd926cb0ffd
| 26.043502 |
apache-2.0
| 3 | 60 | true | true | false | true | 7.339247 | 0.531888 | 53.188761 | 0.516831 | 31.259298 | 0.070242 | 7.024169 | 0.322148 | 9.619687 | 0.437469 | 14.316927 | 0.46767 | 40.852172 | false | false |
2024-01-23
|
2024-08-06
| 0 |
cloudyu/Yi-34Bx2-MoE-60B-DPO
|
cluebbers_Llama-3.1-8B-paraphrase-type-generation-apty-ipo_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cluebbers__Llama-3.1-8B-paraphrase-type-generation-apty-ipo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo
|
eb04613997875935cb667a517e518874bb716169
| 9.749648 |
apache-2.0
| 0 | 8 | true | false | false | false | 0.719831 | 0.132667 | 13.266688 | 0.380022 | 12.669478 | 0.006798 | 0.679758 | 0.263423 | 1.789709 | 0.433219 | 12.41901 | 0.259059 | 17.673242 | false | false |
2024-11-14
|
2024-11-15
| 1 |
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo (Merge)
|
cluebbers_Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cluebbers__Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid
|
2c8b52e8db11a6ff57cccf890ee26688e858f9fb
| 9.743408 |
apache-2.0
| 0 | 8 | true | false | false | false | 0.72356 | 0.131842 | 13.18424 | 0.37889 | 12.757325 | 0.006798 | 0.679758 | 0.268456 | 2.46085 | 0.430552 | 12.01901 | 0.256233 | 17.359264 | false | false |
2024-11-15
|
2024-11-15
| 1 |
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid (Merge)
|
cluebbers_Llama-3.1-8B-paraphrase-type-generation-etpc_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cluebbers__Llama-3.1-8B-paraphrase-type-generation-etpc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc
|
a003a227aed5c1ad67cd4a653b13a0dd7acb7ed5
| 9.430026 |
apache-2.0
| 0 | 8 | true | false | false | false | 0.741527 | 0.120852 | 12.085156 | 0.378081 | 12.694579 | 0.004532 | 0.453172 | 0.265101 | 2.013423 | 0.431854 | 12.048437 | 0.255568 | 17.285387 | false | false |
2024-11-04
|
2024-11-15
| 1 |
cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc (Merge)
|
cognitivecomputations_dolphin-2.9-llama3-8b_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9-llama3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9-llama3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9-llama3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9-llama3-8b
|
5aeb036f9215c558b483a654a8c6e1cc22e841bf
| 18.390285 |
other
| 422 | 8 | true | false | false | true | 0.73912 | 0.385034 | 38.503393 | 0.494992 | 27.858929 | 0.055891 | 5.589124 | 0.286913 | 4.9217 | 0.437531 | 13.791406 | 0.277094 | 19.677157 | false | true |
2024-04-20
|
2024-06-12
| 1 |
meta-llama/Meta-Llama-3-8B
|
cognitivecomputations_dolphin-2.9.1-llama-3-70b_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-llama-3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-llama-3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-llama-3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.1-llama-3-70b
|
31adf616c3c9176d147e0a62e9fedb7bf97678ac
| 23.444759 |
llama3
| 40 | 70 | true | false | false | true | 12.149088 | 0.376017 | 37.601675 | 0.520492 | 31.101152 | 0.056647 | 5.664653 | 0.308725 | 7.829978 | 0.497562 | 23.695312 | 0.412982 | 34.775783 | false | true |
2024-05-22
|
2024-06-27
| 1 |
meta-llama/Meta-Llama-3-70B
|
cognitivecomputations_dolphin-2.9.1-yi-1.5-34b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-yi-1.5-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-yi-1.5-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-yi-1.5-34b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.1-yi-1.5-34b
|
1ec522298a6935c881df6dc29d3669833bd8672d
| 27.904384 |
apache-2.0
| 34 | 34 | true | false | false | true | 2.992653 | 0.385259 | 38.525889 | 0.607623 | 44.174089 | 0.162387 | 16.238671 | 0.343121 | 12.416107 | 0.459792 | 16.973958 | 0.451878 | 39.097592 | false | true |
2024-05-18
|
2024-07-27
| 1 |
01-ai/Yi-1.5-34B
|
cognitivecomputations_dolphin-2.9.1-yi-1.5-9b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-yi-1.5-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-yi-1.5-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-yi-1.5-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.1-yi-1.5-9b
|
91f0a521e3e2a0675a3549aa5d3f40717068de94
| 24.93479 |
apache-2.0
| 26 | 8 | true | false | false | true | 1.050866 | 0.446533 | 44.653298 | 0.548431 | 35.77609 | 0.109517 | 10.951662 | 0.338087 | 11.744966 | 0.434802 | 13.516927 | 0.396692 | 32.965795 | false | true |
2024-05-18
|
2024-08-02
| 1 |
01-ai/Yi-1.5-9B
|
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-Phi-3-Medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium
|
0470c5b912b51fa6e27d87a8ea7feafacd8cb101
| 25.668897 |
mit
| 18 | -1 | true | false | false | true | 0.840482 | 0.424776 | 42.477626 | 0.645674 | 49.72194 | 0.006042 | 0.60423 | 0.327181 | 10.290828 | 0.419052 | 11.414844 | 0.455535 | 39.503915 | false | true |
2024-05-31
|
2024-08-05
| 1 |
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium (Merge)
|
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium-abliterated_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated
|
d50be5f22ca9745a2a3175996611d6a840318b7f
| 25.590064 |
mit
| 16 | 13 | true | false | false | false | 0.843954 | 0.361254 | 36.12537 | 0.612323 | 45.441267 | 0.123867 | 12.386707 | 0.32802 | 10.402685 | 0.411177 | 10.363802 | 0.449385 | 38.820553 | false | true |
2024-06-03
|
2024-06-27
| 1 |
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated (Merge)
|
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated
|
d50be5f22ca9745a2a3175996611d6a840318b7f
| 25.618429 |
mit
| 16 | 13 | true | false | false | true | 0.820797 | 0.412361 | 41.236142 | 0.638289 | 48.385347 | 0.006798 | 0.679758 | 0.328859 | 10.514541 | 0.434927 | 13.732552 | 0.45246 | 39.162234 | false | true |
2024-06-03
|
2024-08-05
| 1 |
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated (Merge)
|
cognitivecomputations_dolphin-2.9.2-qwen2-72b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.2-qwen2-72b
|
e79582577c2bf2af304221af0e8308b7e7d46ca1
| 35.745292 |
other
| 123 | 72 | true | false | false | true | 25.115554 | 0.634378 | 63.43779 | 0.629636 | 47.696174 | 0.206193 | 20.619335 | 0.369966 | 15.995526 | 0.452073 | 17.042448 | 0.547124 | 49.680482 | false | true |
2024-05-27
|
2024-10-20
| 1 |
Qwen/Qwen2-72B
|
cognitivecomputations_dolphin-2.9.2-qwen2-7b_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.2-qwen2-7b
|
c443c4eb5138ed746ac49ed98bf3c183dc5380ac
| 21.183965 |
apache-2.0
| 64 | 7 | true | false | false | true | 1.279197 | 0.35346 | 35.345993 | 0.489383 | 27.914875 | 0.129154 | 12.915408 | 0.290268 | 5.369128 | 0.419146 | 11.659896 | 0.405086 | 33.898493 | false | true |
2024-05-24
|
2024-07-10
| 1 |
Qwen/Qwen2-7B
|
cognitivecomputations_dolphin-2.9.3-Yi-1.5-34B-32k_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.3-Yi-1.5-34B-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.3-Yi-1.5-34B-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.3-Yi-1.5-34B-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.3-Yi-1.5-34B-32k
|
ff4eee6438194a670a95dff3118b5231eb568610
| 27.073206 |
apache-2.0
| 18 | 34 | true | false | false | true | 3.245261 | 0.363927 | 36.39266 | 0.6047 | 43.406476 | 0.165408 | 16.540785 | 0.343121 | 12.416107 | 0.431052 | 13.348177 | 0.463015 | 40.335033 | false | true |
2024-06-23
|
2024-07-27
| 1 |
01-ai/Yi-1.5-34B-32k
|
cognitivecomputations_dolphin-2.9.3-mistral-7B-32k_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.3-mistral-7B-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.3-mistral-7B-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.3-mistral-7B-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.3-mistral-7B-32k
|
4f4273ee8e7930dd64e2c6121c79d12546b883e2
| 19.373872 |
apache-2.0
| 46 | 7 | true | false | false | true | 0.600083 | 0.412636 | 41.263625 | 0.481254 | 26.906354 | 0.052115 | 5.21148 | 0.285235 | 4.697987 | 0.46426 | 17.932552 | 0.282081 | 20.231235 | false | true |
2024-06-25
|
2024-07-04
| 1 |
mistralai/Mistral-7B-v0.3
|
cognitivecomputations_dolphin-2.9.3-mistral-nemo-12b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.3-mistral-nemo-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b
|
7b535c900688fc836fbeebaeb7133910b09bafda
| 24.682904 |
apache-2.0
| 86 | 12 | true | false | false | true | 1.375142 | 0.560089 | 56.008945 | 0.548037 | 36.082759 | 0.056647 | 5.664653 | 0.315436 | 8.724832 | 0.44299 | 15.207031 | 0.337683 | 26.409205 | false | true |
2024-07-23
|
2024-07-26
| 1 |
mistralai/Mistral-Nemo-Base-2407
|
cognitivecomputations_dolphin-2.9.4-gemma2-2b_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.4-gemma2-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.4-gemma2-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.4-gemma2-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.4-gemma2-2b
|
5c0854beb88a6711221771d1b13d51f733e6ca06
| 9.797441 |
gemma
| 33 | 2 | true | false | false | true | 1.511248 | 0.089551 | 8.955128 | 0.408132 | 17.367633 | 0.046828 | 4.682779 | 0.284396 | 4.58613 | 0.417969 | 10.91276 | 0.210522 | 12.280216 | false | true |
2024-08-24
|
2024-08-25
| 1 |
google/gemma-2-2b
|
cognitivecomputations_dolphin-2.9.4-llama3.1-8b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.4-llama3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.4-llama3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.4-llama3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cognitivecomputations/dolphin-2.9.4-llama3.1-8b
|
7b73d1b7760bf9abac168de3d388b30d1ca1a138
| 6.955628 |
llama3.1
| 92 | 8 | true | false | false | true | 1.756313 | 0.275724 | 27.572397 | 0.352363 | 8.972089 | 0.001511 | 0.151057 | 0.263423 | 1.789709 | 0.323615 | 0.61849 | 0.12367 | 2.630024 | false | true |
2024-08-04
|
2024-09-17
| 1 |
meta-llama/Meta-Llama-3.1-8B
|
collaiborateorg_Collaiborator-MEDLLM-Llama-3-8B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/collaiborateorg__Collaiborator-MEDLLM-Llama-3-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2
|
2560556d655d0ecaefec10f579c92292d65fb28b
| 17.951635 | 0 | 8 | false | false | false | false | 0.705789 | 0.380887 | 38.088716 | 0.464803 | 23.648503 | 0.057402 | 5.740181 | 0.333054 | 11.073826 | 0.343427 | 1.595052 | 0.348072 | 27.563534 | false | false |
2024-06-27
| 0 |
Removed
|
||
cpayne1303_cp2024_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cpayne1303/cp2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/cp2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__cp2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cpayne1303/cp2024
|
fb354aaa73c40b4f1fc6e86beea733e4f3929470
| 3.614016 |
apache-2.0
| 0 | 0 | true | false | false | false | 0.047613 | 0.165814 | 16.581448 | 0.298539 | 2.739141 | 0 | 0 | 0.255872 | 0.782998 | 0.338313 | 0.455729 | 0.110123 | 1.124778 | false | false |
2024-11-26
|
2024-11-26
| 0 |
cpayne1303/cp2024
|
cpayne1303_cp2024-instruct_bfloat16
|
bfloat16
|
🟩 continuously pretrained
|
🟩
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cpayne1303/cp2024-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/cp2024-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__cp2024-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cpayne1303/cp2024-instruct
|
ac4cfbc28479f8a94e3eb745526620be9b75edfa
| 4.319731 |
apache-2.0
| 0 | 0 | true | false | false | true | 0.032162 | 0.170611 | 17.061065 | 0.294678 | 2.4813 | 0 | 0 | 0.260067 | 1.342282 | 0.368635 | 3.179427 | 0.116689 | 1.854314 | false | false |
2024-11-27
|
2024-11-27
| 1 |
cpayne1303/cp2024
|
cpayne1303_llama-43m-beta_bfloat16
|
bfloat16
|
🟩 continuously pretrained
|
🟩
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cpayne1303/llama-43m-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/llama-43m-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__llama-43m-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cpayne1303/llama-43m-beta
|
1f85bec8c3541ed58fc2fcf4e6f98c1c34d72f60
| 5.288332 |
apache-2.0
| 0 | 0 | true | false | false | false | 0.058392 | 0.191568 | 19.156837 | 0.297678 | 2.482041 | 0 | 0 | 0.268456 | 2.46085 | 0.387177 | 6.163802 | 0.113198 | 1.46646 | false | false |
2024-11-30
|
2024-11-30
| 1 |
JackFram/llama-68m
|
cpayne1303_llama-43m-beta_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cpayne1303/llama-43m-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/llama-43m-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__llama-43m-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cpayne1303/llama-43m-beta
|
1f85bec8c3541ed58fc2fcf4e6f98c1c34d72f60
| 5.3471 |
apache-2.0
| 0 | 0 | true | false | false | false | 0.059916 | 0.194891 | 19.489067 | 0.296463 | 2.496048 | 0 | 0 | 0.268456 | 2.46085 | 0.388542 | 6.401042 | 0.11112 | 1.235594 | false | false |
2024-11-30
|
2024-12-04
| 1 |
JackFram/llama-68m
|
cpayne1303_smallcp2024_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cpayne1303/smallcp2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/smallcp2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__smallcp2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cpayne1303/smallcp2024
|
ef995127242553e4126190e7f70f927504834360
| 3.455732 |
apache-2.0
| 0 | 0 | true | false | false | false | 0.047308 | 0.158196 | 15.819581 | 0.302705 | 3.118178 | 0 | 0 | 0.230705 | 0 | 0.342469 | 0.533333 | 0.11137 | 1.263298 | false | false |
2024-11-27
|
2024-11-27
| 0 |
cpayne1303/smallcp2024
|
cstr_llama3.1-8b-spaetzle-v90_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cstr/llama3.1-8b-spaetzle-v90" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cstr/llama3.1-8b-spaetzle-v90</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cstr__llama3.1-8b-spaetzle-v90-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cstr/llama3.1-8b-spaetzle-v90
|
717e5c3d31ed2465cd7cf927327adf677a9420b5
| 27.830191 |
llama3
| 0 | 8 | true | false | false | true | 0.778908 | 0.735619 | 73.561927 | 0.530286 | 32.763666 | 0.148036 | 14.803625 | 0.282718 | 4.362416 | 0.413437 | 11.146354 | 0.373088 | 30.343159 | true | false |
2024-09-15
|
2024-09-15
| 1 |
cstr/llama3.1-8b-spaetzle-v90 (Merge)
|
cyberagent_calm3-22b-chat_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/cyberagent/calm3-22b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cyberagent/calm3-22b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cyberagent__calm3-22b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
cyberagent/calm3-22b-chat
|
055922aa0f0fb1fbfbc97a2e31134532485ee99b
| 21.37559 |
apache-2.0
| 70 | 22 | true | false | false | true | 1.774248 | 0.509131 | 50.913133 | 0.499168 | 29.520884 | 0.064955 | 6.495468 | 0.276846 | 3.579418 | 0.455323 | 16.082031 | 0.294963 | 21.662603 | false | false |
2024-07-01
|
2024-07-04
| 0 |
cyberagent/calm3-22b-chat
|
darkc0de_BuddyGlassNeverSleeps_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/darkc0de/BuddyGlassNeverSleeps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">darkc0de/BuddyGlassNeverSleeps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/darkc0de__BuddyGlassNeverSleeps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
darkc0de/BuddyGlassNeverSleeps
|
f8849498f02c94b68ef0308a7bf6637264949a7d
| 19.845818 | 2 | 8 | false | false | false | false | 1.354149 | 0.423902 | 42.390191 | 0.497723 | 28.477953 | 0.064199 | 6.41994 | 0.294463 | 5.928412 | 0.399271 | 8.608854 | 0.345246 | 27.249557 | false | false |
2024-09-16
|
2024-09-16
| 1 |
darkc0de/BuddyGlassNeverSleeps (Merge)
|
|
darkc0de_BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/darkc0de__BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp
|
57367fefe01c7d9653c303b28449b416fc777d93
| 22.265315 | 1 | 0 | false | false | false | false | 0.898182 | 0.435842 | 43.584245 | 0.524309 | 31.869311 | 0.124622 | 12.462236 | 0.298658 | 6.487696 | 0.414333 | 9.491667 | 0.367271 | 29.696735 | false | false |
2024-09-10
|
2024-09-15
| 1 |
darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp (Merge)
|
|
databricks_dbrx-instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
DbrxForCausalLM
|
<a target="_blank" href="https://huggingface.co/databricks/dbrx-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">databricks/dbrx-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/databricks__dbrx-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
databricks/dbrx-instruct
|
c0a9245908c187da8f43a81e538e67ff360904ea
| 25.19901 |
other
| 1,104 | 131 | true | false | false | true | 47.958027 | 0.54158 | 54.157968 | 0.542896 | 35.96382 | 0.068731 | 6.873112 | 0.341443 | 12.192394 | 0.426927 | 12.199219 | 0.368268 | 29.80755 | false | true |
2024-03-26
|
2024-06-12
| 0 |
databricks/dbrx-instruct
|
databricks_dolly-v1-6b_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
GPTJForCausalLM
|
<a target="_blank" href="https://huggingface.co/databricks/dolly-v1-6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">databricks/dolly-v1-6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/databricks__dolly-v1-6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
databricks/dolly-v1-6b
|
c9a85b3a322b402e20c839c702c725afe0cb454d
| 6.918291 |
cc-by-nc-4.0
| 310 | 6 | true | false | false | false | 0.66078 | 0.222443 | 22.244312 | 0.317209 | 4.781309 | 0.015106 | 1.510574 | 0.264262 | 1.901566 | 0.400417 | 8.11875 | 0.126579 | 2.953236 | false | true |
2023-03-23
|
2024-06-12
| 0 |
databricks/dolly-v1-6b
|
databricks_dolly-v2-12b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPTNeoXForCausalLM
|
<a target="_blank" href="https://huggingface.co/databricks/dolly-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">databricks/dolly-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/databricks__dolly-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
databricks/dolly-v2-12b
|
19308160448536e378e3db21a73a751579ee7fdd
| 6.383024 |
mit
| 1,950 | 12 | true | false | false | false | 1.397119 | 0.235507 | 23.550734 | 0.331997 | 6.377894 | 0.01435 | 1.435045 | 0.240772 | 0 | 0.373906 | 5.504948 | 0.112866 | 1.429521 | false | true |
2023-04-11
|
2024-06-12
| 0 |
databricks/dolly-v2-12b
|
databricks_dolly-v2-3b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPTNeoXForCausalLM
|
<a target="_blank" href="https://huggingface.co/databricks/dolly-v2-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">databricks/dolly-v2-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/databricks__dolly-v2-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
databricks/dolly-v2-3b
|
f6c9be08f16fe4d3a719bee0a4a7c7415b5c65df
| 5.461189 |
mit
| 287 | 3 | true | false | false | false | 0.758084 | 0.224716 | 22.471598 | 0.307928 | 3.324769 | 0.006798 | 0.679758 | 0.260906 | 1.454139 | 0.333781 | 3.222656 | 0.114528 | 1.614214 | false | true |
2023-04-13
|
2024-06-12
| 0 |
databricks/dolly-v2-3b
|
databricks_dolly-v2-7b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPTNeoXForCausalLM
|
<a target="_blank" href="https://huggingface.co/databricks/dolly-v2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">databricks/dolly-v2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/databricks__dolly-v2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
databricks/dolly-v2-7b
|
d632f0c8b75b1ae5b26b250d25bfba4e99cb7c6f
| 5.571832 |
mit
| 148 | 7 | true | false | false | false | 0.830206 | 0.200986 | 20.098561 | 0.317306 | 5.449893 | 0.009819 | 0.981873 | 0.268456 | 2.46085 | 0.355302 | 2.779427 | 0.114943 | 1.660387 | false | true |
2023-04-13
|
2024-06-12
| 0 |
databricks/dolly-v2-7b
|
davidkim205_Rhea-72b-v0.5_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/davidkim205/Rhea-72b-v0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">davidkim205/Rhea-72b-v0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/davidkim205__Rhea-72b-v0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
davidkim205/Rhea-72b-v0.5
|
bc3806efb23d2713e6630a748d9747fd76b27169
| 4.224031 |
apache-2.0
| 134 | 72 | true | false | false | false | 8.688691 | 0.014538 | 1.453809 | 0.307834 | 3.670747 | 0.067221 | 6.722054 | 0.252517 | 0.33557 | 0.424135 | 11.316927 | 0.116606 | 1.84508 | false | false |
2024-03-22
|
2024-09-15
| 0 |
davidkim205/Rhea-72b-v0.5
|
davidkim205_nox-solar-10.7b-v4_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/davidkim205/nox-solar-10.7b-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">davidkim205/nox-solar-10.7b-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/davidkim205__nox-solar-10.7b-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
davidkim205/nox-solar-10.7b-v4
|
5f4be6cb7d8398b84689148d15f3838f2e01e104
| 18.489145 |
apache-2.0
| 11 | 10 | true | false | false | true | 0.848976 | 0.375342 | 37.534187 | 0.481404 | 26.631088 | 0.006798 | 0.679758 | 0.307047 | 7.606264 | 0.429844 | 12.563802 | 0.333278 | 25.91977 | false | false |
2024-03-16
|
2024-10-04
| 0 |
davidkim205/nox-solar-10.7b-v4
|
deepseek-ai_deepseek-llm-67b-chat_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/deepseek-ai/deepseek-llm-67b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/deepseek-llm-67b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/deepseek-ai__deepseek-llm-67b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
deepseek-ai/deepseek-llm-67b-chat
|
79648bef7658bb824e4630740f6e1484c1b0620b
| 26.995929 |
other
| 177 | 67 | true | false | false | true | 59.821809 | 0.558715 | 55.871532 | 0.524342 | 33.225242 | 0.074018 | 7.401813 | 0.316275 | 8.836689 | 0.505865 | 23.933073 | 0.394365 | 32.707225 | false | true |
2023-11-29
|
2024-06-12
| 0 |
deepseek-ai/deepseek-llm-67b-chat
|
deepseek-ai_deepseek-llm-7b-base_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/deepseek-ai/deepseek-llm-7b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/deepseek-llm-7b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/deepseek-ai__deepseek-llm-7b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
deepseek-ai/deepseek-llm-7b-base
|
7683fea62db869066ddaff6a41d032262c490d4f
| 8.138982 |
other
| 36 | 7 | true | false | false | false | 0.822536 | 0.217872 | 21.787191 | 0.350303 | 9.767925 | 0.01435 | 1.435045 | 0.27349 | 3.131991 | 0.373781 | 3.75599 | 0.180602 | 8.955748 | false | true |
2023-11-29
|
2024-06-12
| 0 |
deepseek-ai/deepseek-llm-7b-base
|
deepseek-ai_deepseek-llm-7b-chat_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/deepseek-ai/deepseek-llm-7b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/deepseek-llm-7b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/deepseek-ai__deepseek-llm-7b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
deepseek-ai/deepseek-llm-7b-chat
|
afbda8b347ec881666061fa67447046fc5164ec8
| 14.785393 |
other
| 76 | 7 | true | false | false | true | 0.774483 | 0.417082 | 41.708223 | 0.363208 | 11.258949 | 0.018127 | 1.812689 | 0.26594 | 2.12528 | 0.466771 | 19.213021 | 0.213348 | 12.594193 | false | true |
2023-11-29
|
2024-06-12
| 0 |
deepseek-ai/deepseek-llm-7b-chat
|
deepseek-ai_deepseek-moe-16b-base_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
DeepseekForCausalLM
|
<a target="_blank" href="https://huggingface.co/deepseek-ai/deepseek-moe-16b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/deepseek-moe-16b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/deepseek-ai__deepseek-moe-16b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
deepseek-ai/deepseek-moe-16b-base
|
521d2bc4fb69a3f3ae565310fcc3b65f97af2580
| 7.390805 |
other
| 84 | 16 | true | true | false | false | 7.002465 | 0.244974 | 24.497445 | 0.340946 | 8.355556 | 0.019637 | 1.963746 | 0.254195 | 0.559284 | 0.365781 | 3.35599 | 0.150515 | 5.61281 | false | true |
2024-01-08
|
2024-06-12
| 0 |
deepseek-ai/deepseek-moe-16b-base
|