eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
meta-llama_Meta-Llama-3.1-8B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [meta-llama/Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-8B-Instruct-details) | meta-llama/Meta-Llama-3.1-8B-Instruct | df34336b42332c6d360959e259cd6271c6a09fd4 | 28.204458 | llama3.1 | 3,233 | 8 | true | false | false | true | 2.487012 | 0.785578 | 78.557782 | 0.507327 | 29.892756 | 0.193353 | 19.335347 | 0.267617 | 2.348993 | 0.38699 | 8.407031 | 0.376164 | 30.68484 | false | true | 2024-07-18 | 2024-08-15 | 1 | meta-llama/Meta-Llama-3.1-8B |
microsoft_DialoGPT-medium_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GPT2LMHeadModel | [microsoft/DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__DialoGPT-medium-details) | microsoft/DialoGPT-medium | 7b40bb0f92c45fefa957d088000d8648e5c7fa33 | 5.251434 | mit | 331 | 0 | true | false | false | true | 0.129464 | 0.147904 | 14.790423 | 0.301416 | 2.556856 | 0 | 0 | 0.254195 | 0.559284 | 0.428667 | 12.283333 | 0.111868 | 1.318706 | false | true | 2022-03-02 | 2024-06-13 | 0 | microsoft/DialoGPT-medium |
microsoft_Orca-2-13b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Orca-2-13b-details) | microsoft/Orca-2-13b | 2539ff53e6baa4cc603774ad5a2d646f4041ea4e | 18.149404 | other | 665 | 13 | true | false | false | false | 1.008582 | 0.312793 | 31.279339 | 0.488449 | 27.308019 | 0.010574 | 1.057402 | 0.280201 | 4.026846 | 0.512969 | 25.78776 | 0.274934 | 19.437057 | false | true | 2023-11-14 | 2024-06-12 | 0 | microsoft/Orca-2-13b |
microsoft_Orca-2-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Orca-2-7b-details) | microsoft/Orca-2-7b | 60e31e6bdcf582ad103b807cb74b73ee1d2c4b17 | 14.216008 | other | 216 | 7 | true | false | false | false | 1.209312 | 0.218346 | 21.834621 | 0.445213 | 22.429468 | 0.008308 | 0.830816 | 0.260906 | 1.454139 | 0.502615 | 24.09349 | 0.231882 | 14.653517 | false | true | 2023-11-14 | 2024-06-12 | 0 | microsoft/Orca-2-7b |
microsoft_Phi-3-medium-128k-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3-medium-128k-instruct](https://huggingface.co/microsoft/Phi-3-medium-128k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-medium-128k-instruct-details) | microsoft/Phi-3-medium-128k-instruct | fa7d2aa4f5ea69b2e36b20d050cdae79c9bfbb3f | 31.711653 | mit | 370 | 13 | true | false | false | true | 1.947559 | 0.604003 | 60.400293 | 0.638232 | 48.460451 | 0.172961 | 17.296073 | 0.336409 | 11.521253 | 0.412948 | 11.351823 | 0.47116 | 41.240027 | false | true | 2024-05-07 | 2024-08-21 | 0 | microsoft/Phi-3-medium-128k-instruct |
microsoft_Phi-3-medium-4k-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3-medium-4k-instruct](https://huggingface.co/microsoft/Phi-3-medium-4k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-medium-4k-instruct-details) | microsoft/Phi-3-medium-4k-instruct | d194e4e74ffad5a5e193e26af25bcfc80c7f1ffc | 32.89625 | mit | 211 | 13 | true | false | false | true | 1.455263 | 0.642271 | 64.22714 | 0.641246 | 49.38061 | 0.183535 | 18.353474 | 0.336409 | 11.521253 | 0.42575 | 13.052083 | 0.467586 | 40.842937 | false | true | 2024-05-07 | 2024-06-12 | 0 | microsoft/Phi-3-medium-4k-instruct |
microsoft_Phi-3-mini-128k-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3-mini-128k-instruct](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-mini-128k-instruct-details) | microsoft/Phi-3-mini-128k-instruct | 5be6479b4bc06a081e8f4c6ece294241ccd32dec | 25.626287 | mit | 1,608 | 3 | true | false | false | true | 24.222252 | 0.597633 | 59.763317 | 0.557453 | 37.099767 | 0.097432 | 9.743202 | 0.317953 | 9.060403 | 0.393688 | 7.710938 | 0.373421 | 30.380098 | false | true | 2024-04-22 | 2024-08-21 | 0 | microsoft/Phi-3-mini-128k-instruct |
microsoft_Phi-3-mini-4k-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-mini-4k-instruct-details) | microsoft/Phi-3-mini-4k-instruct | ff07dc01615f8113924aed013115ab2abd32115b | 25.967733 | mit | 1,086 | 3 | true | false | false | true | 0.804075 | 0.561288 | 56.128849 | 0.567597 | 39.269335 | 0.116314 | 11.63142 | 0.319631 | 9.284116 | 0.395021 | 7.644271 | 0.386636 | 31.848404 | false | true | 2024-04-22 | 2024-06-12 | 0 | microsoft/Phi-3-mini-4k-instruct |
microsoft_Phi-3-mini-4k-instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-mini-4k-instruct-details) | microsoft/Phi-3-mini-4k-instruct | c1358f8a35e6d2af81890deffbbfa575b978c62f | 27.411117 | mit | 1,086 | 3 | true | false | false | true | 0.786699 | 0.547675 | 54.767461 | 0.549072 | 36.559855 | 0.154834 | 15.483384 | 0.332215 | 10.961969 | 0.428417 | 13.11875 | 0.402178 | 33.575281 | false | true | 2024-04-22 | 2024-07-02 | 0 | microsoft/Phi-3-mini-4k-instruct |
microsoft_Phi-3-small-128k-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3SmallForCausalLM | [microsoft/Phi-3-small-128k-instruct](https://huggingface.co/microsoft/Phi-3-small-128k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-small-128k-instruct-details) | microsoft/Phi-3-small-128k-instruct | f80aaa30bfc64c2b8ab214b541d9050e97163bc4 | 28.590992 | mit | 172 | 7 | true | false | false | true | 2.508468 | 0.636826 | 63.682584 | 0.620218 | 45.63407 | 0 | 0 | 0.317114 | 8.948546 | 0.437844 | 14.497135 | 0.449053 | 38.783614 | false | true | 2024-05-07 | 2024-06-13 | 0 | microsoft/Phi-3-small-128k-instruct |
microsoft_Phi-3-small-8k-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3SmallForCausalLM | [microsoft/Phi-3-small-8k-instruct](https://huggingface.co/microsoft/Phi-3-small-8k-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-small-8k-instruct-details) | microsoft/Phi-3-small-8k-instruct | 1535ae26fb4faada95c6950e8bc6e867cdad6b00 | 29.670922 | mit | 159 | 7 | true | false | false | true | 1.025454 | 0.649665 | 64.966511 | 0.620836 | 46.20557 | 0.02843 | 2.843014 | 0.312081 | 8.277405 | 0.455792 | 16.773958 | 0.450632 | 38.959072 | false | true | 2024-05-07 | 2024-06-13 | 0 | microsoft/Phi-3-small-8k-instruct |
microsoft_Phi-3.5-MoE-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3.5-MoE-instruct](https://huggingface.co/microsoft/Phi-3.5-MoE-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3.5-MoE-instruct-details) | microsoft/Phi-3.5-MoE-instruct | 482a9ba0eb0e1fa1671e3560e009d7cec2e5147c | 35.456508 | mit | 531 | 42 | true | true | false | true | 4.632279 | 0.692455 | 69.245491 | 0.640763 | 48.774646 | 0.226586 | 22.65861 | 0.355705 | 14.09396 | 0.456479 | 17.326562 | 0.465758 | 40.639775 | false | true | 2024-08-17 | 2024-08-21 | 0 | microsoft/Phi-3.5-MoE-instruct |
microsoft_Phi-3.5-mini-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | [microsoft/Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3.5-mini-instruct-details) | microsoft/Phi-3.5-mini-instruct | 64963004ad95869fa73a30279371c8778509ac84 | 27.567573 | mit | 675 | 3 | true | false | false | true | 3.696004 | 0.57745 | 57.745005 | 0.551779 | 36.745854 | 0.159366 | 15.936556 | 0.339765 | 11.96868 | 0.402125 | 10.098958 | 0.396193 | 32.910387 | false | true | 2024-08-16 | 2024-08-21 | 0 | microsoft/Phi-3.5-mini-instruct |
microsoft_phi-1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | PhiForCausalLM | [microsoft/phi-1](https://huggingface.co/microsoft/phi-1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-1-details) | microsoft/phi-1 | b9ac0e6d78d43970ecf88e9e0154b3a7da20ed89 | 5.523966 | mit | 206 | 1 | true | false | false | false | 0.286229 | 0.206806 | 20.680572 | 0.313948 | 4.273999 | 0.006798 | 0.679758 | 0.265101 | 2.013423 | 0.35251 | 3.697135 | 0.11619 | 1.798907 | false | true | 2023-09-10 | 2024-06-13 | 0 | microsoft/phi-1 |
microsoft_phi-1_5_float16 | float16 | 🟢 pretrained | 🟢 | Original | PhiForCausalLM | [microsoft/phi-1_5](https://huggingface.co/microsoft/phi-1_5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-1_5-details) | microsoft/phi-1_5 | 675aa382d814580b22651a30acb1a585d7c25963 | 7.057674 | mit | 1,317 | 1 | true | false | false | false | 0.340862 | 0.203284 | 20.328395 | 0.335976 | 7.468939 | 0.011329 | 1.132931 | 0.267617 | 2.348993 | 0.340417 | 3.385417 | 0.169132 | 7.681368 | false | true | 2023-09-10 | 2024-06-09 | 0 | microsoft/phi-1_5 |
microsoft_phi-2_float16 | float16 | 🟢 pretrained | 🟢 | Original | PhiForCausalLM | [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-2-details) | microsoft/phi-2 | ef382358ec9e382308935a992d908de099b64c23 | 15.471351 | mit | 3,249 | 2 | true | false | false | false | 0.423521 | 0.273876 | 27.387554 | 0.488121 | 28.038519 | 0.02568 | 2.567976 | 0.271812 | 2.908277 | 0.409896 | 13.836979 | 0.262799 | 18.0888 | false | true | 2023-12-13 | 2024-06-09 | 0 | microsoft/phi-2 |
migtissera_Llama-3-70B-Synthia-v3.5_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [migtissera/Llama-3-70B-Synthia-v3.5](https://huggingface.co/migtissera/Llama-3-70B-Synthia-v3.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Llama-3-70B-Synthia-v3.5-details) | migtissera/Llama-3-70B-Synthia-v3.5 | 8744db0bccfc18f1847633da9d29fc89b35b4190 | 35.204299 | llama3 | 5 | 70 | true | false | false | true | 8.769698 | 0.60765 | 60.764992 | 0.648864 | 49.11816 | 0.189577 | 18.957704 | 0.387584 | 18.344519 | 0.492198 | 23.391406 | 0.465841 | 40.64901 | false | false | 2024-05-26 | 2024-08-28 | 0 | migtissera/Llama-3-70B-Synthia-v3.5 |
migtissera_Llama-3-8B-Synthia-v3.5_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [migtissera/Llama-3-8B-Synthia-v3.5](https://huggingface.co/migtissera/Llama-3-8B-Synthia-v3.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Llama-3-8B-Synthia-v3.5-details) | migtissera/Llama-3-8B-Synthia-v3.5 | af4990801a24fee7acf16370cb5aa5643b5e9d6c | 19.696678 | llama3 | 15 | 8 | true | false | false | true | 0.828698 | 0.506958 | 50.69582 | 0.488794 | 27.542339 | 0.050604 | 5.060423 | 0.271812 | 2.908277 | 0.404385 | 9.414844 | 0.303025 | 22.558363 | false | false | 2024-05-17 | 2024-08-28 | 0 | migtissera/Llama-3-8B-Synthia-v3.5 |
migtissera_Tess-3-7B-SFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [migtissera/Tess-3-7B-SFT](https://huggingface.co/migtissera/Tess-3-7B-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-3-7B-SFT-details) | migtissera/Tess-3-7B-SFT | 404de3b56564dbd43cd64d97f8574b43189462f3 | 17.096163 | apache-2.0 | 3 | 7 | true | false | false | true | 0.64717 | 0.394626 | 39.462626 | 0.460735 | 24.123847 | 0.033233 | 3.323263 | 0.270973 | 2.796421 | 0.411271 | 10.275521 | 0.303358 | 22.595301 | false | false | 2024-07-09 | 2024-07-20 | 1 | mistralai/Mistral-7B-v0.3 |
migtissera_Tess-3-Mistral-Nemo-12B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [migtissera/Tess-3-Mistral-Nemo-12B](https://huggingface.co/migtissera/Tess-3-Mistral-Nemo-12B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-3-Mistral-Nemo-12B-details) | migtissera/Tess-3-Mistral-Nemo-12B | 0b82dea6e8f4aed4a1c2e10198d68991c30d171b | 16.543939 | apache-2.0 | 12 | 12 | true | false | false | true | 1.889992 | 0.3355 | 33.549981 | 0.489942 | 28.042728 | 0.046828 | 4.682779 | 0.250839 | 0.111857 | 0.445781 | 15.489323 | 0.256483 | 17.386968 | false | false | 2024-08-13 | 2024-09-16 | 0 | migtissera/Tess-3-Mistral-Nemo-12B |
migtissera_Tess-v2.5-Phi-3-medium-128k-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Phi3ForCausalLM | [migtissera/Tess-v2.5-Phi-3-medium-128k-14B](https://huggingface.co/migtissera/Tess-v2.5-Phi-3-medium-128k-14B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-v2.5-Phi-3-medium-128k-14B-details) | migtissera/Tess-v2.5-Phi-3-medium-128k-14B | 3a4dbce32e765f659d418c57f0040d290b8b480d | 23.738382 | mit | 3 | 13 | true | false | false | true | 2.23817 | 0.453877 | 45.387682 | 0.620661 | 46.215828 | 0.026435 | 2.643505 | 0.307886 | 7.718121 | 0.411302 | 10.11276 | 0.373172 | 30.352394 | false | false | 2024-06-05 | 2024-08-30 | 1 | microsoft/Phi-3-medium-128k-instruct |
migtissera_Tess-v2.5.2-Qwen2-72B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [migtissera/Tess-v2.5.2-Qwen2-72B](https://huggingface.co/migtissera/Tess-v2.5.2-Qwen2-72B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-v2.5.2-Qwen2-72B-details) | migtissera/Tess-v2.5.2-Qwen2-72B | 0435e634ad9bc8b1172395a535b78e6f25f3594f | 33.276047 | other | 11 | 72 | true | false | false | true | 14.613087 | 0.449431 | 44.943084 | 0.664679 | 52.308136 | 0.274169 | 27.416918 | 0.350671 | 13.422819 | 0.418833 | 10.8875 | 0.5561 | 50.677822 | false | false | 2024-06-13 | 2024-08-10 | 0 | migtissera/Tess-v2.5.2-Qwen2-72B |
migtissera_Trinity-2-Codestral-22B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [migtissera/Trinity-2-Codestral-22B](https://huggingface.co/migtissera/Trinity-2-Codestral-22B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Trinity-2-Codestral-22B-details) | migtissera/Trinity-2-Codestral-22B | 5f20b9d8af1a75c135c70bd7295e58301cce63fc | 21.819011 | other | 11 | 22 | true | false | false | true | 1.502157 | 0.420205 | 42.020506 | 0.559324 | 36.412738 | 0.086103 | 8.610272 | 0.314597 | 8.612975 | 0.411052 | 9.614844 | 0.330785 | 25.64273 | false | false | 2024-08-07 | 2024-09-16 | 1 | mistralai/Codestral-22B-v0.1 |
migtissera_Trinity-2-Codestral-22B-v0.2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [migtissera/Trinity-2-Codestral-22B-v0.2](https://huggingface.co/migtissera/Trinity-2-Codestral-22B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Trinity-2-Codestral-22B-v0.2-details) | migtissera/Trinity-2-Codestral-22B-v0.2 | 63513c3eb9b7c552fc163f58a2e7dc1fa09573b5 | 21.869825 | other | 6 | 22 | true | false | false | true | 1.553522 | 0.434468 | 43.446832 | 0.568636 | 37.614246 | 0.083837 | 8.383686 | 0.300336 | 6.711409 | 0.404479 | 9.059896 | 0.334026 | 26.002881 | false | false | 2024-08-13 | 2024-08-28 | 1 | mistralai/Codestral-22B-v0.1 |
migtissera_Trinity-2-Codestral-22B-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [migtissera/Trinity-2-Codestral-22B-v0.2](https://huggingface.co/migtissera/Trinity-2-Codestral-22B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Trinity-2-Codestral-22B-v0.2-details) | migtissera/Trinity-2-Codestral-22B-v0.2 | 9452a82ac7bfa9092a061ec913e9078ef3525a03 | 22.1118 | other | 6 | 22 | true | false | false | true | 1.561208 | 0.443011 | 44.301121 | 0.570647 | 37.786041 | 0.07855 | 7.854985 | 0.307886 | 7.718121 | 0.403146 | 8.859896 | 0.335356 | 26.150635 | false | false | 2024-08-13 | 2024-09-16 | 1 | mistralai/Codestral-22B-v0.1 |
minghaowu_Qwen1.5-1.8B-OpenHermes-2.5_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [minghaowu/Qwen1.5-1.8B-OpenHermes-2.5](https://huggingface.co/minghaowu/Qwen1.5-1.8B-OpenHermes-2.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/minghaowu__Qwen1.5-1.8B-OpenHermes-2.5-details) | minghaowu/Qwen1.5-1.8B-OpenHermes-2.5 | 40700de82968350c192318877fe522630d0ef76d | 8.319696 | 0 | 1 | false | false | false | true | 1.0949 | 0.277797 | 27.779736 | 0.337464 | 7.561478 | 0.002266 | 0.226586 | 0.283557 | 4.474273 | 0.352885 | 1.077344 | 0.179189 | 8.798759 | false | false | 2024-09-12 | 0 | Removed | |
ministral_Ministral-3b-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [ministral/Ministral-3b-instruct](https://huggingface.co/ministral/Ministral-3b-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ministral__Ministral-3b-instruct-details) | ministral/Ministral-3b-instruct | 2c95908929198d6e69af8638f0dbbd9bc6b93f9e | 3.381613 | apache-2.0 | 33 | 3 | true | false | false | false | 0.264487 | 0.135764 | 13.576422 | 0.319186 | 4.675864 | 0 | 0 | 0.251678 | 0.223714 | 0.33825 | 0.78125 | 0.109292 | 1.032432 | false | false | 2024-03-14 | 2024-10-25 | 0 | ministral/Ministral-3b-instruct |
mistral-community_Mistral-7B-v0.2_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [mistral-community/Mistral-7B-v0.2](https://huggingface.co/mistral-community/Mistral-7B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistral-community__Mistral-7B-v0.2-details) | mistral-community/Mistral-7B-v0.2 | 2c3e624962b1a3f3fbf52e15969565caa7bc064a | 14.215362 | apache-2.0 | 232 | 7 | true | false | false | false | 0.553213 | 0.22664 | 22.663976 | 0.451019 | 23.950865 | 0.030211 | 3.021148 | 0.291946 | 5.592841 | 0.403177 | 8.363802 | 0.295296 | 21.699542 | false | true | 2024-03-23 | 2024-06-12 | 0 | mistral-community/Mistral-7B-v0.2 |
mistral-community_mixtral-8x22B-v0.3_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MixtralForCausalLM | [mistral-community/mixtral-8x22B-v0.3](https://huggingface.co/mistral-community/mixtral-8x22B-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistral-community__mixtral-8x22B-v0.3-details) | mistral-community/mixtral-8x22B-v0.3 | 211b177b79ab5ef245ee334d106c27623e786882 | 25.789407 | apache-2.0 | 3 | 140 | true | true | false | false | 52.494485 | 0.258264 | 25.826363 | 0.625 | 45.731041 | 0.182779 | 18.277946 | 0.377517 | 17.002237 | 0.403698 | 7.46224 | 0.46393 | 40.436613 | false | true | 2024-05-25 | 2024-06-13 | 0 | mistral-community/mixtral-8x22B-v0.3 |
mistralai_Codestral-22B-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | [mistralai/Codestral-22B-v0.1](https://huggingface.co/mistralai/Codestral-22B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Codestral-22B-v0.1-details) | mistralai/Codestral-22B-v0.1 | 8f5fe23af91885222a1563283c87416745a5e212 | 23.279917 | other | 1,156 | 22 | true | false | false | true | 1.30667 | 0.577175 | 57.717523 | 0.513914 | 30.737634 | 0.100453 | 10.045317 | 0.298658 | 6.487696 | 0.418708 | 10.738542 | 0.315575 | 23.952793 | false | true | 2024-05-29 | 2024-09-28 | 0 | mistralai/Codestral-22B-v0.1 |
mistralai_Ministral-8B-Instruct-2410_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [mistralai/Ministral-8B-Instruct-2410](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Ministral-8B-Instruct-2410-details) | mistralai/Ministral-8B-Instruct-2410 | 199e57c1d66379760f6413f79d27008d1d1dbd6e | 22.007859 | other | 358 | 8 | true | false | false | true | 0.797086 | 0.58964 | 58.963993 | 0.476164 | 25.824774 | 0.064955 | 6.495468 | 0.284396 | 4.58613 | 0.41375 | 10.71875 | 0.329122 | 25.458038 | false | true | 2024-10-15 | 2024-12-01 | 0 | mistralai/Ministral-8B-Instruct-2410 |
mistralai_Mistral-7B-Instruct-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-Instruct-v0.1-details) | mistralai/Mistral-7B-Instruct-v0.1 | 73068f3702d050a2fd5aa2ca1e612e5036429398 | 12.695701 | apache-2.0 | 1,534 | 7 | true | false | false | true | 1.216045 | 0.448706 | 44.87061 | 0.335481 | 7.647021 | 0.018127 | 1.812689 | 0.25 | 0 | 0.38476 | 6.128385 | 0.241439 | 15.715499 | false | true | 2023-09-27 | 2024-06-27 | 1 | mistralai/Mistral-7B-v0.1 |
mistralai_Mistral-7B-Instruct-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-Instruct-v0.2-details) | mistralai/Mistral-7B-Instruct-v0.2 | 41b61a33a2483885c981aa79e0df6b32407ed873 | 18.457539 | apache-2.0 | 2,586 | 7 | true | false | false | true | 0.534407 | 0.549623 | 54.962278 | 0.445974 | 22.910602 | 0.02719 | 2.719033 | 0.276007 | 3.467562 | 0.396604 | 7.608854 | 0.271692 | 19.076906 | false | true | 2023-12-11 | 2024-06-12 | 0 | mistralai/Mistral-7B-Instruct-v0.2 |
mistralai_Mistral-7B-Instruct-v0.3_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-Instruct-v0.3-details) | mistralai/Mistral-7B-Instruct-v0.3 | 83e9aa141f2e28c82232fea5325f54edf17c43de | 19.174746 | apache-2.0 | 1,175 | 7 | true | false | false | true | 0.537783 | 0.546525 | 54.652544 | 0.472196 | 25.569115 | 0.035498 | 3.549849 | 0.279362 | 3.914989 | 0.373906 | 4.304948 | 0.307513 | 23.057033 | false | true | 2024-05-22 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.3 |
mistralai_Mistral-7B-v0.1_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-7B-v0.1
|
26bca36bde8333b5d7f72e9ed20ccda6a618af24
| 14.562619 |
apache-2.0
| 3,480 | 7 | true | false | false | false | 0.675534 | 0.238555 | 23.855481 | 0.443107 | 22.168402 | 0.02719 | 2.719033 | 0.291946 | 5.592841 | 0.413938 | 10.675521 | 0.30128 | 22.364436 | false | true |
2023-09-20
|
2024-06-12
| 0 |
mistralai/Mistral-7B-v0.1
|
mistralai_Mistral-7B-v0.3_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-7B-v0.3
|
b67d6a03ca097c5122fa65904fce0413500bf8c8
| 14.215362 |
apache-2.0
| 400 | 7 | true | false | false | false | 0.660476 | 0.22664 | 22.663976 | 0.451019 | 23.950865 | 0.030211 | 3.021148 | 0.291946 | 5.592841 | 0.403177 | 8.363802 | 0.295296 | 21.699542 | false | true |
2024-05-22
|
2024-06-12
| 0 |
mistralai/Mistral-7B-v0.3
|
mistralai_Mistral-Large-Instruct-2411_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Large-Instruct-2411</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Large-Instruct-2411-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-Large-Instruct-2411
|
3a5cb136f6106edf5c1210369068eb5a4f787cab
| 38.455231 |
other
| 163 | 122 | true | false | false | true | 26.272305 | 0.840058 | 84.005771 | 0.674665 | 52.744892 | 0.011329 | 1.132931 | 0.437081 | 24.944072 | 0.454 | 17.216667 | 0.556184 | 50.687057 | false | true |
2024-11-14
|
2024-11-19
| 0 |
mistralai/Mistral-Large-Instruct-2411
|
mistralai_Mistral-Nemo-Base-2407_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Nemo-Base-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Nemo-Base-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Nemo-Base-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-Nemo-Base-2407
|
d2efb15544d5401f761235bef327babb850887d0
| 15.138651 |
apache-2.0
| 263 | 11 | true | false | false | false | 1.702995 | 0.162992 | 16.299197 | 0.503506 | 29.374736 | 0.053625 | 5.362538 | 0.293624 | 5.816555 | 0.392135 | 6.516927 | 0.347158 | 27.461953 | false | true |
2024-07-18
|
2024-07-19
| 0 |
mistralai/Mistral-Nemo-Base-2407
|
mistralai_Mistral-Nemo-Instruct-2407_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Nemo-Instruct-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Nemo-Instruct-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-Nemo-Instruct-2407
|
4d14c1db68fe20dbf80b8eca85d39b909c5fe1d5
| 23.633374 |
apache-2.0
| 1,285 | 12 | true | false | false | true | 2.997601 | 0.638025 | 63.802489 | 0.503652 | 29.67997 | 0.064955 | 6.495468 | 0.290268 | 5.369128 | 0.39 | 8.483333 | 0.351729 | 27.969858 | false | true |
2024-07-17
|
2024-08-29
| 1 |
mistralai/Mistral-Nemo-Base-2407
|
mistralai_Mistral-Small-Instruct-2409_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-Instruct-2409" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-Instruct-2409</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Small-Instruct-2409-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-Small-Instruct-2409
|
63e53df6575e7085d62113f4383835ff979b3795
| 26.262749 |
other
| 356 | 22 | true | false | false | true | 1.379338 | 0.666976 | 66.697585 | 0.521308 | 30.792096 | 0.143505 | 14.350453 | 0.323826 | 9.8434 | 0.363208 | 3.001042 | 0.396027 | 32.891918 | false | true |
2024-09-17
|
2024-09-19
| 0 |
mistralai/Mistral-Small-Instruct-2409
|
mistralai_Mistral-Small-Instruct-2409_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-Instruct-2409" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-Instruct-2409</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Small-Instruct-2409-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mistral-Small-Instruct-2409
|
63e53df6575e7085d62113f4383835ff979b3795
| 29.818243 |
other
| 356 | 22 | true | false | false | false | 1.610007 | 0.628283 | 62.828296 | 0.583028 | 40.559713 | 0.197885 | 19.78852 | 0.333054 | 11.073826 | 0.406333 | 10.225 | 0.409907 | 34.434102 | false | true |
2024-09-17
|
2024-09-25
| 0 |
mistralai/Mistral-Small-Instruct-2409
|
mistralai_Mixtral-8x22B-Instruct-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x22B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x22B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mixtral-8x22B-Instruct-v0.1
|
b0c3516041d014f640267b14feb4e9a84c8e8c71
| 33.88568 |
apache-2.0
| 693 | 140 | true | true | false | true | 47.147579 | 0.718358 | 71.83584 | 0.612492 | 44.114346 | 0.187311 | 18.731118 | 0.373322 | 16.442953 | 0.431115 | 13.489323 | 0.448305 | 38.700502 | false | true |
2024-04-16
|
2024-06-12
| 1 |
mistralai/Mixtral-8x22B-v0.1
|
mistralai_Mixtral-8x22B-v0.1_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x22B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mixtral-8x22B-v0.1
|
b03e260818710044a2f088d88fab12bb220884fb
| 25.728348 |
apache-2.0
| 201 | 140 | true | true | false | false | 104.697316 | 0.258264 | 25.826363 | 0.623981 | 45.588404 | 0.182779 | 18.277946 | 0.375839 | 16.778523 | 0.403698 | 7.46224 | 0.46393 | 40.436613 | false | true |
2024-04-16
|
2024-06-12
| 0 |
mistralai/Mixtral-8x22B-v0.1
|
mistralai_Mixtral-8x7B-Instruct-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x7B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x7B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mixtral-8x7B-Instruct-v0.1
|
1e637f2d7cb0a9d6fb1922f305cb784995190a83
| 23.842279 |
apache-2.0
| 4,218 | 46 | true | true | false | true | 13.764939 | 0.559914 | 55.991436 | 0.496237 | 29.742398 | 0.0929 | 9.29003 | 0.302852 | 7.04698 | 0.420323 | 11.073698 | 0.369182 | 29.909131 | false | true |
2023-12-10
|
2024-06-12
| 1 |
mistralai/Mixtral-8x7B-v0.1
|
mistralai_Mixtral-8x7B-v0.1_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mixtral-8x7B-v0.1
|
985aa055896a8f943d4a9f2572e6ea1341823841
| 19.451988 |
apache-2.0
| 1,651 | 46 | true | true | false | false | 18.387865 | 0.241527 | 24.152693 | 0.508667 | 30.294195 | 0.095166 | 9.516616 | 0.313758 | 8.501119 | 0.432135 | 12.583594 | 0.384973 | 31.663712 | false | true |
2023-12-01
|
2024-08-20
| 0 |
mistralai/Mixtral-8x7B-v0.1
|
mistralai_Mixtral-8x7B-v0.1_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mistralai/Mixtral-8x7B-v0.1
|
985aa055896a8f943d4a9f2572e6ea1341823841
| 19.665109 |
apache-2.0
| 1,651 | 46 | true | true | false | false | 5.1351 | 0.232609 | 23.260948 | 0.509771 | 30.400299 | 0.093656 | 9.365559 | 0.32047 | 9.395973 | 0.441313 | 13.664063 | 0.387134 | 31.903812 | false | true |
2023-12-01
|
2024-06-27
| 0 |
mistralai/Mixtral-8x7B-v0.1
|
mixtao_MixTAO-7Bx2-MoE-v8.1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mixtao/MixTAO-7Bx2-MoE-v8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mixtao/MixTAO-7Bx2-MoE-v8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mixtao__MixTAO-7Bx2-MoE-v8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mixtao/MixTAO-7Bx2-MoE-v8.1
|
339130b87b6ef2484fea9fbfacba8a714ac03347
| 21.077927 |
apache-2.0
| 55 | 12 | true | true | false | false | 0.924035 | 0.416233 | 41.623337 | 0.518906 | 32.310342 | 0.090634 | 9.063444 | 0.284396 | 4.58613 | 0.446333 | 15.291667 | 0.312334 | 23.592642 | false | false |
2024-02-26
|
2024-10-04
| 0 |
mixtao/MixTAO-7Bx2-MoE-v8.1
|
mkxu_llama-3-8b-po1_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mkxu/llama-3-8b-po1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mkxu/llama-3-8b-po1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mkxu__llama-3-8b-po1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mkxu/llama-3-8b-po1
|
1b16e10de696c43cd2b49fac9f6195dc551438ee
| 19.767002 | 0 | 8 | false | false | false | false | 0.512188 | 0.408115 | 40.811491 | 0.497609 | 29.181759 | 0.070242 | 7.024169 | 0.29698 | 6.263982 | 0.380417 | 6.852083 | 0.356217 | 28.468528 | false | false |
2024-11-29
|
2024-11-29
| 0 |
mkxu/llama-3-8b-po1
|
|
mlabonne_AlphaMonarch-7B_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/AlphaMonarch-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/AlphaMonarch-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__AlphaMonarch-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/AlphaMonarch-7B
|
3de065d84411d74e5b3590f67f52b0b71faf6161
| 17.655797 |
cc-by-nc-4.0
| 148 | 7 | true | false | false | true | 0.572572 | 0.493944 | 49.394385 | 0.462552 | 23.947378 | 0.042296 | 4.229607 | 0.270134 | 2.684564 | 0.412135 | 9.316927 | 0.247257 | 16.361924 | true | true |
2024-02-14
|
2024-06-12
| 1 |
mlabonne/AlphaMonarch-7B (Merge)
|
mlabonne_Beyonder-4x7B-v3_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/Beyonder-4x7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/Beyonder-4x7B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Beyonder-4x7B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/Beyonder-4x7B-v3
|
8e923fa480f511ab54d79b44b0487768bdd3de4e
| 19.381682 |
cc-by-nc-4.0
| 58 | 24 | true | true | false | true | 1.386303 | 0.560839 | 56.083857 | 0.467052 | 24.557209 | 0.052115 | 5.21148 | 0.285235 | 4.697987 | 0.404542 | 8.934375 | 0.251247 | 16.805186 | true | true |
2024-03-21
|
2024-06-12
| 1 |
mlabonne/Beyonder-4x7B-v3 (Merge)
|
mlabonne_BigQwen2.5-52B-Instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/BigQwen2.5-52B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/BigQwen2.5-52B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__BigQwen2.5-52B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/BigQwen2.5-52B-Instruct
|
425b9bffc9871085cc0d42c34138ce776f96ba02
| 37.558281 |
apache-2.0
| 2 | 52 | true | false | false | true | 27.563508 | 0.792872 | 79.28718 | 0.7121 | 59.809607 | 0.186556 | 18.655589 | 0.302013 | 6.935123 | 0.411302 | 10.446094 | 0.551945 | 50.21609 | true | true |
2024-09-23
|
2024-09-25
| 1 |
mlabonne/BigQwen2.5-52B-Instruct (Merge)
|
mlabonne_BigQwen2.5-Echo-47B-Instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/BigQwen2.5-Echo-47B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/BigQwen2.5-Echo-47B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__BigQwen2.5-Echo-47B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/BigQwen2.5-Echo-47B-Instruct
|
f95fcf22f8ab87c2dbb1893b87c8a132820acb5e
| 30.322429 |
apache-2.0
| 3 | 47 | true | false | false | true | 8.523077 | 0.735669 | 73.566914 | 0.612511 | 44.522244 | 0.035498 | 3.549849 | 0.314597 | 8.612975 | 0.412479 | 10.193229 | 0.473404 | 41.489362 | true | true |
2024-09-23
|
2024-09-24
| 1 |
mlabonne/BigQwen2.5-Echo-47B-Instruct (Merge)
|
mlabonne_ChimeraLlama-3-8B-v2_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/ChimeraLlama-3-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/ChimeraLlama-3-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__ChimeraLlama-3-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/ChimeraLlama-3-8B-v2
|
d90a12b1574d7be084e53e0ad610282638ab29cf
| 20.133093 |
other
| 14 | 8 | true | false | false | false | 0.83741 | 0.446883 | 44.688316 | 0.50456 | 28.478796 | 0.09139 | 9.138973 | 0.285235 | 4.697987 | 0.379083 | 5.252083 | 0.356882 | 28.542405 | true | true |
2024-04-22
|
2024-08-25
| 1 |
mlabonne/ChimeraLlama-3-8B-v2 (Merge)
|
mlabonne_ChimeraLlama-3-8B-v3_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/ChimeraLlama-3-8B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/ChimeraLlama-3-8B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__ChimeraLlama-3-8B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/ChimeraLlama-3-8B-v3
|
c8c1787e1426e3979ae82134f4eb7fa332f58ae0
| 20.684542 |
other
| 15 | 8 | true | false | false | false | 0.82374 | 0.440788 | 44.078822 | 0.497819 | 27.646094 | 0.087613 | 8.761329 | 0.291946 | 5.592841 | 0.400354 | 8.377604 | 0.366855 | 29.650561 | true | true |
2024-05-01
|
2024-08-25
| 1 |
mlabonne/ChimeraLlama-3-8B-v3 (Merge)
|
mlabonne_Daredevil-8B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/Daredevil-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/Daredevil-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Daredevil-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/Daredevil-8B
|
717953c83631cc9adf2dddccfff06739308f10f7
| 22.308941 |
other
| 36 | 8 | true | false | false | true | 1.511915 | 0.454777 | 45.477666 | 0.519441 | 31.626855 | 0.100453 | 10.045317 | 0.307886 | 7.718121 | 0.393875 | 7.534375 | 0.383062 | 31.451315 | true | true |
2024-05-25
|
2024-07-02
| 1 |
mlabonne/Daredevil-8B (Merge)
|
mlabonne_Daredevil-8B-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/Daredevil-8B-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/Daredevil-8B-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Daredevil-8B-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/Daredevil-8B-abliterated
|
034c0ce8ceeba075d1dff2bac1b113a017c79390
| 19.72576 |
other
| 33 | 8 | true | false | false | true | 1.198362 | 0.442637 | 44.263665 | 0.425427 | 19.865777 | 0.096677 | 9.667674 | 0.290268 | 5.369128 | 0.407021 | 9.177604 | 0.370096 | 30.010712 | false | true |
2024-05-26
|
2024-07-02
| 0 |
mlabonne/Daredevil-8B-abliterated
|
mlabonne_Hermes-3-Llama-3.1-70B-lorablated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/Hermes-3-Llama-3.1-70B-lorablated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/Hermes-3-Llama-3.1-70B-lorablated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Hermes-3-Llama-3.1-70B-lorablated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/Hermes-3-Llama-3.1-70B-lorablated
|
4303ff3b524418e9aa5e787d60595a44a6173b02
| 31.544445 | 25 | 70 | true | false | false | false | 30.879892 | 0.342444 | 34.244361 | 0.669317 | 52.750073 | 0.212236 | 21.223565 | 0.365772 | 15.436242 | 0.502927 | 24.732552 | 0.467919 | 40.879876 | true | true |
2024-08-16
|
2024-11-27
| 1 |
mlabonne/Hermes-3-Llama-3.1-70B-lorablated (Merge)
|
|
mlabonne_Meta-Llama-3.1-8B-Instruct-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Meta-Llama-3.1-8B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated
|
aef878bdf42c119d007322967006fcdef5ae6ee1
| 23.290888 |
llama3.1
| 139 | 8 | true | false | false | true | 2.45405 | 0.73447 | 73.447009 | 0.487406 | 27.129165 | 0.072508 | 7.250755 | 0.256711 | 0.894855 | 0.364885 | 3.210677 | 0.350316 | 27.812869 | false | true |
2024-07-24
|
2024-10-13
| 2 |
meta-llama/Meta-Llama-3.1-8B
|
mlabonne_NeuralBeagle14-7B_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/NeuralBeagle14-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/NeuralBeagle14-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__NeuralBeagle14-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/NeuralBeagle14-7B
|
1567ad618a0998139654cb355738bb9bc018ca64
| 18.947841 |
cc-by-nc-4.0
| 158 | 7 | true | false | false | true | 0.671707 | 0.493519 | 49.351942 | 0.462787 | 23.959695 | 0.054381 | 5.438066 | 0.281879 | 4.250559 | 0.431948 | 12.89349 | 0.26014 | 17.793292 | true | true |
2024-01-15
|
2024-06-27
| 2 |
mlabonne/Beagle14-7B (Merge)
|
mlabonne_NeuralDaredevil-8B-abliterated_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/NeuralDaredevil-8B-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/NeuralDaredevil-8B-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__NeuralDaredevil-8B-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/NeuralDaredevil-8B-abliterated
|
2f4a5e8a8522f19dff345c7189b7891468763061
| 27.148976 |
llama3
| 157 | 8 | true | false | false | true | 2.707744 | 0.756077 | 75.607721 | 0.511057 | 30.307986 | 0.088369 | 8.836858 | 0.306208 | 7.494407 | 0.401938 | 9.075521 | 0.384142 | 31.571365 | false | true |
2024-05-27
|
2024-07-25
| 0 |
mlabonne/NeuralDaredevil-8B-abliterated
|
mlabonne_NeuralDaredevil-8B-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/NeuralDaredevil-8B-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/NeuralDaredevil-8B-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__NeuralDaredevil-8B-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/NeuralDaredevil-8B-abliterated
|
89b01e3292e031ed85ad21545849182f5627021e
| 21.499914 |
llama3
| 157 | 8 | true | false | false | false | 0.985007 | 0.416233 | 41.623337 | 0.512396 | 29.763198 | 0.085347 | 8.534743 | 0.302852 | 7.04698 | 0.414958 | 10.903125 | 0.380153 | 31.128103 | false | true |
2024-05-27
|
2024-06-27
| 0 |
mlabonne/NeuralDaredevil-8B-abliterated
|
mlabonne_OrpoLlama-3-8B_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/OrpoLlama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/OrpoLlama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__OrpoLlama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/OrpoLlama-3-8B
|
7f200e4c84ad0daa3ff6bc414012d8d0bacbf90e
| 14.980803 |
other
| 54 | 8 | true | false | false | true | 0.8903 | 0.365275 | 36.527525 | 0.442408 | 21.954108 | 0.045317 | 4.531722 | 0.279362 | 3.914989 | 0.357938 | 4.008854 | 0.270529 | 18.947621 | false | true |
2024-04-18
|
2024-06-12
| 1 |
meta-llama/Meta-Llama-3-8B
|
mlabonne_phixtral-2x2_8_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
PhiForCausalLM
|
<a target="_blank" href="https://huggingface.co/mlabonne/phixtral-2x2_8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/phixtral-2x2_8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__phixtral-2x2_8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mlabonne/phixtral-2x2_8
|
7744a977d83f132ae5808d8c3b70157031f7de44
| 15.464997 |
mit
| 146 | 4 | true | true | false | true | 0.960951 | 0.343118 | 34.311848 | 0.488859 | 28.502645 | 0.030211 | 3.021148 | 0.265101 | 2.013423 | 0.364354 | 7.710938 | 0.25507 | 17.229979 | false | true |
2024-01-07
|
2024-06-12
| 0 |
mlabonne/phixtral-2x2_8
|
mmnga_Llama-3-70B-japanese-suzume-vector-v0.1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/mmnga/Llama-3-70B-japanese-suzume-vector-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mmnga/Llama-3-70B-japanese-suzume-vector-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mmnga__Llama-3-70B-japanese-suzume-vector-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mmnga/Llama-3-70B-japanese-suzume-vector-v0.1
|
16f98b2d45684af2c4a9ff5da75b00ef13cca808
| 30.883569 |
llama3
| 4 | 70 | true | false | false | true | 16.097125 | 0.464893 | 46.489315 | 0.654176 | 50.022661 | 0.26284 | 26.283988 | 0.286074 | 4.809843 | 0.414063 | 10.757813 | 0.52244 | 46.937796 | false | false |
2024-04-28
|
2024-09-19
| 0 |
mmnga/Llama-3-70B-japanese-suzume-vector-v0.1
|
moeru-ai_L3.1-Moe-2x8B-v0.2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/moeru-ai/L3.1-Moe-2x8B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">moeru-ai/L3.1-Moe-2x8B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/moeru-ai__L3.1-Moe-2x8B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
moeru-ai/L3.1-Moe-2x8B-v0.2
|
1a0b4d4d1e839e332c67c9c16a2fc1f7ccc7f81e
| 28.588568 |
llama3.1
| 6 | 13 | true | true | false | true | 1.926568 | 0.734795 | 73.479479 | 0.525569 | 32.945891 | 0.152568 | 15.256798 | 0.300336 | 6.711409 | 0.419854 | 11.381771 | 0.385805 | 31.756058 | true | false |
2024-10-25
|
2024-10-25
| 1 |
moeru-ai/L3.1-Moe-2x8B-v0.2 (Merge)
|
moeru-ai_L3.1-Moe-4x8B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/moeru-ai/L3.1-Moe-4x8B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">moeru-ai/L3.1-Moe-4x8B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/moeru-ai__L3.1-Moe-4x8B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
moeru-ai/L3.1-Moe-4x8B-v0.1
|
f8d477fad4c02c099c80ef38865c01e2c882e996
| 19.126854 |
llama3.1
| 3 | 24 | true | true | false | true | 5.805487 | 0.433219 | 43.321941 | 0.493928 | 27.856765 | 0.111027 | 11.102719 | 0.259228 | 1.230425 | 0.360917 | 3.98125 | 0.345412 | 27.268026 | true | false |
2024-10-23
|
2024-10-23
| 1 |
moeru-ai/L3.1-Moe-4x8B-v0.1 (Merge)
|
moeru-ai_L3.1-Moe-4x8B-v0.2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/moeru-ai/L3.1-Moe-4x8B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">moeru-ai/L3.1-Moe-4x8B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/moeru-ai__L3.1-Moe-4x8B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
moeru-ai/L3.1-Moe-4x8B-v0.2
|
fab49d865eb51f00e955c5624712184c39d207c9
| 17.467109 |
llama3.1
| 2 | 24 | true | true | false | true | 3.422218 | 0.540655 | 54.065546 | 0.446626 | 21.337007 | 0.05287 | 5.287009 | 0.266779 | 2.237136 | 0.323396 | 2.291146 | 0.276263 | 19.584811 | true | false |
2024-10-30
|
2024-10-30
| 1 |
moeru-ai/L3.1-Moe-4x8B-v0.2 (Merge)
|
monsterapi_Llama-3_1-8B-Instruct-orca-ORPO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/monsterapi/Llama-3_1-8B-Instruct-orca-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">monsterapi/Llama-3_1-8B-Instruct-orca-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/monsterapi__Llama-3_1-8B-Instruct-orca-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
monsterapi/Llama-3_1-8B-Instruct-orca-ORPO
|
5206a32e0bd3067aef1ce90f5528ade7d866253f
| 4.832138 |
apache-2.0
| 2 | 16 | true | false | false | true | 1.53268 | 0.227289 | 22.728915 | 0.286536 | 1.340469 | 0 | 0 | 0.249161 | 0 | 0.344479 | 3.059896 | 0.116772 | 1.863549 | false | false |
2024-08-01
|
2024-08-30
| 2 |
meta-llama/Meta-Llama-3.1-8B
|
monsterapi_gemma-2-2b-LoRA-MonsterInstruct_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/monsterapi/gemma-2-2b-LoRA-MonsterInstruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">monsterapi/gemma-2-2b-LoRA-MonsterInstruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/monsterapi__gemma-2-2b-LoRA-MonsterInstruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
monsterapi/gemma-2-2b-LoRA-MonsterInstruct
|
6422e27e96e15cf93b966c973aacc15f8a27a458
| 11.865291 |
gemma
| 0 | 2 | true | false | false | true | 1.338688 | 0.390255 | 39.025452 | 0.364969 | 11.965057 | 0.011329 | 1.132931 | 0.270134 | 2.684564 | 0.364385 | 5.414844 | 0.19872 | 10.968898 | false | false |
2024-08-03
|
2024-08-05
| 0 |
monsterapi/gemma-2-2b-LoRA-MonsterInstruct
|
mosaicml_mpt-7b_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
MPTForCausalLM
|
<a target="_blank" href="https://huggingface.co/mosaicml/mpt-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mosaicml/mpt-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mosaicml__mpt-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
mosaicml/mpt-7b
|
039e37745f00858f0e01e988383a8c4393b1a4f5
| 5.994265 |
apache-2.0
| 1,163 | 7 | true | false | false | false | 0.643503 | 0.215199 | 21.519901 | 0.329974 | 6.550601 | 0.013595 | 1.359517 | 0.260067 | 1.342282 | 0.36724 | 2.904948 | 0.120595 | 2.288342 | false | true |
2023-05-05
|
2024-06-08
| 0 |
mosaicml/mpt-7b
|
natong19_Mistral-Nemo-Instruct-2407-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/natong19/Mistral-Nemo-Instruct-2407-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">natong19/Mistral-Nemo-Instruct-2407-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/natong19__Mistral-Nemo-Instruct-2407-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
natong19/Mistral-Nemo-Instruct-2407-abliterated
|
9c7087f62e6ab10ec4aeeb268e25cb3d4000696b
| 23.872107 |
apache-2.0
| 9 | 12 | true | false | false | true | 1.237857 | 0.639224 | 63.922393 | 0.504845 | 29.915044 | 0.063444 | 6.344411 | 0.286913 | 4.9217 | 0.403333 | 10.15 | 0.351812 | 27.979093 | false | false |
2024-08-15
|
2024-09-21
| 0 |
natong19/Mistral-Nemo-Instruct-2407-abliterated
|
natong19_Qwen2-7B-Instruct-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/natong19/Qwen2-7B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">natong19/Qwen2-7B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/natong19__Qwen2-7B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
natong19/Qwen2-7B-Instruct-abliterated
|
127962453ae87879719a82a97384ac1859787a25
| 25.758544 |
apache-2.0
| 5 | 7 | true | false | false | true | 1.075836 | 0.583695 | 58.36946 | 0.555304 | 37.746834 | 0.111027 | 11.102719 | 0.301174 | 6.823266 | 0.403427 | 8.928385 | 0.384225 | 31.5806 | false | false |
2024-06-14
|
2024-07-29
| 0 |
natong19/Qwen2-7B-Instruct-abliterated
|
nazimali_Mistral-Nemo-Kurdish_bfloat16
|
bfloat16
|
🟩 continuously pretrained
|
🟩
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nazimali/Mistral-Nemo-Kurdish" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nazimali/Mistral-Nemo-Kurdish</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nazimali__Mistral-Nemo-Kurdish-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nazimali/Mistral-Nemo-Kurdish
|
1eb544577a2874d8df0b77ca83ff1c88dd20f481
| 19.368945 |
apache-2.0
| 2 | 12 | true | false | false | false | 1.849727 | 0.340121 | 34.012088 | 0.513332 | 29.855897 | 0.089124 | 8.912387 | 0.301174 | 6.823266 | 0.411573 | 11.779948 | 0.323471 | 24.830083 | false | false |
2024-10-09
|
2024-10-14
| 1 |
nazimali/Mistral-Nemo-Kurdish (Merge)
|
nazimali_Mistral-Nemo-Kurdish-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nazimali/Mistral-Nemo-Kurdish-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nazimali/Mistral-Nemo-Kurdish-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nazimali__Mistral-Nemo-Kurdish-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nazimali/Mistral-Nemo-Kurdish-Instruct
|
512140572f11203441e60ca26b5ede2b9979cb1d
| 18.555958 |
apache-2.0
| 2 | 12 | true | false | false | true | 1.702117 | 0.496392 | 49.63918 | 0.469942 | 25.561423 | 0.004532 | 0.453172 | 0.282718 | 4.362416 | 0.397875 | 8.401042 | 0.306267 | 22.918514 | false | false |
2024-10-09
|
2024-10-14
| 1 |
nazimali/Mistral-Nemo-Kurdish-Instruct (Merge)
|
nazimali_Mistral-Nemo-Kurdish-Instruct_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nazimali/Mistral-Nemo-Kurdish-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nazimali/Mistral-Nemo-Kurdish-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nazimali__Mistral-Nemo-Kurdish-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nazimali/Mistral-Nemo-Kurdish-Instruct
|
512140572f11203441e60ca26b5ede2b9979cb1d
| 18.589106 |
apache-2.0
| 2 | 12 | true | false | false | true | 1.751721 | 0.486 | 48.600048 | 0.472144 | 26.021741 | 0.003021 | 0.302115 | 0.284396 | 4.58613 | 0.400573 | 8.838281 | 0.308677 | 23.186318 | false | false |
2024-10-09
|
2024-10-14
| 1 |
nazimali/Mistral-Nemo-Kurdish-Instruct (Merge)
|
nbeerbower_Flammades-Mistral-Nemo-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Flammades-Mistral-Nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Flammades-Mistral-Nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Flammades-Mistral-Nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Flammades-Mistral-Nemo-12B
|
ddc76d1976af06aedc7f06bbffcaa34166c1cbdd
| 22.466019 |
apache-2.0
| 2 | 12 | true | false | false | false | 1.62742 | 0.38416 | 38.415959 | 0.529961 | 32.393772 | 0.069486 | 6.94864 | 0.303691 | 7.158837 | 0.480625 | 20.311458 | 0.366107 | 29.56745 | false | false |
2024-10-05
|
2024-10-06
| 1 |
nbeerbower/Flammades-Mistral-Nemo-12B (Merge)
|
nbeerbower_Gemma2-Gutenberg-Doppel-9B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Gemma2-Gutenberg-Doppel-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Gemma2-Gutenberg-Doppel-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Gemma2-Gutenberg-Doppel-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Gemma2-Gutenberg-Doppel-9B
|
f425bc69783891088e89e0afe44ec62b730567ba
| 29.835999 |
gemma
| 4 | 9 | true | false | false | false | 1.947316 | 0.717109 | 71.710949 | 0.587011 | 41.083063 | 0.035498 | 3.549849 | 0.329698 | 10.626398 | 0.460781 | 17.297656 | 0.412733 | 34.748079 | false | false |
2024-09-29
|
2024-10-01
| 1 |
nbeerbower/Gemma2-Gutenberg-Doppel-9B (Merge)
|
nbeerbower_Gutensuppe-mistral-nemo-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Gutensuppe-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Gutensuppe-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Gutensuppe-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Gutensuppe-mistral-nemo-12B
|
6ee13f347071bc3c4ee95c9dc3488a4093927143
| 22.080224 | 5 | 12 | false | false | false | false | 1.556249 | 0.291611 | 29.16107 | 0.548683 | 35.569348 | 0.120091 | 12.009063 | 0.337248 | 11.63311 | 0.429031 | 14.328906 | 0.368019 | 29.779846 | false | false |
2024-08-23
|
2024-09-03
| 1 |
nbeerbower/Gutensuppe-mistral-nemo-12B (Merge)
|
|
nbeerbower_Hermes2-Gutenberg2-Mistral-7B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Hermes2-Gutenberg2-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Hermes2-Gutenberg2-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Hermes2-Gutenberg2-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Hermes2-Gutenberg2-Mistral-7B
|
5eec0dfd29999ef1d7775010b7e9c7be9ed89bfd
| 19.376449 |
apache-2.0
| 2 | 7 | true | false | false | false | 0.581121 | 0.372145 | 37.21448 | 0.498145 | 28.907335 | 0.058157 | 5.81571 | 0.28943 | 5.257271 | 0.462302 | 16.921094 | 0.299285 | 22.142804 | false | false |
2024-09-30
|
2024-10-01
| 1 |
nbeerbower/Hermes2-Gutenberg2-Mistral-7B (Merge)
|
nbeerbower_Llama-3.1-Nemotron-lorablated-70B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Llama-3.1-Nemotron-lorablated-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Llama-3.1-Nemotron-lorablated-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Llama-3.1-Nemotron-lorablated-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Llama-3.1-Nemotron-lorablated-70B
|
f335a582cdb7fb0e63a7343a908766ebd0ed9882
| 40.725393 |
llama3.1
| 12 | 70 | true | false | false | false | 41.991923 | 0.72288 | 72.287974 | 0.682505 | 54.182581 | 0.324773 | 32.477341 | 0.39094 | 18.791946 | 0.468167 | 18.354167 | 0.534325 | 48.258348 | true | false |
2024-10-17
|
2024-11-27
| 1 |
nbeerbower/Llama-3.1-Nemotron-lorablated-70B (Merge)
|
nbeerbower_Llama3.1-Gutenberg-Doppel-70B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Llama3.1-Gutenberg-Doppel-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Llama3.1-Gutenberg-Doppel-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Llama3.1-Gutenberg-Doppel-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Llama3.1-Gutenberg-Doppel-70B
|
5de156e97f776ce1b88ce5b2e2dc1e7709205a82
| 35.803048 |
llama3.1
| 5 | 70 | true | false | false | true | 9.993593 | 0.709216 | 70.921599 | 0.666089 | 52.556779 | 0.145015 | 14.501511 | 0.344799 | 12.639821 | 0.489719 | 22.68151 | 0.473654 | 41.517066 | false | false |
2024-10-11
|
2024-10-12
| 1 |
nbeerbower/Llama3.1-Gutenberg-Doppel-70B (Merge)
|
nbeerbower_Lyra-Gutenberg-mistral-nemo-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Lyra-Gutenberg-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Lyra-Gutenberg-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Lyra-Gutenberg-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Lyra-Gutenberg-mistral-nemo-12B
|
5c506391eb02075e02f4cf5953b443505d646bce
| 22.703718 |
cc-by-nc-4.0
| 17 | 12 | true | false | false | true | 1.918602 | 0.349488 | 34.948825 | 0.558625 | 36.992432 | 0.09139 | 9.138973 | 0.333893 | 11.185682 | 0.435667 | 14.758333 | 0.362783 | 29.198064 | false | false |
2024-08-23
|
2024-09-03
| 1 |
nbeerbower/Lyra-Gutenberg-mistral-nemo-12B (Merge)
|
nbeerbower_Lyra4-Gutenberg-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Lyra4-Gutenberg-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Lyra4-Gutenberg-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Lyra4-Gutenberg-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Lyra4-Gutenberg-12B
|
cb6911be3475da99a810071c04803d6edfb5965b
| 19.818943 |
cc-by-nc-4.0
| 19 | 12 | true | false | false | false | 1.690534 | 0.221219 | 22.121859 | 0.538669 | 34.235593 | 0.128399 | 12.839879 | 0.318792 | 9.17226 | 0.403792 | 11.973958 | 0.357131 | 28.570109 | false | false |
2024-09-09
|
2024-09-12
| 1 |
nbeerbower/Lyra4-Gutenberg-12B (Merge)
|
nbeerbower_Lyra4-Gutenberg2-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Lyra4-Gutenberg2-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Lyra4-Gutenberg2-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Lyra4-Gutenberg2-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Lyra4-Gutenberg2-12B
|
6a5f117695cc729de16da87654b979e6df72ed2f
| 19.932294 |
cc-by-nc-4.0
| 8 | 12 | true | false | false | false | 1.80934 | 0.258513 | 25.851297 | 0.534453 | 33.73064 | 0.116314 | 11.63142 | 0.312919 | 8.389262 | 0.397219 | 11.485677 | 0.356549 | 28.505467 | false | false |
2024-09-29
|
2024-10-01
| 1 |
nbeerbower/Lyra4-Gutenberg2-12B (Merge)
|
nbeerbower_Mahou-1.5-mistral-nemo-12B-lorablated_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mahou-1.5-mistral-nemo-12B-lorablated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated
|
8c9eecaace50659647c7d8b569237ad068a6c837
| 26.53481 |
apache-2.0
| 2 | 12 | true | false | false | true | 1.405424 | 0.682488 | 68.248802 | 0.549604 | 36.077381 | 0.058157 | 5.81571 | 0.279362 | 3.914989 | 0.452167 | 16.554167 | 0.35738 | 28.597813 | true | false |
2024-10-19
|
2024-10-19
| 1 |
nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated (Merge)
|
nbeerbower_Mistral-Gutenberg-Doppel-7B-FFT_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Gutenberg-Doppel-7B-FFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT
|
5735876465b6f2523fdedb73120c3f97d04556d3
| 18.338276 |
apache-2.0
| 1 | 7 | true | false | false | true | 0.436816 | 0.57168 | 57.167981 | 0.407625 | 17.346575 | 0.024924 | 2.492447 | 0.283557 | 4.474273 | 0.405938 | 9.342188 | 0.272856 | 19.206191 | false | false |
2024-11-18
|
2024-11-18
| 1 |
nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT (Merge)
|
nbeerbower_Mistral-Nemo-Gutenberg-Doppel-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Gutenberg-Doppel-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B
|
0eaaac89d4b53e94d5b78220b24439a026ee29e6
| 21.475046 |
apache-2.0
| 3 | 12 | true | false | false | false | 1.776771 | 0.356707 | 35.670687 | 0.527461 | 32.421527 | 0.117825 | 11.782477 | 0.316275 | 8.836689 | 0.413219 | 11.485677 | 0.357879 | 28.653221 | false | false |
2024-09-26
|
2024-09-26
| 1 |
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B (Merge)
|
nbeerbower_Mistral-Nemo-Gutenberg-Doppel-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Gutenberg-Doppel-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
|
adc1ccd9d83d24e41bed895f989803af87ea2d2c
| 24.717981 |
apache-2.0
| 7 | 12 | true | false | false | true | 1.404857 | 0.653587 | 65.358693 | 0.53745 | 34.357413 | 0.044562 | 4.456193 | 0.270973 | 2.796421 | 0.423302 | 13.046094 | 0.354638 | 28.29307 | false | false |
2024-10-04
|
2024-10-09
| 1 |
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2 (Merge)
|
nbeerbower_Mistral-Nemo-Moderne-12B-FFT-experimental_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Moderne-12B-FFT-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental
|
e52f7b7c3ade2a6212f29dd1054332cee21ab85d
| 17.175357 |
apache-2.0
| 1 | 12 | true | false | false | true | 1.217157 | 0.335225 | 33.522498 | 0.523409 | 32.07154 | 0.020393 | 2.039275 | 0.28104 | 4.138702 | 0.37149 | 4.002865 | 0.345495 | 27.277261 | false | false |
2024-11-19
|
2024-11-26
| 1 |
nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental (Merge)
|
nbeerbower_Mistral-Nemo-Prism-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Prism-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Prism-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Prism-12B
|
a39e1c8c083c172aaa3ca81faf8ba3b4799a888f
| 27.344954 |
apache-2.0
| 3 | 12 | true | false | false | true | 0.959288 | 0.68581 | 68.581032 | 0.547519 | 35.918008 | 0.052115 | 5.21148 | 0.307886 | 7.718121 | 0.462615 | 17.960156 | 0.358128 | 28.680925 | false | false |
2024-11-12
|
2024-11-12
| 1 |
nbeerbower/Mistral-Nemo-Prism-12B (Merge)
|
nbeerbower_Mistral-Nemo-Prism-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Prism-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Prism-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Prism-12B-v2
|
d7545999274cb56b5f961580b5234e8a647e023a
| 27.541764 |
apache-2.0
| 2 | 12 | true | false | false | true | 0.935552 | 0.697401 | 69.740067 | 0.549188 | 36.199788 | 0.057402 | 5.740181 | 0.305369 | 7.38255 | 0.459979 | 17.664063 | 0.356715 | 28.523936 | false | false |
2024-11-12
|
2024-11-26
| 1 |
nbeerbower/Mistral-Nemo-Prism-12B-v2 (Merge)
|
nbeerbower_Mistral-Nemo-Prism-12B-v7_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Prism-12B-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Prism-12B-v7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Prism-12B-v7
|
0c9da9f3903be14fda1fcae245c22f873442b86f
| 27.342442 |
apache-2.0
| 5 | 12 | true | false | false | true | 0.975511 | 0.696152 | 69.615177 | 0.55211 | 36.440017 | 0.045317 | 4.531722 | 0.299497 | 6.599553 | 0.463885 | 18.085677 | 0.359043 | 28.782506 | false | false |
2024-11-13
|
2024-11-26
| 1 |
nbeerbower/Mistral-Nemo-Prism-12B-v7 (Merge)
|
nbeerbower_Mistral-Small-Drummer-22B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Small-Drummer-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Small-Drummer-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Small-Drummer-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Small-Drummer-22B
|
53b21ece0c64ffc8aba81f294ad19e2c06e9852c
| 29.74388 |
other
| 11 | 22 | true | false | false | false | 1.612722 | 0.633129 | 63.312899 | 0.57932 | 40.12177 | 0.18429 | 18.429003 | 0.343121 | 12.416107 | 0.406365 | 9.795573 | 0.409491 | 34.387928 | false | false |
2024-09-26
|
2024-10-01
| 1 |
nbeerbower/Mistral-Small-Drummer-22B (Merge)
|
nbeerbower_Mistral-Small-Gutenberg-Doppel-22B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Small-Gutenberg-Doppel-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Small-Gutenberg-Doppel-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
|
d8091aad5f882b714321e4d51f504cc61996ee67
| 27.858747 |
other
| 10 | 22 | true | false | false | false | 1.588603 | 0.489323 | 48.932277 | 0.585893 | 40.931345 | 0.21148 | 21.148036 | 0.346477 | 12.863535 | 0.397063 | 8.566146 | 0.4124 | 34.711141 | false | false |
2024-09-25
|
2024-09-25
| 1 |
nbeerbower/Mistral-Small-Gutenberg-Doppel-22B (Merge)
|
nbeerbower_Nemo-Loony-12B-experimental_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Nemo-Loony-12B-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Nemo-Loony-12B-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Nemo-Loony-12B-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Nemo-Loony-12B-experimental
|
7b06f30502a9b58c028ac1079e1b3d2988b76866
| 10.431803 | 0 | 12 | false | false | false | true | 1.237582 | 0.373444 | 37.344357 | 0.382222 | 12.974588 | 0.01284 | 1.283988 | 0.270134 | 2.684564 | 0.334063 | 1.757812 | 0.15891 | 6.545508 | false | false |
2024-11-26
|
2024-11-26
| 1 |
nbeerbower/Nemo-Loony-12B-experimental (Merge)
|
|
nbeerbower_Qwen2.5-Gutenberg-Doppel-14B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Qwen2.5-Gutenberg-Doppel-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Qwen2.5-Gutenberg-Doppel-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Qwen2.5-Gutenberg-Doppel-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Qwen2.5-Gutenberg-Doppel-14B
|
11a5060f9e7315ea07241106f086ac4694dded60
| 32.302115 |
apache-2.0
| 11 | 14 | true | false | false | true | 1.690612 | 0.809083 | 80.908323 | 0.638174 | 48.238909 | 0 | 0 | 0.333054 | 11.073826 | 0.410063 | 10.024479 | 0.492104 | 43.567154 | false | false |
2024-11-11
|
2024-11-11
| 1 |
nbeerbower/Qwen2.5-Gutenberg-Doppel-14B (Merge)
|
nbeerbower_SmolNemo-12B-FFT-experimental_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/SmolNemo-12B-FFT-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/SmolNemo-12B-FFT-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__SmolNemo-12B-FFT-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/SmolNemo-12B-FFT-experimental
|
d8d7a90ae9b9cb79cdc0912a685c3cb8d7a25560
| 8.320055 |
apache-2.0
| 0 | 12 | true | false | false | true | 1.225415 | 0.334801 | 33.480055 | 0.333609 | 6.542439 | 0.002266 | 0.226586 | 0.260067 | 1.342282 | 0.384698 | 5.920573 | 0.121676 | 2.408392 | false | false |
2024-11-25
|
2024-11-26
| 1 |
nbeerbower/SmolNemo-12B-FFT-experimental (Merge)
|
nbeerbower_Stella-mistral-nemo-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Stella-mistral-nemo-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Stella-mistral-nemo-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Stella-mistral-nemo-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Stella-mistral-nemo-12B-v2
|
b81bab28f7dcb25a0aa0fe4dcf957f3083ee6b43
| 22.430369 | 3 | 12 | false | false | false | false | 1.740872 | 0.327431 | 32.743122 | 0.548375 | 35.364516 | 0.112538 | 11.253776 | 0.332215 | 10.961969 | 0.430396 | 14.432812 | 0.368434 | 29.82602 | false | false |
2024-09-07
|
2024-09-14
| 1 |
nbeerbower/Stella-mistral-nemo-12B-v2 (Merge)
|
|
nbeerbower_gemma2-gutenberg-27B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/gemma2-gutenberg-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/gemma2-gutenberg-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__gemma2-gutenberg-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/gemma2-gutenberg-27B
|
d4febe52e8b7b13a98126dbf1716ed1329f48922
| 10.108961 |
gemma
| 4 | 27 | true | false | false | false | 7.695458 | 0.294708 | 29.470804 | 0.379657 | 13.091525 | 0 | 0 | 0.272651 | 3.020134 | 0.372729 | 4.157813 | 0.198221 | 10.91349 | false | false |
2024-09-09
|
2024-09-23
| 1 |
nbeerbower/gemma2-gutenberg-27B (Merge)
|
nbeerbower_gemma2-gutenberg-9B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/gemma2-gutenberg-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/gemma2-gutenberg-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__gemma2-gutenberg-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/gemma2-gutenberg-9B
|
ebdab2d41f257fc9e7c858498653644d13386ce5
| 22.649257 |
gemma
| 12 | 9 | true | false | false | false | 2.809609 | 0.279595 | 27.959481 | 0.59509 | 42.355611 | 0.016616 | 1.661631 | 0.338087 | 11.744966 | 0.45951 | 16.705469 | 0.419215 | 35.468381 | false | false |
2024-07-14
|
2024-08-03
| 1 |
nbeerbower/gemma2-gutenberg-9B (Merge)
|