eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
allknowingroger_MultiMash11-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash11-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash11-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash11-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash11-13B-slerp | 1134a0adabef4a26e1d49c302baff74c4a7e9f46 | 20.614676 | apache-2.0 | 0 | 12 | true | false | false | false | 0.985739 | 0.425101 | 42.510095 | 0.519386 | 32.596703 | 0.070242 | 7.024169 | 0.282718 | 4.362416 | 0.437281 | 14.026823 | 0.308511 | 23.167849 | true | false | 2024-05-27 | 2024-06-26 | 1 | allknowingroger/MultiMash11-13B-slerp (Merge) |
allknowingroger_MultiMash2-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash2-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash2-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash2-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash2-12B-slerp | e44e9563368699f753a4474b068c059d233ddee3 | 19.827555 | apache-2.0 | 0 | 12 | true | false | false | false | 0.82014 | 0.426075 | 42.607504 | 0.513397 | 31.61795 | 0.063444 | 6.344411 | 0.279362 | 3.914989 | 0.422802 | 11.783594 | 0.304272 | 22.696882 | true | false | 2024-05-20 | 2024-06-26 | 1 | allknowingroger/MultiMash2-12B-slerp (Merge) |
allknowingroger_MultiMash5-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash5-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash5-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash5-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash5-12B-slerp | 15ef0301c7ce939208d55ad13fa840662f92bce6 | 19.602894 | apache-2.0 | 0 | 12 | true | false | false | false | 0.804167 | 0.41416 | 41.415998 | 0.514453 | 31.856364 | 0.064199 | 6.41994 | 0.277685 | 3.691275 | 0.420292 | 11.703125 | 0.302776 | 22.530659 | true | false | 2024-05-21 | 2024-06-26 | 1 | allknowingroger/MultiMash5-12B-slerp (Merge) |
allknowingroger_MultiMash6-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash6-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash6-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash6-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash6-12B-slerp | a04856a12b85e986e1b540cf0c7510e9ce2df09b | 20.263839 | apache-2.0 | 0 | 12 | true | false | false | false | 0.824573 | 0.430047 | 43.004672 | 0.519592 | 32.40388 | 0.071752 | 7.175227 | 0.274329 | 3.243848 | 0.430583 | 12.522917 | 0.309092 | 23.232491 | true | false | 2024-05-22 | 2024-06-26 | 1 | allknowingroger/MultiMash6-12B-slerp (Merge) |
allknowingroger_MultiMash7-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash7-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash7-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash7-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash7-12B-slerp | 5f91dd41fb4b58e76c52b03ed15477a046b079df | 19.779705 | apache-2.0 | 0 | 12 | true | false | false | false | 0.821771 | 0.421279 | 42.127887 | 0.511114 | 31.29815 | 0.068731 | 6.873112 | 0.278523 | 3.803132 | 0.427948 | 12.026823 | 0.302942 | 22.549128 | true | false | 2024-05-22 | 2024-06-26 | 1 | allknowingroger/MultiMash7-12B-slerp (Merge) |
allknowingroger_MultiMash8-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash8-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash8-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash8-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash8-13B-slerp | 5590ccd99f74301951f450f9d0271a99e97728c8 | 21.024512 | apache-2.0 | 0 | 12 | true | false | false | false | 1.725907 | 0.43207 | 43.207024 | 0.517848 | 32.272997 | 0.074018 | 7.401813 | 0.288591 | 5.145414 | 0.442396 | 14.499479 | 0.312583 | 23.620346 | true | false | 2024-05-26 | 2024-09-02 | 1 | allknowingroger/MultiMash8-13B-slerp (Merge) |
allknowingroger_MultiMash9-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash9-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash9-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash9-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash9-13B-slerp | 56dac45f387669baa04a8997ebb9ea63c65fbbd1 | 20.64297 | apache-2.0 | 0 | 12 | true | false | false | false | 0.865731 | 0.418781 | 41.878106 | 0.519358 | 32.552612 | 0.07855 | 7.854985 | 0.280201 | 4.026846 | 0.439823 | 14.211198 | 0.310007 | 23.334072 | true | false | 2024-05-26 | 2024-06-26 | 1 | allknowingroger/MultiMash9-13B-slerp (Merge) |
allknowingroger_MultiMerge-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMerge-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMerge-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMerge-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMerge-7B-slerp | a026bbea09f0b1880deed62b9081e3708be0dec2 | 19.554835 | apache-2.0 | 0 | 7 | true | false | false | false | 0.613172 | 0.394776 | 39.477586 | 0.514022 | 31.803983 | 0.067221 | 6.722054 | 0.282718 | 4.362416 | 0.427979 | 12.330729 | 0.30369 | 22.63224 | true | false | 2024-04-11 | 2024-06-26 | 1 | allknowingroger/MultiMerge-7B-slerp (Merge) |
allknowingroger_Multimash3-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Multimash3-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Multimash3-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Multimash3-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Multimash3-12B-slerp | 0b90bf0b5230d02b4ba63879fc3bf0b85d46c3ce | 20.483321 | apache-2.0 | 0 | 12 | true | false | false | false | 0.844658 | 0.44371 | 44.371047 | 0.517662 | 32.150891 | 0.063444 | 6.344411 | 0.280201 | 4.026846 | 0.434396 | 13.032813 | 0.306765 | 22.973921 | true | false | 2024-05-21 | 2024-06-26 | 1 | allknowingroger/Multimash3-12B-slerp (Merge) |
allknowingroger_Multimerge-19B-pass_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Multimerge-19B-pass" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Multimerge-19B-pass</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Multimerge-19B-pass-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Multimerge-19B-pass | e75918ed5601f400f62601cf6c0887aa936e8a52 | 4.536203 | 0 | 19 | false | false | false | false | 1.965379 | 0.177305 | 17.730511 | 0.289178 | 2.080374 | 0 | 0 | 0.259228 | 1.230425 | 0.342958 | 4.303125 | 0.116855 | 1.872784 | false | false | 2024-06-26 | 0 | Removed |
allknowingroger_MultiverseEx26-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiverseEx26-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiverseEx26-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiverseEx26-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiverseEx26-7B-slerp | 43f18d84e025693f00e9be335bf12fce96089b2f | 19.683311 | apache-2.0 | 1 | 7 | true | false | false | false | 0.605493 | 0.393852 | 39.385165 | 0.513359 | 31.663775 | 0.074773 | 7.477341 | 0.282718 | 4.362416 | 0.429313 | 12.597396 | 0.303524 | 22.613771 | true | false | 2024-03-30 | 2024-06-26 | 1 | allknowingroger/MultiverseEx26-7B-slerp (Merge) |
allknowingroger_NeuralWestSeverus-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/NeuralWestSeverus-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/NeuralWestSeverus-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__NeuralWestSeverus-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/NeuralWestSeverus-7B-slerp | 5ee5d6a11ffc4f9733e78994169a2e1614d5e16e | 20.687959 | apache-2.0 | 0 | 7 | true | false | false | false | 0.584046 | 0.41356 | 41.356046 | 0.524428 | 33.414467 | 0.074018 | 7.401813 | 0.270973 | 2.796421 | 0.452875 | 15.409375 | 0.313747 | 23.749631 | true | false | 2024-05-16 | 2024-06-26 | 1 | allknowingroger/NeuralWestSeverus-7B-slerp (Merge) |
allknowingroger_Neuralcoven-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Neuralcoven-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Neuralcoven-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Neuralcoven-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Neuralcoven-7B-slerp | 129b40a7fd816f679ef5d4ab29fc77345f33a7b1 | 20.376259 | apache-2.0 | 0 | 7 | true | false | false | false | 0.631911 | 0.385858 | 38.585841 | 0.530287 | 33.799135 | 0.079305 | 7.930514 | 0.285235 | 4.697987 | 0.429 | 11.758333 | 0.329372 | 25.485742 | true | false | 2024-05-17 | 2024-06-26 | 1 | allknowingroger/Neuralcoven-7B-slerp (Merge) |
allknowingroger_Neuralmultiverse-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Neuralmultiverse-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Neuralmultiverse-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Neuralmultiverse-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Neuralmultiverse-7B-slerp | a65fe05e26e10a488b08264ac8ed73a49c3f263a | 19.386207 | apache-2.0 | 0 | 7 | true | false | false | false | 0.61716 | 0.376915 | 37.691547 | 0.516572 | 32.10018 | 0.066465 | 6.646526 | 0.284396 | 4.58613 | 0.428042 | 12.605208 | 0.304189 | 22.687648 | true | false | 2024-05-17 | 2024-06-26 | 1 | allknowingroger/Neuralmultiverse-7B-slerp (Merge) |
allknowingroger_Ph3della5-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3della5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3della5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3della5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3della5-14B | 9c6819a910d4da414dd67c10da3bff3f986fefa5 | 30.117271 | apache-2.0 | 0 | 13 | true | false | false | false | 1.045925 | 0.479856 | 47.985567 | 0.633175 | 48.414364 | 0.155589 | 15.558912 | 0.342282 | 12.304251 | 0.438615 | 14.360156 | 0.478723 | 42.080378 | true | false | 2024-09-05 | 2024-10-08 | 1 | allknowingroger/Ph3della5-14B (Merge) |
allknowingroger_Ph3merge-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3merge-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3merge-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3merge-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3merge-14B | 6d0ddaa4e0cf4c82d7149cc726b08be5753a760a | 23.532275 | apache-2.0 | 0 | 13 | true | false | false | false | 2.012332 | 0.270129 | 27.012881 | 0.638088 | 48.882424 | 0.001511 | 0.151057 | 0.338087 | 11.744966 | 0.433438 | 13.279688 | 0.461104 | 40.122636 | true | false | 2024-08-30 | 2024-09-02 | 1 | allknowingroger/Ph3merge-14B (Merge) |
allknowingroger_Ph3merge2-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3merge2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3merge2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3merge2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3merge2-14B | 2256ab821e286a1d8a4f0d42e00a50013e119671 | 7.962731 | 0 | 13 | false | false | false | false | 3.05266 | 0.170611 | 17.061065 | 0.360694 | 10.549968 | 0 | 0 | 0.291107 | 5.480984 | 0.391083 | 6.652083 | 0.172291 | 8.032284 | false | false | 2024-10-08 | 0 | Removed |
allknowingroger_Ph3merge3-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3merge3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3merge3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3merge3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3merge3-14B | 90a036f7f136932ea525b5fd26cf2f54a66141af | 7.931824 | 0 | 13 | false | false | false | false | 1.989257 | 0.164516 | 16.451571 | 0.359743 | 10.39138 | 0 | 0 | 0.285235 | 4.697987 | 0.408198 | 8.858073 | 0.164727 | 7.191933 | false | false | 2024-09-02 | 0 | Removed |
allknowingroger_Ph3task1-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3task1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3task1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3task1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3task1-14B | c9a5bab157dbdd281c651a5b7ea82a8bc64aa420 | 30.284048 | apache-2.0 | 0 | 13 | true | false | false | false | 1.008848 | 0.469464 | 46.946435 | 0.631781 | 47.926908 | 0.151057 | 15.10574 | 0.350671 | 13.422819 | 0.450771 | 16.813021 | 0.473404 | 41.489362 | true | false | 2024-09-07 | 2024-10-08 | 1 | allknowingroger/Ph3task1-14B (Merge) |
allknowingroger_Ph3task2-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3task2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3task2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3task2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3task2-14B | 2193bfec75bc90e87bc57863e02deefbdd195f9f | 28.422289 | apache-2.0 | 0 | 13 | true | false | false | false | 0.935066 | 0.471313 | 47.131278 | 0.609841 | 44.081796 | 0.135196 | 13.519637 | 0.330537 | 10.738255 | 0.4535 | 16.620833 | 0.445977 | 38.441933 | true | false | 2024-09-08 | 2024-10-08 | 1 | allknowingroger/Ph3task2-14B (Merge) |
allknowingroger_Ph3task3-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3task3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3task3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3task3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3task3-14B | 359de5c4969057206f846a41c72073b3429317fd | 30.395519 | apache-2.0 | 0 | 13 | true | false | false | false | 1.037615 | 0.496242 | 49.624219 | 0.629792 | 47.998499 | 0.1571 | 15.70997 | 0.341443 | 12.192394 | 0.442552 | 14.952344 | 0.477061 | 41.895686 | true | false | 2024-09-08 | 2024-10-08 | 1 | allknowingroger/Ph3task3-14B (Merge) |
allknowingroger_Ph3unsloth-3B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3unsloth-3B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3unsloth-3B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3unsloth-3B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3unsloth-3B-slerp | 465444b3cdd43876717f7386ea2f3357c5fe8e53 | 19.763283 | apache-2.0 | 0 | 3 | true | false | false | false | 0.529042 | 0.189445 | 18.944512 | 0.546808 | 36.458773 | 0.077795 | 7.779456 | 0.324664 | 9.955257 | 0.452781 | 15.43099 | 0.370096 | 30.010712 | true | false | 2024-05-31 | 2024-06-26 | 1 | allknowingroger/Ph3unsloth-3B-slerp (Merge) |
allknowingroger_Phi3mash1-17B-pass_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Phi3mash1-17B-pass" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Phi3mash1-17B-pass</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Phi3mash1-17B-pass-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Phi3mash1-17B-pass | fcd265996f026475c15fa44833e0481dc610e469 | 21.349969 | apache-2.0 | 0 | 16 | true | false | false | false | 1.458126 | 0.188421 | 18.842117 | 0.612888 | 45.250419 | 0 | 0 | 0.319631 | 9.284116 | 0.445125 | 14.840625 | 0.458943 | 39.882535 | true | false | 2024-08-28 | 2024-09-02 | 1 | allknowingroger/Phi3mash1-17B-pass (Merge) |
allknowingroger_Quen2-65B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Quen2-65B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Quen2-65B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Quen2-65B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Quen2-65B | 2259cd8ea037d0e590920e7106b0fd1641a96c1d | 3.531344 | 0 | 63 | false | false | false | false | 13.317424 | 0.175781 | 17.578137 | 0.275652 | 1.23986 | 0 | 0 | 0.235738 | 0 | 0.320854 | 1.106771 | 0.11137 | 1.263298 | false | false | 2024-09-19 | 0 | Removed |
allknowingroger_Qwen2.5-42B-AGI_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-42B-AGI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-42B-AGI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-42B-AGI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-42B-AGI | 8939b021a9d84bc2e4ae0ea4f351d807f35b91d7 | 4.47083 | 0 | 42 | false | false | false | false | 8.856981 | 0.191294 | 19.129355 | 0.29421 | 2.235886 | 0 | 0 | 0.260067 | 1.342282 | 0.362031 | 2.253906 | 0.116772 | 1.863549 | false | false | 2024-10-21 | 0 | Removed |
allknowingroger_Qwen2.5-7B-task2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task2 | 6f3ae972b2bbde0383c3a774e0e788a1af0dabc5 | 29.110059 | apache-2.0 | 1 | 7 | true | false | false | false | 0.729921 | 0.452703 | 45.270327 | 0.562594 | 37.52855 | 0.308912 | 30.891239 | 0.316275 | 8.836689 | 0.436969 | 13.054427 | 0.451712 | 39.079122 | true | false | 2024-10-31 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task2 (Merge) |
allknowingroger_Qwen2.5-7B-task3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task3 | b1e524004242cdeec838ba21bce44ebb8598c12f | 27.895357 | apache-2.0 | 0 | 7 | true | false | false | false | 0.619608 | 0.512904 | 51.290354 | 0.539762 | 34.385984 | 0.20997 | 20.996979 | 0.317114 | 8.948546 | 0.435573 | 12.846615 | 0.450133 | 38.903664 | true | false | 2024-11-01 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task3 (Merge) |
allknowingroger_Qwen2.5-7B-task4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task4 | ef4fe9331a0b9c34d829fcd5b1a09a7056e9300f | 29.331699 | apache-2.0 | 0 | 7 | true | false | false | false | 0.662099 | 0.500539 | 50.053857 | 0.558345 | 37.025269 | 0.267372 | 26.73716 | 0.32047 | 9.395973 | 0.439542 | 13.209375 | 0.456117 | 39.568558 | true | false | 2024-11-01 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task4 (Merge) |
allknowingroger_Qwen2.5-7B-task7_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task7 | 090a873c77ed291867ddaf20249ed7f479ba4ba9 | 23.274021 | apache-2.0 | 0 | 7 | true | false | false | false | 0.671654 | 0.4269 | 42.689952 | 0.555243 | 37.51817 | 0.021903 | 2.190332 | 0.32047 | 9.395973 | 0.432563 | 13.036979 | 0.413314 | 34.812722 | true | false | 2024-11-04 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task7 (Merge) |
allknowingroger_Qwen2.5-7B-task8_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task8 | 489a9a6fc98001026d9b96563d715cad43aabc8c | 29.240845 | apache-2.0 | 0 | 7 | true | false | false | false | 0.703752 | 0.464519 | 46.451859 | 0.55249 | 36.092737 | 0.300604 | 30.060423 | 0.32047 | 9.395973 | 0.451448 | 15.297656 | 0.443318 | 38.146424 | true | false | 2024-11-04 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task8 (Merge) |
allknowingroger_Qwen2.5-slerp-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-slerp-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-slerp-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-slerp-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-slerp-14B | a44b0ea8291b62785152c2fe6ab336f5da672d1e | 34.109403 | apache-2.0 | 0 | 14 | true | false | false | false | 2.366882 | 0.49282 | 49.282016 | 0.651242 | 49.789537 | 0.219033 | 21.903323 | 0.36745 | 15.659955 | 0.474396 | 19.366146 | 0.537899 | 48.655437 | true | false | 2024-10-17 | 2024-10-21 | 1 | allknowingroger/Qwen2.5-slerp-14B (Merge) |
allknowingroger_QwenSlerp12-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp12-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp12-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp12-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp12-7B | be6510452755c2c8e559333ecaf68dc6b37637d9 | 29.535855 | apache-2.0 | 1 | 7 | true | false | false | false | 0.683034 | 0.507558 | 50.755772 | 0.555645 | 36.411303 | 0.267372 | 26.73716 | 0.315436 | 8.724832 | 0.459479 | 16.134896 | 0.446061 | 38.451167 | true | false | 2024-11-18 | 2024-11-22 | 1 | allknowingroger/QwenSlerp12-7B (Merge) |
allknowingroger_QwenSlerp4-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp4-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp4-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp4-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp4-14B | 3a55f52f639fc380a829b7cace5be3c96fcad730 | 37.804283 | 1 | 14 | false | false | false | false | 1.879162 | 0.632754 | 63.275442 | 0.648325 | 49.38124 | 0.309668 | 30.966767 | 0.372483 | 16.331096 | 0.464969 | 17.58776 | 0.543551 | 49.283392 | false | false | 2024-11-27 | 2024-12-06 | 1 | allknowingroger/QwenSlerp4-14B (Merge) |
allknowingroger_QwenSlerp5-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp5-14B | f1eac24bb5338ae11951d38ba09ff71f4d319cc9 | 38.94285 | | 1 | 14 | false | false | false | false | 1.861812 | 0.711939 | 71.193877 | 0.635657 | 47.38764 | 0.331571 | 33.1571 | 0.364933 | 15.324385 | 0.467542 | 17.809375 | 0.539063 | 48.784722 | false | false | 2024-11-27 | 2024-12-06 | 1 | allknowingroger/QwenSlerp5-14B (Merge)
allknowingroger_QwenSlerp6-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp6-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp6-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp6-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp6-14B | eff132ab6d7f612b46c47b29966f8391cea7b407 | 39.018458 | apache-2.0 | 0 | 14 | true | false | false | false | 1.740827 | 0.686685 | 68.668466 | 0.638445 | 47.588317 | 0.34139 | 34.138973 | 0.373322 | 16.442953 | 0.468969 | 18.321094 | 0.540559 | 48.950946 | true | false | 2024-11-28 | 2024-12-06 | 1 | allknowingroger/QwenSlerp6-14B (Merge) |
allknowingroger_QwenStock1-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenStock1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenStock1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenStock1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenStock1-14B | 79daacc58a5ec97f297c4a99dbb31d19ae4c59ca | 36.760956 | apache-2.0 | 2 | 14 | true | false | false | false | 2.040274 | 0.563412 | 56.341175 | 0.652849 | 50.076293 | 0.293807 | 29.380665 | 0.376678 | 16.89038 | 0.472969 | 18.78776 | 0.541805 | 49.089465 | true | false | 2024-11-28 | 2024-12-06 | 1 | allknowingroger/QwenStock1-14B (Merge) |
allknowingroger_QwenStock2-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenStock2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenStock2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenStock2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenStock2-14B | 69fd5f98c812cfa26d8514349669158a93058bf7 | 36.933778 | apache-2.0 | 1 | 14 | true | false | false | false | 2.021044 | 0.556343 | 55.634273 | 0.656885 | 50.598276 | 0.299094 | 29.909366 | 0.379195 | 17.225951 | 0.475604 | 19.283854 | 0.540559 | 48.950946 | true | false | 2024-11-29 | 2024-12-06 | 1 | allknowingroger/QwenStock2-14B (Merge) |
allknowingroger_QwenStock3-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenStock3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenStock3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenStock3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenStock3-14B | 834fbf35e01efc44e4f2c8c372d7c1412754c0fa | 36.973076 | apache-2.0 | 1 | 14 | true | false | false | false | 2.149642 | 0.561513 | 56.151345 | 0.656532 | 50.576674 | 0.296828 | 29.682779 | 0.378356 | 17.114094 | 0.475573 | 19.113281 | 0.542803 | 49.200281 | true | false | 2024-11-29 | 2024-12-06 | 1 | allknowingroger/QwenStock3-14B (Merge) |
allknowingroger_Qwenslerp2-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp2-14B | 38e902c114b5640509a8615fc2a2546e07a5fb3f | 35.694241 | apache-2.0 | 1 | 14 | true | false | false | false | 2.264443 | 0.500714 | 50.071366 | 0.655488 | 50.303692 | 0.302115 | 30.21148 | 0.368289 | 15.771812 | 0.472938 | 18.883854 | 0.540309 | 48.923242 | true | false | 2024-10-19 | 2024-10-21 | 1 | allknowingroger/Qwenslerp2-14B (Merge) |
allknowingroger_Qwenslerp2-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp2-7B | 46fe65fc2567b2430fa421478d47134ffe55c8f8 | 30.420238 | apache-2.0 | 0 | 7 | true | false | false | false | 0.63658 | 0.52944 | 52.943966 | 0.560913 | 37.437245 | 0.318731 | 31.873112 | 0.312919 | 8.389262 | 0.435604 | 12.817188 | 0.451546 | 39.060653 | true | false | 2024-10-31 | 2024-11-04 | 1 | allknowingroger/Qwenslerp2-7B (Merge) |
allknowingroger_Qwenslerp3-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp3-14B | ac60a6c4e224e5b52c42bebfd0cf81f920befdef | 35.550712 | apache-2.0 | 1 | 14 | true | false | false | false | 2.219631 | 0.505235 | 50.5235 | 0.652084 | 49.809829 | 0.294562 | 29.456193 | 0.375 | 16.666667 | 0.467604 | 18.017187 | 0.539478 | 48.830895 | true | false | 2024-10-19 | 2024-10-21 | 1 | allknowingroger/Qwenslerp3-14B (Merge) |
allknowingroger_Qwenslerp3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp3-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp3-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp3-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp3-7B | 0351c5f6207cafd15e10e6d8dfe61b50d1b2378b | 29.978168 | apache-2.0 | 0 | 7 | true | false | false | false | 0.609445 | 0.501837 | 50.183735 | 0.558016 | 37.153984 | 0.282477 | 28.247734 | 0.324664 | 9.955257 | 0.45151 | 14.972135 | 0.454205 | 39.356161 | true | false | 2024-10-31 | 2024-11-04 | 1 | allknowingroger/Qwenslerp3-7B (Merge) |
allknowingroger_ROGERphi-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/ROGERphi-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/ROGERphi-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__ROGERphi-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/ROGERphi-7B-slerp | a92f90ae5e4286daa2399df4951a3347aaf414e1 | 20.707471 | apache-2.0 | 0 | 7 | true | false | false | false | 0.616349 | 0.386133 | 38.613324 | 0.519558 | 32.819032 | 0.073263 | 7.326284 | 0.288591 | 5.145414 | 0.468531 | 17.533073 | 0.305269 | 22.807698 | true | false | 2024-03-20 | 2024-06-26 | 1 | allknowingroger/ROGERphi-7B-slerp (Merge) |
allknowingroger_RogerMerge-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/RogerMerge-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/RogerMerge-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__RogerMerge-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/RogerMerge-7B-slerp | 397f5c0b52a536c130982ca2a7c3056358bbdf92 | 19.605148 | apache-2.0 | 0 | 7 | true | false | false | false | 0.617858 | 0.393302 | 39.330199 | 0.516018 | 31.987166 | 0.067976 | 6.797583 | 0.280201 | 4.026846 | 0.431979 | 12.930729 | 0.303025 | 22.558363 | true | false | 2024-04-11 | 2024-06-26 | 1 | allknowingroger/RogerMerge-7B-slerp (Merge) |
allknowingroger_Rombos-LLM-V2.5-Qwen-42b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Rombos-LLM-V2.5-Qwen-42b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Rombos-LLM-V2.5-Qwen-42b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Rombos-LLM-V2.5-Qwen-42b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Rombos-LLM-V2.5-Qwen-42b | 977192ef80c5c904697f1d85d2eeab5db3947c65 | 4.559641 | | 0 | 42 | false | false | false | false | 8.447883 | 0.187921 | 18.792138 | 0.296916 | 2.60764 | 0 | 0 | 0.262584 | 1.677852 | 0.363333 | 2.416667 | 0.116772 | 1.863549 | false | false | 2024-10-21 | | 0 | Removed
allknowingroger_Strangecoven-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Strangecoven-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Strangecoven-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Strangecoven-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Strangecoven-7B-slerp | 8bc9d8f972d15fdd3e02c602ef4f549493bf2208 | 20.299389 | apache-2.0 | 1 | 7 | true | false | false | false | 0.632469 | 0.374643 | 37.464261 | 0.536802 | 34.832235 | 0.075529 | 7.55287 | 0.28943 | 5.257271 | 0.419885 | 10.41901 | 0.336436 | 26.270686 | true | false | 2024-05-16 | 2024-06-26 | 1 | allknowingroger/Strangecoven-7B-slerp (Merge) |
allknowingroger_Weirdslerp2-25B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Weirdslerp2-25B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Weirdslerp2-25B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Weirdslerp2-25B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Weirdslerp2-25B | 4221341fe45e3ee6eaab27830b27d46bbbd5ea23 | 4.039649 | | 0 | 25 | false | false | false | false | 1.707712 | 0.175407 | 17.540681 | 0.28737 | 1.565992 | 0 | 0 | 0.249161 | 0 | 0.352354 | 3.710937 | 0.112783 | 1.420287 | false | false | 2024-10-21 | | 0 | Removed
allknowingroger_WestlakeMaziyar-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/WestlakeMaziyar-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/WestlakeMaziyar-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__WestlakeMaziyar-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/WestlakeMaziyar-7B-slerp | 751534a844b0d439fe62f98bf8882fe9ab9872e0 | 22.208593 | apache-2.0 | 0 | 7 | true | false | false | false | 0.631671 | 0.483777 | 48.377749 | 0.524548 | 33.342811 | 0.067976 | 6.797583 | 0.303691 | 7.158837 | 0.447385 | 14.489844 | 0.307763 | 23.084737 | true | false | 2024-05-16 | 2024-06-26 | 1 | allknowingroger/WestlakeMaziyar-7B-slerp (Merge) |
allknowingroger_YamMaths-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/YamMaths-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/YamMaths-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__YamMaths-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/YamMaths-7B-slerp | bd4ac9d63ca88c80d34fa60ef5cbb56d60a39077 | 20.527131 | apache-2.0 | 1 | 7 | true | false | false | false | 0.612624 | 0.414809 | 41.480937 | 0.515585 | 32.133322 | 0.083837 | 8.383686 | 0.280201 | 4.026846 | 0.438365 | 13.46224 | 0.313082 | 23.675754 | true | false | 2024-06-02 | 2024-06-26 | 1 | allknowingroger/YamMaths-7B-slerp (Merge) |
allknowingroger_Yi-1.5-34B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yi-1.5-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yi-1.5-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yi-1.5-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yi-1.5-34B | fef96e380cb3aeecac8e2e53ad2c73a1187beb68 | 4.252733 | | 0 | 34 | false | false | false | false | 7.022692 | 0.163916 | 16.391619 | 0.282725 | 1.339043 | 0 | 0 | 0.258389 | 1.118568 | 0.385656 | 5.607031 | 0.109541 | 1.060136 | false | false | 2024-10-21 | | 0 | Removed
allknowingroger_Yi-blossom-40B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yi-blossom-40B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yi-blossom-40B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yi-blossom-40B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yi-blossom-40B | d1bf1cf9339808193c5a56ef23fecdfd1012acfb | 5.827458 | | 0 | 18 | false | false | false | false | 0.988457 | 0.200886 | 20.088587 | 0.321504 | 5.539183 | 0 | 0 | 0.274329 | 3.243848 | 0.38426 | 5.199219 | 0.108045 | 0.893913 | false | false | 2024-09-19 | | 0 | Removed
allknowingroger_Yibuddy-35B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yibuddy-35B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yibuddy-35B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yibuddy-35B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yibuddy-35B | 592e1e52b97ec88a80ba3b496c19f2498ada4ea3 | 27.930703 | apache-2.0 | 0 | 34 | true | false | false | false | 7.184353 | 0.423477 | 42.347748 | 0.591619 | 42.808242 | 0.135952 | 13.595166 | 0.355705 | 14.09396 | 0.450458 | 15.973958 | 0.448886 | 38.765145 | true | false | 2024-09-17 | 2024-10-08 | 1 | allknowingroger/Yibuddy-35B (Merge) |
allknowingroger_Yillama-40B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yillama-40B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yillama-40B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yillama-40B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yillama-40B | 65db687755e716481a218cac99d20619d78e41f7 | 8.311487 | | 0 | 34 | false | false | false | false | 3.524066 | 0.169686 | 16.968643 | 0.406289 | 15.875797 | 0 | 0 | 0.282718 | 4.362416 | 0.350063 | 1.757812 | 0.198138 | 10.904255 | false | false | 2024-09-19 | | 0 | Removed
allknowingroger_Yislerp-34B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yislerp-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yislerp-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yislerp-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yislerp-34B | 131ad918edd652271510ee8dba63d3e7319df133 | 29.37375 | apache-2.0 | 0 | 34 | true | false | false | false | 2.988778 | 0.369197 | 36.919706 | 0.615872 | 45.981696 | 0.214502 | 21.450151 | 0.358221 | 14.42953 | 0.456625 | 15.778125 | 0.47515 | 41.683289 | true | false | 2024-09-16 | 2024-09-19 | 1 | allknowingroger/Yislerp-34B (Merge) |
allknowingroger_Yislerp2-34B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yislerp2-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yislerp2-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yislerp2-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yislerp2-34B | 3147cf866736b786347928b655c887e8b9c07bfc | 30.421277 | apache-2.0 | 0 | 34 | true | false | false | false | 6.035958 | 0.399947 | 39.994659 | 0.624577 | 47.202306 | 0.228852 | 22.885196 | 0.364094 | 15.212528 | 0.452969 | 15.854427 | 0.472407 | 41.378546 | true | false | 2024-09-17 | 2024-10-08 | 1 | allknowingroger/Yislerp2-34B (Merge) |
allknowingroger_Yunconglong-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yunconglong-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yunconglong-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yunconglong-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yunconglong-13B-slerp | dead687b7342d875bd8ac73bfcd34b88a2e5564c | 19.612692 | apache-2.0 | 0 | 12 | true | false | false | false | 0.799631 | 0.424177 | 42.417674 | 0.516581 | 32.140729 | 0.055136 | 5.513595 | 0.28104 | 4.138702 | 0.416073 | 10.842448 | 0.303607 | 22.623005 | true | false | 2024-05-25 | 2024-06-26 | 1 | allknowingroger/Yunconglong-13B-slerp (Merge) |
allknowingroger_limyClown-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/limyClown-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/limyClown-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__limyClown-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/limyClown-7B-slerp | 732a1ed0c2c7007297ad9d9797793073825f65ca | 19.703889 | apache-2.0 | 0 | 7 | true | false | false | false | 0.61005 | 0.401745 | 40.174515 | 0.514752 | 31.931466 | 0.068731 | 6.873112 | 0.28104 | 4.138702 | 0.429313 | 12.464063 | 0.303773 | 22.641475 | true | false | 2024-03-23 | 2024-06-26 | 1 | allknowingroger/limyClown-7B-slerp (Merge) |
allknowingroger_llama3-Jallabi-40B-s_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/llama3-Jallabi-40B-s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/llama3-Jallabi-40B-s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__llama3-Jallabi-40B-s-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/llama3-Jallabi-40B-s | a86d8cc3530fb466245b2cac55f25c28d0bd8c22 | 5.029702 | | 0 | 18 | false | false | false | false | 0.979575 | 0.192068 | 19.206816 | 0.325224 | 5.957912 | 0 | 0 | 0.237416 | 0 | 0.374958 | 4.036458 | 0.108793 | 0.977024 | false | false | 2024-09-19 | | 0 | Removed
allknowingroger_llama3AnFeng-40B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/llama3AnFeng-40B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/llama3AnFeng-40B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__llama3AnFeng-40B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/llama3AnFeng-40B | 5995441962287970ffc98ad9b292e14420bf49ca | 9.237994 | | 0 | 39 | false | false | false | false | 4.063468 | 0.174208 | 17.420777 | 0.379408 | 12.476996 | 0 | 0 | 0.306208 | 7.494407 | 0.394 | 7.15 | 0.197972 | 10.885786 | false | false | 2024-09-19 | | 0 | Removed
allura-org_MS-Meadowlark-22B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/MS-Meadowlark-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/MS-Meadowlark-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__MS-Meadowlark-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/MS-Meadowlark-22B | 6eb2f6bee66dbffa1b17397e75a7380ed4f9d0ac | 26.39303 | other | 9 | 22 | true | false | false | true | 2.895902 | 0.669699 | 66.969862 | 0.516258 | 30.29658 | 0.141239 | 14.123867 | 0.325503 | 10.067114 | 0.38426 | 5.532552 | 0.382314 | 31.368203 | true | false | 2024-10-18 | 2024-10-24 | 1 | allura-org/MS-Meadowlark-22B (Merge) |
allura-org_MoE-Girl-1BA-7BT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/MoE-Girl-1BA-7BT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/MoE-Girl-1BA-7BT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__MoE-Girl-1BA-7BT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/MoE-Girl-1BA-7BT | ecfac73ab9e7f2ee006d6a2ad9c8e86a85deab2b | 6.390211 | apache-2.0 | 13 | 6 | true | true | false | true | 3.201155 | 0.270503 | 27.050338 | 0.313918 | 4.842344 | 0.01435 | 1.435045 | 0.258389 | 1.118568 | 0.343552 | 1.477344 | 0.121759 | 2.417627 | false | false | 2024-10-08 | 2024-10-10 | 1 | allenai/OLMoE-1B-7B-0924 |
aloobun_Meta-Llama-3-7B-28Layers_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/aloobun/Meta-Llama-3-7B-28Layers" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aloobun/Meta-Llama-3-7B-28Layers</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aloobun__Meta-Llama-3-7B-28Layers-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aloobun/Meta-Llama-3-7B-28Layers | 9822e6b8d4de0c0f2964d299f6fcef72385a0341 | 13.136516 | llama3 | 0 | 7 | true | false | false | false | 0.809358 | 0.196365 | 19.636453 | 0.44375 | 22.09653 | 0.013595 | 1.359517 | 0.294463 | 5.928412 | 0.358927 | 5.799219 | 0.315991 | 23.998966 | true | false | 2024-05-10 | 2024-06-26 | 1 | aloobun/Meta-Llama-3-7B-28Layers (Merge) |
aloobun_d-SmolLM2-360M_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/aloobun/d-SmolLM2-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aloobun/d-SmolLM2-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aloobun__d-SmolLM2-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aloobun/d-SmolLM2-360M | 2a1d82b4cbcdfdff3c2cfcd171435c5f01b8de43 | 6.007837 | apache-2.0 | 1 | 0 | true | false | false | false | 0.370123 | 0.209704 | 20.970359 | 0.319578 | 4.762821 | 0.002266 | 0.226586 | 0.253356 | 0.447427 | 0.398063 | 7.757813 | 0.116938 | 1.882018 | false | false | 2024-11-20 | 2024-11-26 | 0 | aloobun/d-SmolLM2-360M |
alpindale_WizardLM-2-8x22B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/alpindale/WizardLM-2-8x22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">alpindale/WizardLM-2-8x22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/alpindale__WizardLM-2-8x22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | alpindale/WizardLM-2-8x22B | 087834da175523cffd66a7e19583725e798c1b4f | 32.983523 | apache-2.0 | 389 | 140 | true | false | false | false | 93.305222 | 0.527217 | 52.721667 | 0.637731 | 48.576168 | 0.245468 | 24.546828 | 0.381711 | 17.561521 | 0.438708 | 14.538542 | 0.459608 | 39.956413 | false | false | 2024-04-16 | 2024-06-28 | 0 | alpindale/WizardLM-2-8x22B |
alpindale_magnum-72b-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/alpindale/magnum-72b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">alpindale/magnum-72b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/alpindale__magnum-72b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | alpindale/magnum-72b-v1 | fef27e0f235ae8858b84b765db773a2a954110dd | 42.576587 | other | 160 | 72 | true | false | false | true | 12.515123 | 0.760648 | 76.064841 | 0.698222 | 57.653185 | 0.376888 | 37.688822 | 0.39094 | 18.791946 | 0.448938 | 15.617187 | 0.546792 | 49.643543 | false | false | 2024-06-17 | 2024-07-25 | 2 | Qwen/Qwen2-72B |
altomek_YiSM-34B-0rn_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/altomek/YiSM-34B-0rn" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">altomek/YiSM-34B-0rn</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/altomek__YiSM-34B-0rn-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | altomek/YiSM-34B-0rn | 7a481c67cbdd5c846d6aaab5ef9f1eebfad812c2 | 30.474248 | apache-2.0 | 1 | 34 | true | false | false | true | 2.960624 | 0.428373 | 42.837338 | 0.614001 | 45.382927 | 0.225831 | 22.583082 | 0.371644 | 16.219239 | 0.445 | 14.758333 | 0.469581 | 41.064569 | true | false | 2024-05-26 | 2024-06-27 | 1 | altomek/YiSM-34B-0rn (Merge) |
amazon_MegaBeam-Mistral-7B-300k_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/amazon/MegaBeam-Mistral-7B-300k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amazon/MegaBeam-Mistral-7B-300k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amazon__MegaBeam-Mistral-7B-300k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | amazon/MegaBeam-Mistral-7B-300k | 42572e5c9a0747b19af5c5c9962d122622f32295 | 17.072823 | apache-2.0 | 15 | 7 | true | false | false | true | 0.64961 | 0.520347 | 52.034712 | 0.422773 | 19.291806 | 0.024169 | 2.416918 | 0.27349 | 3.131991 | 0.398 | 8.35 | 0.254904 | 17.21151 | false | false | 2024-05-13 | 2024-10-07 | 0 | amazon/MegaBeam-Mistral-7B-300k |
amd_AMD-Llama-135m_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/amd/AMD-Llama-135m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amd/AMD-Llama-135m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amd__AMD-Llama-135m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | amd/AMD-Llama-135m | 8f9c39b5ed86d422ab332ed1ecf042fdaeb57903 | 4.759627 | apache-2.0 | 110 | 0 | true | false | false | false | 0.128719 | 0.184225 | 18.422452 | 0.297393 | 2.485495 | 0.005287 | 0.528701 | 0.252517 | 0.33557 | 0.377969 | 4.91276 | 0.116855 | 1.872784 | false | true | 2024-07-19 | 2024-09-29 | 0 | amd/AMD-Llama-135m |
amd_AMD-Llama-135m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/amd/AMD-Llama-135m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amd/AMD-Llama-135m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amd__AMD-Llama-135m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | amd/AMD-Llama-135m | 8f9c39b5ed86d422ab332ed1ecf042fdaeb57903 | 5.191212 | apache-2.0 | 110 | 0 | true | false | false | false | 0.525707 | 0.191843 | 19.18432 | 0.296944 | 2.537953 | 0.005287 | 0.528701 | 0.258389 | 1.118568 | 0.384573 | 5.904948 | 0.116855 | 1.872784 | false | true | 2024-07-19 | 2024-10-01 | 0 | amd/AMD-Llama-135m |
anakin87_gemma-2b-orpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/anakin87/gemma-2b-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anakin87/gemma-2b-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anakin87__gemma-2b-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anakin87/gemma-2b-orpo | bf6bfe30c31c18620767ad60d0bff89343804230 | 7.184001 | other | 28 | 2 | true | false | false | true | 0.789927 | 0.247797 | 24.779696 | 0.342617 | 7.949445 | 0.01284 | 1.283988 | 0.261745 | 1.565996 | 0.37276 | 4.128385 | 0.130568 | 3.396498 | false | false | 2024-03-24 | 2024-07-06 | 1 | google/gemma-2b |
anthracite-org_magnum-v1-72b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v1-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v1-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v1-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v1-72b | f8f85021bace7e8250ed8559c5b78b8b34f0c4cc | 42.610448 | other | 160 | 72 | true | false | false | true | 12.891056 | 0.760648 | 76.064841 | 0.698222 | 57.653185 | 0.376888 | 37.688822 | 0.39094 | 18.791946 | 0.448938 | 15.617187 | 0.54862 | 49.846705 | false | false | 2024-06-17 | 2024-09-21 | 2 | Qwen/Qwen2-72B |
anthracite-org_magnum-v2-12b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v2-12b | 18.695117 | apache-2.0 | 80 | 12 | true | false | false | true | 1.648835 | 0.376166 | 37.616635 | 0.502086 | 28.785552 | 0.048338 | 4.833837 | 0.291107 | 5.480984 | 0.417906 | 11.371615 | 0.316739 | 24.082077 | false | false | 2024-08-03 | 2024-09-05 | 1 | mistralai/Mistral-Nemo-Base-2407 |
anthracite-org_magnum-v2-72b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v2-72b | c9c5826ef42b9fcc8a8e1079be574481cf0b6cc6 | 41.556286 | other | 33 | 72 | true | false | false | true | 12.134217 | 0.756027 | 75.602734 | 0.700508 | 57.854704 | 0.340634 | 34.063444 | 0.385906 | 18.120805 | 0.437188 | 14.181771 | 0.545628 | 49.514258 | false | false | 2024-08-18 | 2024-09-05 | 2 | Qwen/Qwen2-72B |
anthracite-org_magnum-v2.5-12b-kto_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v2.5-12b-kto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v2.5-12b-kto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v2.5-12b-kto | aee0374e5a43e950c9977b0004dede1c57be2999 | 18.882085 | apache-2.0 | 41 | 12 | true | false | false | true | 1.609063 | 0.386558 | 38.655767 | 0.507696 | 29.625059 | 0.046073 | 4.607251 | 0.293624 | 5.816555 | 0.408635 | 9.979427 | 0.321476 | 24.608452 | false | false | 2024-08-12 | 2024-08-29 | 2 | mistralai/Mistral-Nemo-Base-2407 |
anthracite-org_magnum-v3-27b-kto_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-27b-kto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-27b-kto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-27b-kto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-27b-kto | 96fbb750b3150e5fe9d6d2fcf757f49310d99a43 | 29.097906 | gemma | 11 | 27 | true | false | false | true | 3.937534 | 0.567483 | 56.748317 | 0.586041 | 41.160103 | 0.166918 | 16.691843 | 0.355705 | 14.09396 | 0.385469 | 9.916927 | 0.423787 | 35.976285 | false | false | 2024-09-06 | 2024-09-15 | 1 | anthracite-org/magnum-v3-27b-kto (Merge) |
anthracite-org_magnum-v3-34b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-34b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-34b | 3bcd8c3dbb93021a5ce22203c690a1a084cafb73 | 29.666081 | apache-2.0 | 28 | 34 | true | false | false | true | 6.18931 | 0.511529 | 51.152941 | 0.608783 | 44.327903 | 0.194864 | 19.486405 | 0.360738 | 14.765101 | 0.38724 | 6.571615 | 0.475233 | 41.692524 | false | false | 2024-08-22 | 2024-09-18 | 0 | anthracite-org/magnum-v3-34b |
anthracite-org_magnum-v3-9b-chatml_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-9b-chatml" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-9b-chatml</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-9b-chatml-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-9b-chatml | 96c2d023c56ef73be095ffbae8cedd7243ebca84 | 19.416 | gemma | 22 | 9 | true | false | false | false | 2.889965 | 0.127471 | 12.747067 | 0.542769 | 35.317875 | 0.064199 | 6.41994 | 0.345638 | 12.751678 | 0.443229 | 13.236979 | 0.424202 | 36.022459 | false | false | 2024-08-27 | 2024-09-18 | 1 | IntervitensInc/gemma-2-9b-chatml |
anthracite-org_magnum-v3-9b-customgemma2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-9b-customgemma2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-9b-customgemma2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-9b-customgemma2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-9b-customgemma2 | 9a7cd3d47434bed2bd80e34e45c74e413f8baaa8 | 19.137327 | gemma | 17 | 9 | true | false | false | false | 2.901825 | 0.127296 | 12.729558 | 0.534014 | 34.116783 | 0.067976 | 6.797583 | 0.328859 | 10.514541 | 0.456469 | 15.058594 | 0.420462 | 35.6069 | false | false | 2024-08-27 | 2024-09-18 | 1 | google/gemma-2-9b |
anthracite-org_magnum-v4-12b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-12b | 704f2ccfe662052e415499e56789dd88ec01a113 | 19.949136 | apache-2.0 | 36 | 12 | true | false | false | false | 1.699016 | 0.339296 | 33.92964 | 0.517669 | 30.503902 | 0.098187 | 9.818731 | 0.296141 | 6.152125 | 0.409281 | 10.360156 | 0.360372 | 28.93026 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-12b |
anthracite-org_magnum-v4-22b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-22b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-22b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-22b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-22b | e5239e71d2628269b453a832de98c1ecb79d2557 | 27.7159 | other | 15 | 22 | true | false | false | false | 1.65029 | 0.562862 | 56.286209 | 0.548612 | 35.549149 | 0.191843 | 19.18429 | 0.32802 | 10.402685 | 0.440781 | 13.43099 | 0.382979 | 31.44208 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-22b |
anthracite-org_magnum-v4-27b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-27b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-27b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-27b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-27b | 50a14716bdeb6a9376b9377df31ab1497864f3f9 | 26.330889 | gemma | 9 | 27 | true | false | false | false | 5.736354 | 0.345417 | 34.541683 | 0.58673 | 40.960384 | 0.161631 | 16.163142 | 0.369966 | 15.995526 | 0.43799 | 12.815365 | 0.437583 | 37.509235 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-27b |
anthracite-org_magnum-v4-9b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-9b | e9db6cb80f02ca2e2db4538ef59f7a30f69a849d | 23.773818 | gemma | 11 | 9 | true | false | false | false | 2.556326 | 0.350263 | 35.026286 | 0.533642 | 33.270404 | 0.129154 | 12.915408 | 0.347315 | 12.975391 | 0.451573 | 15.646615 | 0.395279 | 32.808806 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-9b |
apple_DCLM-7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OpenLMModel | <a target="_blank" href="https://huggingface.co/apple/DCLM-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">apple/DCLM-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/apple__DCLM-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | apple/DCLM-7B | c85bfa168f999ce27e954808bc005a2748fda5c5 | 13.986977 | apple-ascl | 825 | 7 | true | false | false | false | 0.629956 | 0.217272 | 21.727239 | 0.423214 | 19.760935 | 0.029456 | 2.945619 | 0.315436 | 8.724832 | 0.392073 | 7.309115 | 0.311087 | 23.454122 | false | false | 2024-07-11 | 2024-08-16 | 0 | apple/DCLM-7B |
arcee-ai_Arcee-Nova_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Nova" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Nova</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Nova-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Nova | ec3bfe88b83f81481daa04b6789c1e0d32827dc5 | 43.902335 | other | 44 | 72 | true | false | false | true | 11.493294 | 0.790749 | 79.074855 | 0.694197 | 56.740988 | 0.429003 | 42.900302 | 0.385067 | 18.008949 | 0.456167 | 17.220833 | 0.545213 | 49.468085 | false | false | 2024-07-16 | 2024-09-19 | 0 | arcee-ai/Arcee-Nova |
arcee-ai_Arcee-Spark_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Spark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Spark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Spark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Spark | 3fe368ea5fd32bc4a8d1bcf42510416f7fa28668 | 25.536456 | apache-2.0 | 86 | 7 | true | false | false | true | 1.098533 | 0.562087 | 56.208748 | 0.548947 | 37.138522 | 0.123112 | 12.311178 | 0.307047 | 7.606264 | 0.402094 | 8.595052 | 0.382231 | 31.358969 | false | false | 2024-06-22 | 2024-06-26 | 0 | arcee-ai/Arcee-Spark |
arcee-ai_Arcee-Spark_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Spark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Spark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Spark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Spark | 3fe368ea5fd32bc4a8d1bcf42510416f7fa28668 | 25.443169 | apache-2.0 | 86 | 7 | true | false | false | true | 1.13604 | 0.571829 | 57.182941 | 0.548086 | 36.92439 | 0.114048 | 11.404834 | 0.306208 | 7.494407 | 0.40076 | 8.395052 | 0.381316 | 31.257388 | false | false | 2024-06-22 | 2024-06-26 | 0 | arcee-ai/Arcee-Spark |
arcee-ai_Llama-3.1-SuperNova-Lite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Llama-3.1-SuperNova-Lite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Llama-3.1-SuperNova-Lite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Llama-3.1-SuperNova-Lite | 76246ca4448c1a11787daee0958b60ab27f17774 | 30.042407 | llama3 | 175 | 8 | true | false | false | true | 0.855993 | 0.801739 | 80.173938 | 0.515199 | 31.57234 | 0.173716 | 17.371601 | 0.306208 | 7.494407 | 0.416323 | 11.673698 | 0.387716 | 31.968454 | false | false | 2024-09-10 | 2024-09-17 | 2 | meta-llama/Meta-Llama-3.1-8B |
arcee-ai_Llama-Spark_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Llama-Spark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Llama-Spark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Llama-Spark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Llama-Spark | 6d74a617fbb17a1ada08528f2673c89f84fb062e | 24.922433 | llama3 | 25 | 8 | true | false | false | true | 0.830714 | 0.791073 | 79.107324 | 0.50535 | 29.770254 | 0.012085 | 1.208459 | 0.299497 | 6.599553 | 0.359333 | 2.616667 | 0.372091 | 30.232343 | false | false | 2024-07-26 | 2024-08-08 | 0 | arcee-ai/Llama-Spark |
arcee-ai_SuperNova-Medius_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/SuperNova-Medius" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/SuperNova-Medius</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__SuperNova-Medius-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/SuperNova-Medius | e34fafcac2801be1ae5c7eb744e191a08119f2af | 33.892471 | apache-2.0 | 192 | 14 | true | false | false | true | 5.867812 | 0.718358 | 71.83584 | 0.637728 | 48.005015 | 0.153323 | 15.332326 | 0.333054 | 11.073826 | 0.423271 | 12.275521 | 0.503491 | 44.832299 | false | false | 2024-10-02 | 2024-10-22 | 1 | arcee-ai/SuperNova-Medius (Merge) |
arcee-ai_Virtuoso-Small_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Virtuoso-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Virtuoso-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Virtuoso-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Virtuoso-Small | ca5dec1c6351ba6f2f0c59e609b94628a29c1459 | 39.428323 | apache-2.0 | 32 | 14 | true | false | false | true | 1.514314 | 0.793521 | 79.352119 | 0.651763 | 50.399846 | 0.3429 | 34.29003 | 0.336409 | 11.521253 | 0.433906 | 14.438281 | 0.519116 | 46.56841 | false | false | 2024-12-01 | 2024-12-03 | 1 | arcee-ai/Virtuoso-Small (Merge) |
arcee-ai_raspberry-3B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/raspberry-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/raspberry-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__raspberry-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/raspberry-3B | 66bf1346c060bbfe1f1b98cd22e7a26ada69cf70 | 15.538003 | other | 36 | 3 | true | false | false | true | 1.036527 | 0.315416 | 31.541643 | 0.426893 | 19.528234 | 0.084592 | 8.459215 | 0.277685 | 3.691275 | 0.412323 | 9.407031 | 0.285406 | 20.600621 | false | false | 2024-10-05 | 2024-10-07 | 1 | Qwen/Qwen2.5-3B |
argilla_notus-7b-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/argilla/notus-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notus-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla__notus-7b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | argilla/notus-7b-v1 | 30172203a2d41cb487bf7e2b92a821080783b2c9 | 18.411321 | mit | 122 | 7 | true | false | false | true | 0.667908 | 0.508207 | 50.820711 | 0.451186 | 22.747112 | 0.027946 | 2.794562 | 0.28943 | 5.257271 | 0.336417 | 6.585417 | 0.300366 | 22.262855 | false | true | 2023-11-16 | 2024-06-27 | 2 | mistralai/Mistral-7B-v0.1 |
argilla_notux-8x7b-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/argilla/notux-8x7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notux-8x7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla__notux-8x7b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | argilla/notux-8x7b-v1 | 0b29f9afcbae2ab4c5085638d8f5a7f6d44c6b17 | 24.428231 | apache-2.0 | 165 | 46 | true | true | false | true | 21.390845 | 0.542229 | 54.222906 | 0.53633 | 34.758062 | 0.096677 | 9.667674 | 0.308725 | 7.829978 | 0.417594 | 10.532552 | 0.366024 | 29.558215 | false | true | 2023-12-12 | 2024-06-12 | 2 | mistralai/Mixtral-8x7B-v0.1 |
argilla-warehouse_Llama-3.1-8B-MagPie-Ultra_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/argilla-warehouse/Llama-3.1-8B-MagPie-Ultra" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla-warehouse/Llama-3.1-8B-MagPie-Ultra</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla-warehouse__Llama-3.1-8B-MagPie-Ultra-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | argilla-warehouse/Llama-3.1-8B-MagPie-Ultra | 1e12f20ca5db84f65a6db793a65100433aac0ac6 | 19.45876 | llama3.1 | 1 | 8 | true | false | false | true | 0.965677 | 0.575651 | 57.565149 | 0.461961 | 23.51631 | 0.053625 | 5.362538 | 0.266779 | 2.237136 | 0.35425 | 4.247917 | 0.314412 | 23.823508 | false | true | 2024-09-26 | 2024-09-30 | 1 | meta-llama/Llama-3.1-8B |
arisin_orca-platypus-13B-slerp_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arisin/orca-platypus-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arisin/orca-platypus-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arisin__orca-platypus-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arisin/orca-platypus-13B-slerp | 679c8aa21e7d0ba79584a4b5eb352ecf26bd7096 | 14.766731 | apache-2.0 | 0 | 13 | true | false | false | false | 0.939063 | 0.267181 | 26.718108 | 0.463062 | 24.403766 | 0.01435 | 1.435045 | 0.298658 | 6.487696 | 0.425313 | 11.864063 | 0.259225 | 17.691711 | true | false | 2024-11-23 | 2024-11-23 | 0 | arisin/orca-platypus-13B-slerp |
asharsha30_LLAMA_Harsha_8_B_ORDP_10k_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/asharsha30/LLAMA_Harsha_8_B_ORDP_10k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">asharsha30/LLAMA_Harsha_8_B_ORDP_10k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/asharsha30__LLAMA_Harsha_8_B_ORDP_10k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | asharsha30/LLAMA_Harsha_8_B_ORDP_10k | c9b04b40cd3915f0576659aafba86b85c22a7ee8 | 15.982187 | apache-2.0 | 0 | 8 | true | false | false | true | 0.741119 | 0.346391 | 34.639091 | 0.466871 | 25.725678 | 0.052115 | 5.21148 | 0.27349 | 3.131991 | 0.369656 | 7.073698 | 0.281001 | 20.111185 | false | false | 2024-12-01 | 2024-12-01 | 1 | asharsha30/LLAMA_Harsha_8_B_ORDP_10k (Merge) |
ashercn97_a1-v0.0.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ashercn97/a1-v0.0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ashercn97/a1-v0.0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ashercn97__a1-v0.0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ashercn97/a1-v0.0.1 | 83760a8e55b312880f57247c0d9a4a25a0f2e528 | 20.819704 | 0 | 7 | false | false | false | false | 1.082046 | 0.219844 | 21.984446 | 0.518812 | 32.755432 | 0.169184 | 16.918429 | 0.311242 | 8.165548 | 0.411979 | 9.930729 | 0.416473 | 35.163638 | false | false | 2024-11-28 | 2024-11-28 | 0 | ashercn97/a1-v0.0.1 |
ashercn97_a1-v002_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ashercn97/a1-v002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ashercn97/a1-v002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ashercn97__a1-v002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ashercn97/a1-v002 | f6a33c4c83b57b3a76bc1e79e714bdf05f249ed6 | 21.761274 | apache-2.0 | 0 | 7 | true | false | false | false | 1.03892 | 0.258463 | 25.84631 | 0.526114 | 33.526527 | 0.166918 | 16.691843 | 0.318792 | 9.17226 | 0.415917 | 10.05625 | 0.41747 | 35.274453 | false | false | 2024-11-29 | 2024-11-29 | 2 | Qwen/Qwen2.5-7B |
athirdpath_Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/athirdpath__Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit | 42eaee4de10302fec7c0c20ad96f527cfb0b10a3 | 20.905595 | apache-2.0 | 1 | 8 | true | false | false | false | 0.886527 | 0.452104 | 45.210375 | 0.493907 | 28.015909 | 0.098187 | 9.818731 | 0.291946 | 5.592841 | 0.386396 | 8.299479 | 0.356466 | 28.496232 | false | false | 2024-07-30 | 2024-08-01 | 1 | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1 |
automerger_YamshadowExperiment28-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/automerger/YamshadowExperiment28-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">automerger/YamshadowExperiment28-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/automerger__YamshadowExperiment28-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | automerger/YamshadowExperiment28-7B | 76972ed8aacba1fd14f78e6f8d347f087f8b6800 | 19.896858 | apache-2.0 | 23 | 7 | true | false | false | false | 0.584424 | 0.407016 | 40.701561 | 0.515003 | 31.980235 | 0.061934 | 6.193353 | 0.286913 | 4.9217 | 0.430615 | 12.69349 | 0.306017 | 22.89081 | true | false | 2024-03-18 | 2024-06-29 | 1 | automerger/YamshadowExperiment28-7B (Merge) |