Column schema of the leaderboard table, as reported by the dataset viewer (dtype, plus observed min–max range for numeric and string-length columns, or the number of distinct values for categorical ones):

| Column | Dtype | Range / distinct values |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 56 classes |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 1.03–52 |
| Hub License | string | 25 classes |
| Hub ❤️ | int64 | 0–5.93k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.27–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 441 classes |
| Submission Date | string | 185 classes |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
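The rows rendered below can also be queried programmatically. A minimal sketch with the `datasets` library, assuming the aggregated leaderboard table is published on the Hub as `open-llm-leaderboard/contents` (an assumption; the per-model details datasets linked in the Model column follow the `open-llm-leaderboard/<org>__<model>-details` naming visible in each row):

```python
# Minimal sketch, not the leaderboard's official API.
# Assumption: the aggregated table lives at "open-llm-leaderboard/contents"
# and its column names ("fullname", "Average ⬆️") mirror the schema above.
from datasets import load_dataset

contents = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep only the rows shown in this section (models under the "zelk12" namespace).
zelk12 = contents.filter(lambda row: row["fullname"].startswith("zelk12/"))

for row in zelk12:
    print(f'{row["fullname"]}: average {row["Average ⬆️"]:.2f}')
```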
eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
zelk12_MT5-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen1-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen1-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen1-gemma-2-9B-details) | zelk12/MT5-Gen1-gemma-2-9B | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 31.897632 | | 0 | 10.159 | false | false | false | true | 2.017253 | 0.78313 | 78.312987 | 0.611048 | 44.183335 | 0.068731 | 6.873112 | 0.347315 | 12.975391 | 0.420385 | 11.614844 | 0.436835 | 37.426123 | false | false | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |
zelk12_MT5-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen2-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen2-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen2-gemma-2-9B-details) | zelk12/MT5-Gen2-gemma-2-9B | 3ee2822fcba6708bd9032b79249a2789e5996b6a | 32.600392 | | 1 | 10.159 | false | false | false | true | 1.858381 | 0.796244 | 79.624397 | 0.610541 | 44.113215 | 0.103474 | 10.347432 | 0.35151 | 13.534676 | 0.416292 | 10.436458 | 0.437916 | 37.546173 | false | false | 2024-11-23 | 2024-11-23 | 1 | zelk12/MT5-Gen2-gemma-2-9B (Merge) |
zelk12_MT5-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen3-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen3-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen3-gemma-2-9B-details) | zelk12/MT5-Gen3-gemma-2-9B | 4b3811c689fec5c9cc483bb1ed696734e5e88fcf | 32.801838 | | 0 | 10.159 | false | false | false | true | 1.937333 | 0.78253 | 78.253035 | 0.609049 | 43.885913 | 0.115559 | 11.555891 | 0.35151 | 13.534676 | 0.423052 | 12.08151 | 0.4375 | 37.5 | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT5-Gen3-gemma-2-9B (Merge) |
zelk12_MT5-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen4-gemma-2-9B](https://huggingface.co/zelk12/MT5-Gen4-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen4-gemma-2-9B-details) | zelk12/MT5-Gen4-gemma-2-9B | 2f826d76460a5b7f150622a57f2d5419adfc464f | 33.765135 | gemma | 0 | 5.08 | true | false | false | true | 1.82172 | 0.783455 | 78.345457 | 0.613106 | 44.323211 | 0.170695 | 17.069486 | 0.353188 | 13.758389 | 0.422833 | 11.354167 | 0.439661 | 37.7401 | true | false | 2024-12-20 | 2024-12-20 | 1 | zelk12/MT5-Gen4-gemma-2-9B (Merge) |
zelk12_MT5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-gemma-2-9B](https://huggingface.co/zelk12/MT5-gemma-2-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details) | zelk12/MT5-gemma-2-9B | b627ae7d796b1ae85b59c55e0e043b8d3ae73d83 | 32.595305 | | 0 | 10.159 | false | false | false | true | 3.26983 | 0.804787 | 80.478685 | 0.611223 | 44.271257 | 0.095166 | 9.516616 | 0.343121 | 12.416107 | 0.420385 | 11.48151 | 0.436669 | 37.407654 | false | false | 2024-10-19 | 2024-10-21 | 1 | zelk12/MT5-gemma-2-9B (Merge) |
zelk12_gemma-2-S2MTM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/gemma-2-S2MTM-9B](https://huggingface.co/zelk12/gemma-2-S2MTM-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__gemma-2-S2MTM-9B-details) | zelk12/gemma-2-S2MTM-9B | fd6860743943114eeca6fc2e800e27c87873bcc5 | 31.148621 | gemma | 0 | 9 | true | false | false | true | 1.765103 | 0.782256 | 78.225553 | 0.606084 | 43.115728 | 0.04003 | 4.003021 | 0.345638 | 12.751678 | 0.421844 | 12.163802 | 0.429688 | 36.631944 | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/gemma-2-S2MTM-9B (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | b4208ddf6c741884c16c77b9433d9ead8f216354 | 30.344893 | | 2 | 10.159 | false | false | false | true | 3.443191 | 0.764895 | 76.489492 | 0.607451 | 43.706516 | 0.013595 | 1.359517 | 0.349832 | 13.310962 | 0.413625 | 10.303125 | 0.432098 | 36.899749 | false | false | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | e652c9e07265526851dad994f4640aa265b9ab56 | 33.300246 | | 1 | 10.159 | false | false | false | true | 3.194991 | 0.770665 | 77.066517 | 0.607543 | 43.85035 | 0.155589 | 15.558912 | 0.343121 | 12.416107 | 0.43226 | 13.132552 | 0.439993 | 37.777039 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | eb0e589291630ba20328db650f74af949d217a97 | 28.421762 | | 0 | 10.159 | false | false | false | true | 3.751453 | 0.720806 | 72.080635 | 0.59952 | 42.487153 | 0 | 0 | 0.349832 | 13.310962 | 0.395115 | 7.75599 | 0.414063 | 34.895833 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 | 76f56b25bf6d8704282f8c77bfda28ca384883bc | 30.113979 | | 1 | 10.159 | false | false | false | true | 3.413675 | 0.759999 | 75.999902 | 0.606626 | 43.633588 | 0.012085 | 1.208459 | 0.348154 | 13.087248 | 0.410958 | 9.836458 | 0.432264 | 36.918218 | false | false | 2024-10-07 | 2024-10-11 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge) |
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details) | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 31.626376 | | 1 | 10.159 | false | false | false | true | 6.461752 | 0.761523 | 76.152276 | 0.609878 | 43.941258 | 0.073263 | 7.326284 | 0.341443 | 12.192394 | 0.431021 | 13.310937 | 0.431516 | 36.835106 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ifable-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 32.254423 | | 0 | 10.159 | false | false | false | true | 6.542869 | 0.794396 | 79.439554 | 0.60644 | 43.39057 | 0.09139 | 9.138973 | 0.35151 | 13.534676 | 0.420229 | 11.095313 | 0.432347 | 36.927453 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details) | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.448061 | | 0 | 10.159 | false | false | false | true | 3.13222 | 0.744537 | 74.453672 | 0.597759 | 42.132683 | 0.180514 | 18.05136 | 0.34396 | 12.527964 | 0.429469 | 12.183594 | 0.418052 | 35.339096 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-72B-Instruct-abliterated](https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details) | zetasepic/Qwen2.5-72B-Instruct-abliterated | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 45.293139 | other | 15 | 72.706 | true | false | false | false | 18.809182 | 0.715261 | 71.526106 | 0.715226 | 59.912976 | 0.46148 | 46.148036 | 0.406879 | 20.917226 | 0.471917 | 19.122917 | 0.587184 | 54.131575 | false | false | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details) | zhengr/MixTAO-7Bx2-MoE-v8.1 | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.168311 | apache-2.0 | 55 | 12.879 | true | true | false | true | 0.92739 | 0.418781 | 41.878106 | 0.420194 | 19.176907 | 0.066465 | 6.646526 | 0.298658 | 6.487696 | 0.397625 | 8.303125 | 0.284658 | 20.517509 | false | false | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |
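A consistency check on the figures above: each benchmark's normalized column is its Raw counterpart rescaled to a 0–100 scale (for IFEval the value matches raw × 100 up to display rounding; the other benchmarks also appear to be shifted for their random-guess baselines, which is why a BBH Raw of 0.611048 maps to 44.183335 rather than 61.1), and Average ⬆️ is the plain arithmetic mean of the six normalized scores. A short sketch reproducing the first data row (zelk12/MT5-Gen1-gemma-2-9B):

```python
# Consistency check: "Average ⬆️" equals the arithmetic mean of the six
# normalized benchmark columns. Values copied from the first data row above
# (zelk12/MT5-Gen1-gemma-2-9B).
normalized_scores = {
    "IFEval": 78.312987,
    "BBH": 44.183335,
    "MATH Lvl 5": 6.873112,
    "GPQA": 12.975391,
    "MUSR": 11.614844,
    "MMLU-PRO": 37.426123,
}

average = sum(normalized_scores.values()) / len(normalized_scores)
print(round(average, 6))  # 31.897632 -- matches the row's "Average ⬆️"
```

The same identity holds for the other rows; for instance, the six normalized scores of the zhengr/MixTAO-7Bx2-MoE-v8.1 row also average to 17.168311.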