eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pankajmathur_orca_mini_v2_7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v2_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v2_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v2_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v2_7b | 66d3f32a4a6bca0a2a261f1bdb54d2582028f75f | 5.502369 | cc-by-nc-sa-4.0 | 36 | 7 | true | false | false | false | 0.592511 | 0.135789 | 13.57886 | 0.353634 | 10.199953 | 0.011329 | 1.132931 | 0.249161 | 0 | 0.359333 | 2.083333 | 0.154172 | 6.019134 | false | false | 2023-07-03 | 2024-06-26 | 0 | pankajmathur/orca_mini_v2_7b |
pankajmathur_orca_mini_v3_13b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v3_13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v3_13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v3_13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v3_13b | 7d6e567d24ce2f228beaf54e89c17b0e750bfe99 | 15.016121 | other | 31 | 13 | true | false | false | false | 1.097179 | 0.289663 | 28.966254 | 0.471097 | 25.549482 | 0.019637 | 1.963746 | 0.265101 | 2.013423 | 0.459792 | 17.107292 | 0.230469 | 14.496528 | false | false | 2023-08-09 | 2024-06-26 | 0 | pankajmathur/orca_mini_v3_13b |
pankajmathur_orca_mini_v3_70b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v3_70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v3_70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v3_70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v3_70b | e8e856dfb5c737d1906b50f9e65fd3a4f8d77422 | 25.298159 | other | 23 | 70 | true | false | false | false | 6.406537 | 0.40147 | 40.147032 | 0.594931 | 42.975787 | 0.03852 | 3.851964 | 0.317953 | 9.060403 | 0.507854 | 25.115104 | 0.375748 | 30.638667 | false | false | 2023-08-10 | 2024-06-26 | 0 | pankajmathur/orca_mini_v3_70b |
pankajmathur_orca_mini_v3_7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v3_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v3_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v3_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v3_7b | 6252eb7ca29da8d951ae7d2bca948bf84e04a2b9 | 13.51814 | other | 40 | 7 | true | false | false | false | 0.63995 | 0.282094 | 28.209373 | 0.409533 | 17.843956 | 0.003021 | 0.302115 | 0.246644 | 0 | 0.49824 | 22.713281 | 0.208361 | 12.040115 | false | false | 2023-08-07 | 2024-06-26 | 0 | pankajmathur/orca_mini_v3_7b |
pankajmathur_orca_mini_v5_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v5_8b | f57c84d4cc0b3b74549458c0d38e868bd7fffad1 | 20.246538 | llama3 | 2 | 8 | true | false | false | false | 0.878971 | 0.480605 | 48.06048 | 0.506424 | 29.345795 | 0.083837 | 8.383686 | 0.286913 | 4.9217 | 0.40001 | 7.701302 | 0.307596 | 23.066268 | false | false | 2024-05-26 | 2024-06-26 | 0 | pankajmathur/orca_mini_v5_8b |
pankajmathur_orca_mini_v5_8b_dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v5_8b_dpo | fdc0d0aaa85a58f1abaf2c24ce0ddca10c08f0f1 | 20.057268 | llama3 | 2 | 8 | true | false | false | false | 0.81669 | 0.489647 | 48.964747 | 0.50746 | 29.605373 | 0.080816 | 8.081571 | 0.274329 | 3.243848 | 0.389375 | 6.938542 | 0.311586 | 23.50953 | false | false | 2024-05-30 | 2024-06-26 | 0 | pankajmathur/orca_mini_v5_8b_dpo |
pankajmathur_orca_mini_v5_8b_orpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b_orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b_orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b_orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v5_8b_orpo | 4cdc018043ef439f15bd8a09c4f09c6bc528dfc7 | 12.968554 | llama3 | 1 | 8 | true | false | false | false | 0.97185 | 0.082432 | 8.243239 | 0.496374 | 27.877628 | 0.064955 | 6.495468 | 0.284396 | 4.58613 | 0.413125 | 8.973958 | 0.294714 | 21.6349 | false | false | 2024-05-31 | 2024-06-26 | 0 | pankajmathur/orca_mini_v5_8b_orpo |
pankajmathur_orca_mini_v6_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v6_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v6_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v6_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v6_8b | e95dc8e4c6b6ca5957b657cc2d905683142eaf3e | 1.413398 | llama3 | 1 | 8 | true | false | false | true | 1.216368 | 0.011116 | 1.111606 | 0.30287 | 3.21981 | 0 | 0 | 0.238255 | 0 | 0.355458 | 2.765625 | 0.11245 | 1.383348 | false | false | 2024-06-02 | 2024-06-26 | 0 | pankajmathur/orca_mini_v6_8b |
pankajmathur_orca_mini_v6_8b_dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v6_8b_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v6_8b_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v6_8b_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v6_8b_dpo | ebb11b63839d38e8c03c7ecac012e047fcb2346e | 20.392492 | llama3 | 1 | 8 | true | false | false | false | 0.769824 | 0.388256 | 38.825649 | 0.520281 | 32.478826 | 0.061178 | 6.117825 | 0.301174 | 6.823266 | 0.409031 | 9.26224 | 0.359624 | 28.847148 | false | false | 2024-06-21 | 2024-06-26 | 0 | pankajmathur/orca_mini_v6_8b_dpo |
pankajmathur_orca_mini_v7_72b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v7_72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v7_72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v7_72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v7_72b | 447f11912cfa496e32e188a55214043a05760d3a | 39.387496 | apache-2.0 | 11 | 72 | true | false | false | false | 14.051707 | 0.592962 | 59.296223 | 0.68423 | 55.055523 | 0.283988 | 28.398792 | 0.385067 | 18.008949 | 0.507042 | 24.213542 | 0.562168 | 51.35195 | false | false | 2024-06-26 | 2024-06-26 | 0 | pankajmathur/orca_mini_v7_72b |
pankajmathur_orca_mini_v7_7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v7_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v7_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v7_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v7_7b | f5e84ff6ea25fb4585908ea45d1520bac416d803 | 22.425578 | apache-2.0 | 2 | 7 | true | false | false | false | 0.925109 | 0.438765 | 43.87647 | 0.527491 | 33.950434 | 0.02719 | 2.719033 | 0.296141 | 6.152125 | 0.435979 | 12.664063 | 0.416722 | 35.191342 | false | false | 2024-06-20 | 2024-06-26 | 0 | pankajmathur/orca_mini_v7_7b |
paulml_ECE-ILAB-Q1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/paulml/ECE-ILAB-Q1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paulml/ECE-ILAB-Q1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paulml__ECE-ILAB-Q1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | paulml/ECE-ILAB-Q1 | 393bea0ee85e4c752acd5fd77ce07f577fc13bd9 | 41.307201 | other | 0 | 72 | true | false | false | false | 11.415142 | 0.786452 | 78.645217 | 0.671776 | 53.702228 | 0.283988 | 28.398792 | 0.386745 | 18.232662 | 0.461375 | 18.805208 | 0.550532 | 50.059102 | true | false | 2024-06-06 | 2024-09-16 | 0 | paulml/ECE-ILAB-Q1 |
pints-ai_1.5-Pints-16K-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pints-ai/1.5-Pints-16K-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pints-ai/1.5-Pints-16K-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pints-ai__1.5-Pints-16K-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pints-ai/1.5-Pints-16K-v0.1 | 7862a52f250be68fad593f3a4030f00d658ede56 | 4.150223 | mit | 14 | 1 | true | false | false | true | 0.279938 | 0.163591 | 16.359149 | 0.313308 | 3.658292 | 0.008308 | 0.830816 | 0.235738 | 0 | 0.357875 | 2.734375 | 0.111868 | 1.318706 | false | false | 2024-08-07 | 2024-09-09 | 0 | pints-ai/1.5-Pints-16K-v0.1 |
pints-ai_1.5-Pints-2K-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pints-ai/1.5-Pints-2K-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pints-ai/1.5-Pints-2K-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pints-ai__1.5-Pints-2K-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pints-ai/1.5-Pints-2K-v0.1 | 2e865c18669161ebbf5e9ad79ae0502ee0153df0 | 3.830442 | mit | 16 | 1 | true | false | false | true | 0.291417 | 0.176156 | 17.615593 | 0.298019 | 2.37447 | 0 | 0 | 0.248322 | 0 | 0.350187 | 1.840104 | 0.110372 | 1.152482 | false | false | 2024-08-07 | 2024-09-09 | 0 | pints-ai/1.5-Pints-2K-v0.1 |
piotr25691_thea-3b-25r_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/piotr25691/thea-3b-25r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">piotr25691/thea-3b-25r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-3b-25r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | piotr25691/thea-3b-25r | 4661fb3c8b18bdf2059f703c4f69caea24057151 | 24.021247 | llama3.2 | 1 | 3 | true | false | false | true | 0.690508 | 0.73442 | 73.442023 | 0.448441 | 22.546711 | 0.179758 | 17.975831 | 0.267617 | 2.348993 | 0.331458 | 3.565625 | 0.318235 | 24.248301 | false | false | 2024-10-11 | 2024-10-12 | 1 | chuanli11/Llama-3.2-3B-Instruct-uncensored |
piotr25691_thea-c-3b-25r_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/piotr25691/thea-c-3b-25r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">piotr25691/thea-c-3b-25r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-c-3b-25r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | piotr25691/thea-c-3b-25r | 93a2333a84feda26f020bc8fa92f870462dacd89 | 23.179267 | llama3.2 | 1 | 3 | true | false | false | true | 0.662456 | 0.74019 | 74.019047 | 0.453241 | 22.76785 | 0.148036 | 14.803625 | 0.265101 | 2.013423 | 0.33149 | 1.269531 | 0.317819 | 24.202128 | false | false | 2024-10-14 | 2024-10-17 | 1 | meta-llama/Llama-3.2-3B-Instruct |
piotr25691_thea-rp-3b-25r_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/piotr25691/thea-rp-3b-25r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">piotr25691/thea-rp-3b-25r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-rp-3b-25r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | piotr25691/thea-rp-3b-25r | ed4c338e07356f1657cf4d08b768ff866bbf0a68 | 21.732089 | llama3.2 | 0 | 3 | true | false | false | true | 0.658453 | 0.657784 | 65.778357 | 0.439029 | 20.007381 | 0.125378 | 12.537764 | 0.274329 | 3.243848 | 0.381875 | 5.934375 | 0.306017 | 22.89081 | false | false | 2024-10-13 | 2024-10-16 | 1 | SicariusSicariiStuff/Impish_LLAMA_3B |
postbot_gpt2-medium-emailgen_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | GPT2LMHeadModel | <a target="_blank" href="https://huggingface.co/postbot/gpt2-medium-emailgen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">postbot/gpt2-medium-emailgen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/postbot__gpt2-medium-emailgen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | postbot/gpt2-medium-emailgen | a0299eb6760126e3bd04d2f10cd166c4563f82d2 | 4.743048 | apache-2.0 | 6 | 0 | true | false | false | false | 0.078186 | 0.149203 | 14.9203 | 0.313043 | 3.6737 | 0 | 0 | 0.260067 | 1.342282 | 0.391115 | 6.889323 | 0.114694 | 1.632683 | false | false | 2022-09-29 | 2024-11-17 | 0 | postbot/gpt2-medium-emailgen |
prince-canuma_Ministral-8B-Instruct-2410-HF_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/prince-canuma/Ministral-8B-Instruct-2410-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/Ministral-8B-Instruct-2410-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prince-canuma__Ministral-8B-Instruct-2410-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prince-canuma/Ministral-8B-Instruct-2410-HF | e0a14d7a6a8a1d1e5bef1a77a42e86e8bcae0ee7 | 21.680297 | other | 10 | 8 | true | false | false | true | 1.016935 | 0.591164 | 59.116367 | 0.458561 | 23.778465 | 0.067976 | 6.797583 | 0.28104 | 4.138702 | 0.41375 | 10.71875 | 0.329787 | 25.531915 | false | false | 2024-10-16 | 2024-10-17 | 1 | prince-canuma/Ministral-8B-Instruct-2410-HF (Merge) |
princeton-nlp_Llama-3-8B-ProLong-512k-Base_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-512k-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-8B-ProLong-512k-Base | 51a333f7c99f5052377154b76909dfe63ff7ab83 | 21.679045 | llama3 | 7 | 8 | true | false | false | true | 0.878664 | 0.532212 | 53.221231 | 0.503321 | 29.847246 | 0.068731 | 6.873112 | 0.261745 | 1.565996 | 0.422271 | 12.683854 | 0.332945 | 25.882831 | false | false | 2024-08-22 | 2024-10-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-512k-Base (Merge) |
princeton-nlp_Llama-3-8B-ProLong-512k-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-512k-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | eae0626e8597575215276c2b248720f731bc50b8 | 21.942344 | llama3 | 17 | 8 | true | false | false | true | 2.344706 | 0.550822 | 55.082182 | 0.502831 | 29.151153 | 0.05287 | 5.287009 | 0.286074 | 4.809843 | 0.426646 | 12.530729 | 0.323138 | 24.793144 | false | false | 2024-08-22 | 2024-11-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct (Merge) |
princeton-nlp_Llama-3-8B-ProLong-512k-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-512k-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | bf92e493b7b0ef1db0242bfa97f1d8f92be02e9c | 19.317531 | llama3 | 17 | 8 | true | false | false | false | 0.724374 | 0.397773 | 39.777346 | 0.498303 | 28.669219 | 0.062689 | 6.268882 | 0.28104 | 4.138702 | 0.425 | 12.091667 | 0.324634 | 24.959368 | false | false | 2024-08-22 | 2024-11-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct (Merge) |
princeton-nlp_Llama-3-8B-ProLong-64k-Base_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-64k-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-64k-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-64k-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-8B-ProLong-64k-Base | 97994d6918f80162a893e22d5e7bba586551f941 | 21.601846 | llama3 | 5 | 8 | true | false | false | true | 1.822461 | 0.520072 | 52.00723 | 0.492713 | 28.687899 | 0.061934 | 6.193353 | 0.265101 | 2.013423 | 0.434052 | 14.623177 | 0.334774 | 26.085993 | false | false | 2024-07-22 | 2024-10-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-64k-Base (Merge) |
princeton-nlp_Llama-3-8B-ProLong-64k-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-64k-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-64k-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-64k-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct | fe55aed18544c5744239e473bb0d3aa0151776d3 | 22.970639 | llama3 | 13 | 8 | true | false | false | true | 1.656687 | 0.556317 | 55.631724 | 0.508304 | 30.089572 | 0.061934 | 6.193353 | 0.295302 | 6.040268 | 0.439698 | 14.595573 | 0.32746 | 25.273345 | false | false | 2024-07-21 | 2024-10-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct (Merge) |
princeton-nlp_Llama-3-Base-8B-SFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT | b622b7d814aa03aa722328bf88feaf1ad480b7fb | 15.437201 | 1 | 8 | true | false | false | true | 1.788286 | 0.309646 | 30.964618 | 0.454784 | 24.388576 | 0.030967 | 3.096677 | 0.283557 | 4.474273 | 0.394958 | 8.036458 | 0.294963 | 21.662603 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT |
|
princeton-nlp_Llama-3-Base-8B-SFT-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-CPO | 536ce7e7beb35175c48538fe46e7e9e100f228c9 | 15.878261 | 0 | 8 | true | false | false | true | 0.967846 | 0.370346 | 37.034624 | 0.459488 | 25.474649 | 0.049849 | 4.984894 | 0.274329 | 3.243848 | 0.360854 | 2.573438 | 0.297623 | 21.958112 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-CPO |
|
princeton-nlp_Llama-3-Base-8B-SFT-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-DPO | 3f5ec47c9beffb37cfbdcd837e76a336a9b1e651 | 18.162221 | 0 | 8 | true | false | false | true | 0.92634 | 0.411113 | 41.111251 | 0.466585 | 26.001874 | 0.028701 | 2.870091 | 0.310403 | 8.053691 | 0.38674 | 7.842448 | 0.307846 | 23.093972 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-DPO |
|
princeton-nlp_Llama-3-Base-8B-SFT-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-IPO | 85055cc4b9c707e0bd1239d20d1f62927a7a54c3 | 18.281889 | 0 | 8 | true | false | false | true | 0.932191 | 0.448656 | 44.865623 | 0.469007 | 25.705433 | 0.01284 | 1.283988 | 0.297819 | 6.375839 | 0.391948 | 7.960156 | 0.311503 | 23.500296 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-IPO |
|
princeton-nlp_Llama-3-Base-8B-SFT-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-KTO | 49a8c2e5ccc7a28ed7bbedf093e352015fc1eb9b | 17.964858 | 0 | 8 | true | false | false | true | 0.861852 | 0.452253 | 45.225335 | 0.469285 | 25.55523 | 0.012085 | 1.208459 | 0.305369 | 7.38255 | 0.384198 | 5.591406 | 0.305436 | 22.826167 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-KTO |
|
princeton-nlp_Llama-3-Base-8B-SFT-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-ORPO | 54d58402e0168faff6503e41621ad6c8274a310a | 19.192797 | 0 | 8 | true | false | false | true | 0.906563 | 0.451654 | 45.165383 | 0.473406 | 26.485894 | 0.042296 | 4.229607 | 0.313758 | 8.501119 | 0.370677 | 7.634635 | 0.308261 | 23.140145 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-ORPO |
|
princeton-nlp_Llama-3-Base-8B-SFT-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-RDPO | b41a964c2135ba34dcc6fa7edf76b6b9ea656949 | 18.815011 | 0 | 8 | true | false | false | true | 0.902435 | 0.448007 | 44.800684 | 0.466201 | 25.526521 | 0.037764 | 3.776435 | 0.306208 | 7.494407 | 0.40274 | 8.909115 | 0.301446 | 22.382905 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-RDPO |
|
princeton-nlp_Llama-3-Base-8B-SFT-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-RRHF | aea8c04b3940cebd1f8296a2c76914f0ce70c276 | 16.081314 | 0 | 8 | true | false | false | true | 0.951469 | 0.335725 | 33.572477 | 0.452036 | 23.659142 | 0.033233 | 3.323263 | 0.305369 | 7.38255 | 0.372229 | 7.561979 | 0.288896 | 20.988475 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-RRHF |
|
princeton-nlp_Llama-3-Base-8B-SFT-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF | 325092c1eddffc3ca7157be1ff9958128e5753ef | 19.730525 | 0 | 8 | true | false | false | true | 0.960471 | 0.489048 | 48.904795 | 0.470408 | 26.373963 | 0.049849 | 4.984894 | 0.286913 | 4.9217 | 0.409094 | 10.270052 | 0.30635 | 22.927748 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF |
princeton-nlp_Llama-3-Base-8B-SFT-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Base-8B-SFT-SimPO | 0a6e518b13b67abe8433bce3f7beee9beb74a794 | 19.279456 | 0 | 8 | false | false | false | true | 0.861565 | 0.46854 | 46.854014 | 0.474125 | 26.39595 | 0.020393 | 2.039275 | 0.288591 | 5.145414 | 0.412688 | 11.852604 | 0.310505 | 23.38948 | false | false | 2024-05-24 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-SimPO |
princeton-nlp_Llama-3-Instruct-8B-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-CPO | d4645ae4c3b99892f1c59f60a77330be35567835 | 23.91096 | 0 | 8 | true | false | false | true | 0.739416 | 0.729299 | 72.929937 | 0.499879 | 28.604299 | 0.093656 | 9.365559 | 0.260067 | 1.342282 | 0.351396 | 1.757812 | 0.365193 | 29.465869 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-CPO |
princeton-nlp_Llama-3-Instruct-8B-CPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-CPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 | 5ed83728712693437bd547f4cd32923ac4e1172d | 24.821014 | 0 | 8 | true | false | false | true | 0.772886 | 0.750582 | 75.058179 | 0.502667 | 29.086407 | 0.10423 | 10.422961 | 0.260906 | 1.454139 | 0.361906 | 2.838281 | 0.370595 | 30.06612 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-DPO | 0afbf4c012ec7507f61c554999151b95a3651db3 | 22.667424 | 0 | 8 | true | false | false | true | 0.564854 | 0.675744 | 67.574369 | 0.49913 | 28.507392 | 0.034743 | 3.47432 | 0.271812 | 2.908277 | 0.373813 | 3.926563 | 0.366523 | 29.613623 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-DPO |
princeton-nlp_Llama-3-Instruct-8B-DPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-DPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2 | d06275e02abbeaf29d911a3c0cf22922dcca6b0b | 24.718027 | 0 | 8 | true | false | false | true | 0.60288 | 0.720806 | 72.080635 | 0.50562 | 28.939587 | 0.060423 | 6.042296 | 0.286913 | 4.9217 | 0.384448 | 5.55599 | 0.376912 | 30.767952 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-KTO | e697908201cbab01e0ca54088bb8cd2fd99b4574 | 22.789641 | 0 | 8 | true | false | false | true | 0.602517 | 0.68641 | 68.640984 | 0.49819 | 28.649658 | 0.034743 | 3.47432 | 0.276007 | 3.467562 | 0.369844 | 3.630469 | 0.359874 | 28.874852 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-KTO |
princeton-nlp_Llama-3-Instruct-8B-KTO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-KTO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2 | 477d33ea62ed57a0429517170612aa1df21c78d6 | 24.344687 | 0 | 8 | true | false | false | true | 0.630536 | 0.729025 | 72.902454 | 0.507977 | 29.648406 | 0.080816 | 8.081571 | 0.260067 | 1.342282 | 0.37775 | 4.452083 | 0.366772 | 29.641327 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-ORPO | 4bb3ffcf9ede48cb01a10bf3223eb41b59aa3fef | 23.534475 | 0 | 8 | true | false | false | true | 0.623904 | 0.712813 | 71.281311 | 0.500121 | 28.839356 | 0.073263 | 7.326284 | 0.258389 | 1.118568 | 0.350188 | 3.240104 | 0.364611 | 29.401226 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-ORPO |
princeton-nlp_Llama-3-Instruct-8B-ORPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-ORPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2 | 3ea5c542a3d8d61f6afb6cdbef5972a501ddf759 | 25.852852 | 1 | 8 | true | false | false | true | 0.594232 | 0.763321 | 76.332132 | 0.507835 | 29.604837 | 0.095166 | 9.516616 | 0.283557 | 4.474273 | 0.377969 | 4.846094 | 0.373088 | 30.343159 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-RDPO | 9497ca226a68981f42df2e5b3a4a1a2ea702a942 | 22.584117 | 0 | 8 | true | false | false | true | 0.56625 | 0.666002 | 66.600176 | 0.503363 | 29.032479 | 0.023414 | 2.34139 | 0.282718 | 4.362416 | 0.375208 | 4.201042 | 0.360705 | 28.967199 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-RDPO |
princeton-nlp_Llama-3-Instruct-8B-RDPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RDPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2 | 4e5bc9779cba3a2f615379d3f8ef1bbb3ea487f7 | 24.427995 | 1 | 8 | true | false | false | true | 0.557948 | 0.707692 | 70.769226 | 0.504922 | 28.854277 | 0.050604 | 5.060423 | 0.292785 | 5.704698 | 0.380448 | 5.35599 | 0.37741 | 30.82336 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-RRHF | 73561d9b0fd42b94250246f8d794251fe9f9d2e9 | 24.059318 | 0 | 8 | true | false | false | true | 0.639216 | 0.727451 | 72.745094 | 0.491055 | 27.216485 | 0.095166 | 9.516616 | 0.280201 | 4.026846 | 0.347552 | 1.477344 | 0.364362 | 29.373522 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-RRHF |
princeton-nlp_Llama-3-Instruct-8B-RRHF-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 | 81191fbb214d17f0a4fec247da5d648f4cb61ef1 | 23.753751 | 0 | 8 | true | false | false | true | 0.505873 | 0.712488 | 71.248842 | 0.49839 | 28.498724 | 0.087613 | 8.761329 | 0.260067 | 1.342282 | 0.373781 | 5.089323 | 0.348238 | 27.582004 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF | 7e9001f6f4fe940c363bb7ea1814d33c79b21737 | 25.056382 | 0 | 8 | true | false | false | true | 0.725192 | 0.739966 | 73.996551 | 0.502942 | 29.211612 | 0.082326 | 8.232628 | 0.286074 | 4.809843 | 0.372292 | 5.369792 | 0.358461 | 28.717863 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF |
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 | 1821cc42189d8dab9e157c31b223dc60fc037c2d | 23.728355 | 0 | 8 | true | false | false | true | 0.521239 | 0.710965 | 71.096468 | 0.49839 | 28.498724 | 0.087613 | 8.761329 | 0.260067 | 1.342282 | 0.373781 | 5.089323 | 0.348238 | 27.582004 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SimPO | f700cb6afb4509b10dea43ab72bb0e260e166be4 | 22.657116 | 55 | 8 | true | false | false | true | 0.533346 | 0.65039 | 65.038985 | 0.484468 | 26.709133 | 0.02568 | 2.567976 | 0.293624 | 5.816555 | 0.394833 | 8.154167 | 0.348903 | 27.655881 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-SimPO |
princeton-nlp_Llama-3-Instruct-8B-SimPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 | 9ac0fbee445e7755e50520e9881d67588b4b854c | 24.474601 | 5 | 8 | true | false | false | true | 0.579982 | 0.680865 | 68.086455 | 0.503834 | 29.214022 | 0.057402 | 5.740181 | 0.301174 | 6.823266 | 0.398802 | 7.85026 | 0.362201 | 29.133422 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 |
princeton-nlp_Mistral-7B-Base-SFT-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-CPO | 7f67394668b94a9ddfb64daff8976b48b135d96c | 17.373794 | 1 | 7 | true | false | false | true | 0.809769 | 0.465493 | 46.549267 | 0.438215 | 21.857696 | 0.026435 | 2.643505 | 0.291946 | 5.592841 | 0.407083 | 9.252083 | 0.265126 | 18.34737 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-CPO |
princeton-nlp_Mistral-7B-Base-SFT-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-DPO | 17134fd80cfbf3980353967a30dc6f450f18f78f | 16.236325 | 0 | 7 | true | false | false | true | 0.66762 | 0.440338 | 44.03383 | 0.435011 | 20.79098 | 0.016616 | 1.661631 | 0.272651 | 3.020134 | 0.412229 | 9.628646 | 0.264545 | 18.282728 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-DPO |
princeton-nlp_Mistral-7B-Base-SFT-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-IPO | eea781724e4d2ab8bdda7c13526f042de4cfae41 | 17.210428 | 0 | 7 | true | false | false | true | 0.667334 | 0.482953 | 48.295301 | 0.445802 | 23.703491 | 0.024924 | 2.492447 | 0.280201 | 4.026846 | 0.377625 | 4.836458 | 0.279172 | 19.908023 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-IPO |
princeton-nlp_Mistral-7B-Base-SFT-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-KTO | 02148bb9241b0f4bb0c75e93893eed005abe25e8 | 18.96264 | 0 | 7 | true | false | false | true | 0.666017 | 0.478482 | 47.848154 | 0.447643 | 23.107642 | 0.036254 | 3.625378 | 0.290268 | 5.369128 | 0.436781 | 13.03099 | 0.287151 | 20.794548 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-KTO |
princeton-nlp_Mistral-7B-Base-SFT-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-RDPO | 2a63a6d9e1978c99444e440371268f7c2b7e0375 | 16.465757 | 0 | 7 | true | false | false | true | 0.662505 | 0.460647 | 46.064664 | 0.443953 | 22.98201 | 0.020393 | 2.039275 | 0.277685 | 3.691275 | 0.357938 | 4.275521 | 0.277676 | 19.7418 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-RDPO |
princeton-nlp_Mistral-7B-Base-SFT-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-RRHF | 0d5861072e9d01f420451bf6a5b108bc8d3a76bc | 16.194613 | 0 | 7 | true | false | false | true | 0.669001 | 0.440663 | 44.0663 | 0.428059 | 19.598831 | 0.02568 | 2.567976 | 0.290268 | 5.369128 | 0.418677 | 10.034635 | 0.239777 | 15.530807 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-RRHF |
princeton-nlp_Mistral-7B-Base-SFT-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF | 65d2cc49ad05258da3d982b39682c7f672f5e4ab | 18.955533 | 0 | 7 | true | false | false | true | 0.668442 | 0.512728 | 51.272845 | 0.44224 | 22.304723 | 0.032477 | 3.247734 | 0.291946 | 5.592841 | 0.426083 | 11.527083 | 0.278092 | 19.787973 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF |
princeton-nlp_Mistral-7B-Base-SFT-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-SimPO | 9d9e8b8de4f673d45bc826efc4a1444f9d480222 | 16.893545 | 0 | 7 | true | false | false | true | 0.635706 | 0.470064 | 47.006387 | 0.439805 | 22.332886 | 0.006042 | 0.60423 | 0.283557 | 4.474273 | 0.397063 | 8.032813 | 0.270196 | 18.910683 | false | false | 2024-05-17 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Base-SFT-SimPO |
princeton-nlp_Mistral-7B-Instruct-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-CPO | 32492f8e5588f06005689ac944c2ea39c394c28e | 15.565535 | 0 | 7 | true | false | false | true | 0.645922 | 0.420305 | 42.030479 | 0.406922 | 17.248538 | 0.021903 | 2.190332 | 0.26594 | 2.12528 | 0.417844 | 10.897135 | 0.270113 | 18.901448 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-CPO |
princeton-nlp_Mistral-7B-Instruct-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-DPO | 5e96cff70d8db87cf17c616429c17c8dc9352543 | 16.549607 | 0 | 7 | true | false | false | true | 0.605267 | 0.517624 | 51.762435 | 0.406036 | 16.875389 | 0.030211 | 3.021148 | 0.268456 | 2.46085 | 0.383333 | 5.75 | 0.27485 | 19.427822 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-DPO |
princeton-nlp_Mistral-7B-Instruct-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-IPO | 32ad99c6e7231bbe8ebd9d24b28e084c60848558 | 17.707096 | 0 | 7 | true | false | false | true | 0.625748 | 0.49292 | 49.29199 | 0.432218 | 20.09411 | 0.019637 | 1.963746 | 0.27349 | 3.131991 | 0.432417 | 12.785417 | 0.270778 | 18.975325 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-IPO |
princeton-nlp_Mistral-7B-Instruct-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-KTO | 834422e5b9b9eee6aac2f8d4822b925a6574d628 | 16.664827 | 0 | 7 | true | false | false | true | 0.603378 | 0.490797 | 49.079664 | 0.413959 | 17.812648 | 0.024169 | 2.416918 | 0.27349 | 3.131991 | 0.395271 | 7.408854 | 0.28125 | 20.138889 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-KTO |
princeton-nlp_Mistral-7B-Instruct-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-ORPO | 69c0481f4100629a49ae73f760ddbb61d8e98e48 | 16.050529 | 0 | 7 | true | false | false | true | 0.624297 | 0.471962 | 47.196217 | 0.410406 | 18.038373 | 0.02719 | 2.719033 | 0.274329 | 3.243848 | 0.39124 | 6.638281 | 0.266207 | 18.46742 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-ORPO |
princeton-nlp_Mistral-7B-Instruct-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-RDPO | 23ec6ab4f996134eb15c19322dabb34d7332d7cd | 16.420491 | 0 | 7 | true | false | false | true | 0.610616 | 0.488723 | 48.872325 | 0.405015 | 17.048388 | 0.024169 | 2.416918 | 0.280201 | 4.026846 | 0.387333 | 6.416667 | 0.277676 | 19.7418 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RDPO |
princeton-nlp_Mistral-7B-Instruct-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-RRHF | 493d3ceb571232fe3b2f55c0bf78692760f4fc7e | 16.829083 | 0 | 7 | true | false | false | true | 0.587751 | 0.496017 | 49.601723 | 0.418977 | 19.206552 | 0.024169 | 2.416918 | 0.276007 | 3.467562 | 0.397875 | 7.934375 | 0.265126 | 18.34737 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RRHF |
princeton-nlp_Mistral-7B-Instruct-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-SLiC-HF | 3d08c8b7c3e73beb2a3264848f17246b74c3d162 | 16.376556 | 0 | 7 | true | false | false | true | 0.622453 | 0.511529 | 51.152941 | 0.404001 | 16.653429 | 0.016616 | 1.661631 | 0.272651 | 3.020134 | 0.391302 | 6.71276 | 0.271526 | 19.058437 | false | false | 2024-07-06 | 2024-10-16 | 0 | princeton-nlp/Mistral-7B-Instruct-SLiC-HF |
princeton-nlp_Mistral-7B-Instruct-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-SimPO | 03191ee1e60d21a698d11a515703a037073724f8 | 17.569551 | 1 | 7 | false | false | false | true | 0.570562 | 0.46869 | 46.868974 | 0.450723 | 22.382277 | 0.026435 | 2.643505 | 0.278523 | 3.803132 | 0.409781 | 9.75599 | 0.279671 | 19.963431 | false | false | 2024-05-24 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Instruct-SimPO |
princeton-nlp_Sheared-LLaMA-1.3B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Sheared-LLaMA-1.3B | a4b76938edbf571ea7d7d9904861cbdca08809b4 | 5.505397 | apache-2.0 | 91 | 1 | true | false | false | false | 0.3546 | 0.21977 | 21.977021 | 0.319705 | 4.74463 | 0.008308 | 0.830816 | 0.239933 | 0 | 0.371302 | 3.579427 | 0.117104 | 1.900488 | false | false | 2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-1.3B |
princeton-nlp_Sheared-LLaMA-2.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Sheared-LLaMA-2.7B | 2f157a0306b75d37694ae05f6a4067220254d540 | 6.324627 | apache-2.0 | 60 | 2 | true | false | false | false | 0.47005 | 0.241652 | 24.165215 | 0.325869 | 5.655521 | 0.006042 | 0.60423 | 0.275168 | 3.355705 | 0.356729 | 2.091146 | 0.118684 | 2.075946 | false | false | 2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-2.7B |
princeton-nlp_gemma-2-9b-it-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/gemma-2-9b-it-DPO | f646c99fc3aa7afc7b22c3c7115fd03a40fc1d22 | 19.434035 | 5 | 9 | false | false | false | true | 2.890627 | 0.276872 | 27.687203 | 0.594144 | 41.593654 | 0 | 0 | 0.33557 | 11.409396 | 0.382031 | 5.653906 | 0.37234 | 30.260047 | false | false | 2024-07-16 | 2024-09-19 | 2 | google/gemma-2-9b |
princeton-nlp_gemma-2-9b-it-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/gemma-2-9b-it-SimPO | 8c87091f412e3aa6f74f66bd86c57fb81cbc3fde | 21.161652 | mit | 129 | 9 | true | false | false | true | 2.769004 | 0.320686 | 32.068578 | 0.583918 | 40.09343 | 0 | 0 | 0.33557 | 11.409396 | 0.412323 | 10.340365 | 0.397523 | 33.058141 | false | false | 2024-07-16 | 2024-08-10 | 2 | google/gemma-2-9b |
pszemraj_Llama-3-6.3b-v0.1_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pszemraj/Llama-3-6.3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Llama-3-6.3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Llama-3-6.3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pszemraj/Llama-3-6.3b-v0.1 | 7000b39346162f95f19aa4ca3975242db61902d7 | 10.333954 | llama3 | 6 | 6 | true | false | false | false | 0.814463 | 0.10439 | 10.438969 | 0.419681 | 18.679996 | 0.018127 | 1.812689 | 0.283557 | 4.474273 | 0.390833 | 6.154167 | 0.283993 | 20.443632 | false | false | 2024-05-17 | 2024-06-26 | 1 | meta-llama/Meta-Llama-3-8B |
pszemraj_Mistral-v0.3-6B_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/pszemraj/Mistral-v0.3-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Mistral-v0.3-6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Mistral-v0.3-6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pszemraj/Mistral-v0.3-6B | ae11a699012b83996361f04808f4d45debf3b01c | 10.046851 | apache-2.0 | 1 | 5 | true | false | false | false | 0.530539 | 0.245374 | 24.53745 | 0.377405 | 13.515091 | 0.009063 | 0.906344 | 0.265101 | 2.013423 | 0.390771 | 6.613021 | 0.214262 | 12.695774 | false | false | 2024-05-25 | 2024-06-26 | 2 | pszemraj/Mistral-7B-v0.3-prune6 (Merge) |
qingy2019_LLaMa_3.2_3B_Catalysts_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/LLaMa_3.2_3B_Catalysts" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/LLaMa_3.2_3B_Catalysts</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__LLaMa_3.2_3B_Catalysts-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/LLaMa_3.2_3B_Catalysts | 3f4a318114beb37f32a2c143cbd68b6d15d18164 | 19.628816 | apache-2.0 | 1 | 3 | true | false | false | false | 0.649834 | 0.49924 | 49.923979 | 0.446813 | 21.345401 | 0.111027 | 11.102719 | 0.288591 | 5.145414 | 0.378771 | 7.946354 | 0.300781 | 22.309028 | false | false | 2024-10-19 | 2024-10-29 | 2 | meta-llama/Llama-3.2-3B-Instruct |
qingy2019_OpenMath2-Llama3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/OpenMath2-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/OpenMath2-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__OpenMath2-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/OpenMath2-Llama3.1-8B | 38412f988f7688d884c9249b2a4e5cc76f98c1c6 | 8.987818 | 0 | 8 | false | false | false | false | 0.692806 | 0.233059 | 23.305939 | 0.409552 | 16.29437 | 0.041541 | 4.154079 | 0.265101 | 2.013423 | 0.343552 | 2.010677 | 0.155336 | 6.148419 | false | false | 2024-11-23 | 0 | Removed |
qingy2019_Oracle-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Oracle-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Oracle-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Oracle-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Oracle-14B | 0154031aa9306aa98da156a0f3c8e10d9f1377f6 | 13.34025 | 0 | 13 | false | false | false | false | 1.393024 | 0.235832 | 23.583204 | 0.461158 | 23.18463 | 0.064199 | 6.41994 | 0.25755 | 1.006711 | 0.371667 | 10.491667 | 0.238198 | 15.355349 | false | false | 2024-11-23 | 0 | Removed |
qingy2019_Oracle-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Oracle-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Oracle-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Oracle-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Oracle-14B | 0154031aa9306aa98da156a0f3c8e10d9f1377f6 | 13.479724 | 0 | 13 | false | false | false | false | 1.368887 | 0.240079 | 24.007855 | 0.46223 | 23.301946 | 0.06571 | 6.570997 | 0.260906 | 1.454139 | 0.370333 | 10.225 | 0.237866 | 15.31841 | false | false | 2024-11-24 | 0 | Removed |
qingy2019_Qwen2.5-Math-14B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct | 025d9637208b862c7b10b7590969fe6870ce01a0 | 36.70629 | apache-2.0 | 0 | 14 | true | false | false | false | 1.932827 | 0.606626 | 60.662597 | 0.635007 | 47.017086 | 0.284743 | 28.47432 | 0.372483 | 16.331096 | 0.475729 | 19.632812 | 0.533078 | 48.119829 | false | false | 2024-12-01 | 2024-12-01 | 3 | Qwen/Qwen2.5-14B |
qingy2019_Qwen2.5-Math-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct | 025d9637208b862c7b10b7590969fe6870ce01a0 | 36.380504 | apache-2.0 | 0 | 14 | true | false | false | false | 1.971893 | 0.600531 | 60.053104 | 0.635649 | 47.065572 | 0.276435 | 27.643505 | 0.369128 | 15.883669 | 0.475667 | 19.425 | 0.53391 | 48.212175 | false | false | 2024-12-01 | 2024-12-01 | 3 | Qwen/Qwen2.5-14B |
qingy2019_Qwen2.5-Math-14B-Instruct-Alpha_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct-Alpha | e24aaa0779b576301bfb62b93789dea24ab10c88 | 35.456012 | apache-2.0 | 0 | 14 | true | false | false | false | 1.893142 | 0.598083 | 59.808309 | 0.637508 | 47.750108 | 0.231118 | 23.111782 | 0.369966 | 15.995526 | 0.464938 | 17.950521 | 0.533078 | 48.119829 | false | false | 2024-12-03 | 2024-12-03 | 2 | Qwen/Qwen2.5-14B |
qingy2019_Qwen2.5-Math-14B-Instruct-Pro_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Pro" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Pro</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Pro-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct-Pro | 295a9ce370c2bfeabe13f76d52c92f57ff6d0308 | 19.70776 | 0 | 14 | false | false | false | true | 1.659569 | 0.192168 | 19.216789 | 0.531869 | 33.036904 | 0.251511 | 25.151057 | 0.311242 | 8.165548 | 0.374031 | 4.253906 | 0.355801 | 28.422355 | false | false | 2024-12-03 | 2024-12-03 | 1 | qingy2019/Qwen2.5-Math-14B-Instruct-Pro (Merge) |
qingy2019_Qwen2.5-Ultimate-14B-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Ultimate-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Ultimate-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Ultimate-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Ultimate-14B-Instruct | 3eeba743112bed957ae6dc6a3f880355c8bedb66 | 29.289124 | 1 | 14 | false | false | false | true | 1.952089 | 0.393802 | 39.380178 | 0.584156 | 40.580601 | 0.280211 | 28.021148 | 0.356544 | 14.205817 | 0.4135 | 9.8875 | 0.492936 | 43.659501 | false | false | 2024-12-02 | 2024-12-02 | 1 | qingy2019/Qwen2.5-Ultimate-14B-Instruct (Merge) |
qingy2024_Fusion-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2024/Fusion-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2024/Fusion-14B-Instruct | 2e15219659b919e04ad5b56bef259489cc264f09 | 37.64425 | 1 | 14 | false | false | false | true | 1.623681 | 0.725977 | 72.597707 | 0.639593 | 48.579836 | 0.309668 | 30.966767 | 0.354866 | 13.982103 | 0.440042 | 14.805208 | 0.504405 | 44.93388 | false | false | 2024-12-05 | 2024-12-05 | 1 | qingy2024/Fusion-14B-Instruct (Merge) |
qingy2024_Fusion2-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2024/Fusion2-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion2-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion2-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2024/Fusion2-14B-Instruct | df00288ce3d37ef518189c19e7973e71b47ef214 | 35.182268 | 0 | 14 | false | false | false | true | 1.668666 | 0.606401 | 60.640102 | 0.611852 | 44.767044 | 0.308157 | 30.81571 | 0.344799 | 12.639821 | 0.463385 | 17.223177 | 0.50507 | 45.007757 | false | false | 2024-12-05 | 2024-12-06 | 1 | qingy2024/Fusion2-14B-Instruct (Merge) |
qingy2024_Qwen2.6-14B-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.6-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.6-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.6-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2024/Qwen2.6-14B-Instruct | c21acf3c074e9522c5d0559ccc4ed715c48b8eff | 35.624979 | 0 | 14 | false | false | false | false | 1.789286 | 0.581097 | 58.109704 | 0.639414 | 48.047948 | 0.267372 | 26.73716 | 0.379195 | 17.225951 | 0.456938 | 16.017188 | 0.528507 | 47.611924 | false | false | 2024-12-04 | 2024-12-04 | 1 | qingy2024/Qwen2.6-14B-Instruct (Merge) |
qingy2024_Qwen2.6-Math-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.6-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.6-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.6-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2024/Qwen2.6-Math-14B-Instruct | 45bb3f302922fbf185694bba2748a32ca3313a5e | 28.046404 | apache-2.0 | 0 | 14 | true | false | false | false | 1.556072 | 0.386232 | 38.623186 | 0.632444 | 47.022117 | 0 | 0 | 0.369966 | 15.995526 | 0.475854 | 19.515104 | 0.524102 | 47.122488 | false | false | 2024-12-04 | 2024-12-04 | 3 | Qwen/Qwen2.5-14B |
qq8933_OpenLongCoT-Base-Gemma2-2B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/qq8933/OpenLongCoT-Base-Gemma2-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qq8933/OpenLongCoT-Base-Gemma2-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qq8933__OpenLongCoT-Base-Gemma2-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qq8933/OpenLongCoT-Base-Gemma2-2B | 39e5bc941f107ac28142c802aecfd257cc47c1bb | 5.08291 | other | 8 | 3 | true | false | false | true | 1.658487 | 0.196514 | 19.651414 | 0.310636 | 3.546298 | 0 | 0 | 0.262584 | 1.677852 | 0.32225 | 2.114583 | 0.131566 | 3.507314 | false | false | 2024-10-28 | 2024-11-12 | 2 | google/gemma-2-2b |
rasyosef_Mistral-NeMo-Minitron-8B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/rasyosef/Mistral-NeMo-Minitron-8B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/Mistral-NeMo-Minitron-8B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__Mistral-NeMo-Minitron-8B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | rasyosef/Mistral-NeMo-Minitron-8B-Chat | cede47eac8a4e65aa27567d3f087c28185b537d9 | 17.230946 | other | 8 | 8 | true | false | false | true | 1.476398 | 0.445184 | 44.518433 | 0.475944 | 26.036695 | 0.008308 | 0.830816 | 0.276007 | 3.467562 | 0.430427 | 12.936719 | 0.240359 | 15.595449 | false | false | 2024-08-26 | 2024-08-26 | 1 | nvidia/Mistral-NeMo-Minitron-8B-Base |
rasyosef_Phi-1_5-Instruct-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | PhiForCausalLM | <a target="_blank" href="https://huggingface.co/rasyosef/Phi-1_5-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/Phi-1_5-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__Phi-1_5-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | rasyosef/Phi-1_5-Instruct-v0.1 | f4c405ee4bff5dc1a69383f3fe682342c9c87c77 | 6.638162 | mit | 1 | 1 | true | false | false | true | 0.295022 | 0.240228 | 24.022815 | 0.31179 | 4.820244 | 0 | 0 | 0.260067 | 1.342282 | 0.342156 | 3.402865 | 0.156167 | 6.240765 | false | false | 2024-07-24 | 2024-07-25 | 1 | microsoft/phi-1_5 |
rasyosef_phi-2-instruct-apo_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | PhiForCausalLM | <a target="_blank" href="https://huggingface.co/rasyosef/phi-2-instruct-apo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/phi-2-instruct-apo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__phi-2-instruct-apo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | rasyosef/phi-2-instruct-apo | 2d3722d6db77a8c844a50dd32ddc4278fdc89e1f | 12.043528 | mit | 0 | 2 | true | false | false | true | 0.495065 | 0.314592 | 31.459195 | 0.44451 | 21.672438 | 0 | 0 | 0.270134 | 2.684564 | 0.334219 | 3.610677 | 0.215509 | 12.834294 | false | false | 2024-09-15 | 2024-09-17 | 1 | microsoft/phi-2 |
rasyosef_phi-2-instruct-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | PhiForCausalLM | <a target="_blank" href="https://huggingface.co/rasyosef/phi-2-instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/phi-2-instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__phi-2-instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | rasyosef/phi-2-instruct-v0.1 | 29aeb3ccf7c79e0169a038fbd0deaf9772a9fefd | 14.218631 | mit | 2 | 2 | true | false | false | true | 0.492726 | 0.368148 | 36.814763 | 0.472612 | 26.358802 | 0 | 0 | 0.274329 | 3.243848 | 0.352354 | 5.044271 | 0.224651 | 13.850103 | false | false | 2024-08-09 | 2024-08-10 | 1 | microsoft/phi-2 |
realtreetune_rho-1b-sft-MATH_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/realtreetune/rho-1b-sft-MATH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">realtreetune/rho-1b-sft-MATH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/realtreetune__rho-1b-sft-MATH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | realtreetune/rho-1b-sft-MATH | b5f93df6af679a860caac9a9598e0f70c326b4fb | 5.355177 | 0 | 1 | false | false | false | false | 0.278134 | 0.212102 | 21.210167 | 0.314415 | 4.197623 | 0.021903 | 2.190332 | 0.252517 | 0.33557 | 0.345844 | 2.897135 | 0.111702 | 1.300236 | false | false | 2024-06-06 | 2024-10-05 | 1 | realtreetune/rho-1b-sft-MATH (Merge) |
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__Gemma-2-Ataraxy-Gemmasutra-9B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp | 9048af8616bc62b6efab2bc1bc77ba53c5dfed79 | 29.873992 | apache-2.0 | 3 | 10 | true | false | false | true | 2.114373 | 0.764895 | 76.489492 | 0.597439 | 42.25121 | 0.017372 | 1.73716 | 0.330537 | 10.738255 | 0.424479 | 12.393229 | 0.420711 | 35.634604 | false | false | 2024-09-11 | 2024-09-12 | 0 | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp |
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__Gemma-2-Ataraxy-Gemmasutra-9B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp | 5a4f7299d9f8ea5faad2b1edc68b7bf634dac40b | 23.205618 | apache-2.0 | 3 | 10 | true | false | false | false | 2.969828 | 0.285365 | 28.536505 | 0.598393 | 42.703798 | 0.058157 | 5.81571 | 0.329698 | 10.626398 | 0.460656 | 16.415365 | 0.416223 | 35.135934 | false | false | 2024-09-11 | 2024-09-27 | 0 | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp |
recoilme_recoilme-gemma-2-9B-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/recoilme-gemma-2-9B-v0.1 | 6dc0997046db4e9932f87d338ecdc2a4158abbda | 29.602746 | 0 | 10 | false | false | false | true | 1.924809 | 0.751506 | 75.1506 | 0.599531 | 42.321861 | 0.016616 | 1.661631 | 0.338926 | 11.856823 | 0.419146 | 11.526563 | 0.415891 | 35.098995 | false | false | 2024-09-18 | 0 | Removed |
recoilme_recoilme-gemma-2-9B-v0.2_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/recoilme-gemma-2-9B-v0.2 | 483116e575fb3a56de25243b14d715c58fe127bc | 30.048864 | cc-by-nc-4.0 | 1 | 10 | true | false | false | true | 1.914086 | 0.759175 | 75.917455 | 0.602596 | 43.027969 | 0.05287 | 5.287009 | 0.328859 | 10.514541 | 0.409875 | 10.401042 | 0.416307 | 35.145168 | false | false | 2024-09-18 | 2024-09-18 | 0 | recoilme/recoilme-gemma-2-9B-v0.2 |
recoilme_recoilme-gemma-2-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/recoilme-gemma-2-9B-v0.2 | 483116e575fb3a56de25243b14d715c58fe127bc | 23.674735 | cc-by-nc-4.0 | 1 | 10 | true | false | false | false | 2.946784 | 0.274699 | 27.469891 | 0.603083 | 43.560581 | 0.077795 | 7.779456 | 0.330537 | 10.738255 | 0.468594 | 17.807552 | 0.412234 | 34.692671 | false | false | 2024-09-18 | 2024-09-27 | 0 | recoilme/recoilme-gemma-2-9B-v0.2 |
recoilme_recoilme-gemma-2-9B-v0.3_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/recoilme-gemma-2-9B-v0.3 | 772cab46d9d22cbcc3c574d193021803ce5c444c | 30.207472 | cc-by-nc-4.0 | 3 | 10 | true | false | false | true | 1.876637 | 0.743937 | 74.39372 | 0.599253 | 42.026279 | 0.087613 | 8.761329 | 0.323826 | 9.8434 | 0.420385 | 12.08151 | 0.407247 | 34.138593 | false | false | 2024-09-18 | 2024-09-18 | 0 | recoilme/recoilme-gemma-2-9B-v0.3 |
recoilme_recoilme-gemma-2-9B-v0.3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/recoilme-gemma-2-9B-v0.3 | 76c8fb761660e6eb237c91bb6e6761ee36266bba | 30.111638 | cc-by-nc-4.0 | 3 | 10 | true | false | false | false | 2.55535 | 0.576076 | 57.607592 | 0.601983 | 43.326868 | 0.172961 | 17.296073 | 0.337248 | 11.63311 | 0.463229 | 17.036979 | 0.403923 | 33.769208 | false | false | 2024-09-18 | 2024-09-27 | 0 | recoilme/recoilme-gemma-2-9B-v0.3 |
recoilme_recoilme-gemma-2-9B-v0.4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | recoilme/recoilme-gemma-2-9B-v0.4 | 2691f2cc8d80072f15d78cb7ae72831e1a12139e | 24.100363 | cc-by-nc-4.0 | 2 | 10 | true | false | false | false | 2.91891 | 0.256189 | 25.618913 | 0.596729 | 42.442482 | 0.082326 | 8.232628 | 0.340604 | 12.080537 | 0.472688 | 18.385938 | 0.440575 | 37.841681 | false | false | 2024-09-18 | 2024-09-19 | 0 | recoilme/recoilme-gemma-2-9B-v0.4 |