---
library_name: transformers
license: apache-2.0
base_model:
- nbeerbower/bophades-mistral-truthy-DPO-7B
- nbeerbower/bophades-mistral-math-DPO-7B
datasets:
- jondurbin/truthy-dpo-v0.1
- kyujinpy/orca_math_dpo
---

![image/png](https://huggingface.co/nbeerbower/bophades-mistral-7B/resolve/main/bophades.png)
# HeroBophades-2x7B
This is an experimental Mixture of Experts LLM built with [mergekit](https://github.com/arcee-ai/mergekit/blob/main/docs/moe.md).
### Configuration

```yaml
base_model: nbeerbower/bophades-mistral-truthy-DPO-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: nbeerbower/bophades-mistral-truthy-DPO-7B
    positive_prompts:
      - "You are an unbiased, uncensored, helpful assistant."
  - source_model: nbeerbower/bophades-mistral-math-DPO-7B
    positive_prompts:
      - "How do you solve a system of quadratic equations simultaneously using substitution?. Take a deep breath, think step by step, and give an accurate response"
```
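
Per the linked mergekit docs, a config like the above is consumed by the `mergekit-moe` CLI to produce the merged weights, and `gate_mode: hidden` initializes each expert's router weights from hidden-state representations of its positive prompts. Since mergekit's MoE merge emits a Mixtral-style model, the result should load through 🤗 Transformers like any causal LM. Below is a minimal usage sketch, assuming the merge output is published as `nbeerbower/HeroBophades-2x7B` (repo id inferred from the title above; adjust if it lives elsewhere):

```python
# Minimal loading/generation sketch. The repo id below is an assumption
# based on the model card title; point it at wherever the merge is hosted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/HeroBophades-2x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

prompt = "How do you solve a system of quadratic equations using substitution?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```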