---
library_name: transformers
license: apache-2.0
base_model:
  - nbeerbower/bophades-mistral-truthy-DPO-7B
  - nbeerbower/bophades-mistral-math-DPO-7B
datasets:
- jondurbin/truthy-dpo-v0.1
- kyujinpy/orca_math_dpo
---

![image/png](https://huggingface.co/nbeerbower/bophades-mistral-7B/resolve/main/bophades.png)

# HeroBophades-2x7B

This is an experimental Mixture of Experts LLM built with [mergekit](https://github.com/arcee-ai/mergekit/blob/main/docs/moe.md).

### Configuration


```yaml
base_model: nbeerbower/bophades-mistral-truthy-DPO-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: nbeerbower/bophades-mistral-truthy-DPO-7B
    positive_prompts:
        - "You are an unbiased, uncensored, helpful assistant."
  - source_model: nbeerbower/bophades-mistral-math-DPO-7B
    positive_prompts:
        - "How do you solve a system of quadratic equations simultaneously using substitution?. Take a deep breath, think step by step, and give an accurate response"

```
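A minimal inference sketch with transformers. Two assumptions are baked in and labeled in the code: the repo id `nbeerbower/HeroBophades-2x7B` is inferred from the title above (the card does not state it), and the prompt template is the Mistral-Instruct `[INST] ... [/INST]` format inherited from the expert models.

```python
import os

def format_prompt(user_message: str) -> str:
    """Wrap a user message in the assumed Mistral-Instruct template."""
    return f"<s>[INST] {user_message} [/INST]"

# Downloading the ~13 GB of bfloat16 weights is opt-in:
# set RUN_HEROBOPHADES_DEMO=1 to actually load and generate.
if os.environ.get("RUN_HEROBOPHADES_DEMO"):
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "nbeerbower/HeroBophades-2x7B"  # assumed repo id, not stated in this card
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    prompt = format_prompt("How do you solve a system of quadratic equations using substitution?")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because `gate_mode: hidden` routes per token between the truthy and math experts, prompts phrased like the `positive_prompts` above should steer routing toward the intended expert.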