---
license: apache-2.0
---
A Mixtral-style Mixture-of-Experts (MoE) built with mergekit, using the following config:

```yaml
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden # one of "hidden", "cheap_embed", or "random"
dtype: bfloat16 # output dtype (float32, float16, or bfloat16)
experts:
  - source_model: SanjiWatsuki/Silicon-Maid-7B
    positive_prompts:
      - "roleplay"
      - "storytelling"
      - "fantasy"
      - "dreaming"
  - source_model: teknium/OpenHermes-2.5-Mistral-7B
    positive_prompts:
      - "chat"
      - "flow chart"
      - "diagrams"
      - "reasoning"
      - "explanation"
  - source_model: Nondzu/Mistral-7B-Instruct-v0.2-code-ft
    positive_prompts:
      - "programming"
      - "code debugging"
      - "data transformation"
      - "data structures"
    negative_prompts:
      - "chat"
  - source_model: meta-math/MetaMath-Mistral-7B
    positive_prompts:
      - "math"
      - "arithmetic"
      - "algebra"
```
Prompt format: ChatML.
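As a quick illustration of the ChatML format, the sketch below assembles a single-turn prompt; the role names and `<|im_start|>`/`<|im_end|>` markers follow the standard ChatML convention, while the specific system and user messages are just illustrative.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt.

    Each turn is wrapped in <|im_start|>{role} ... <|im_end|> markers,
    and the prompt ends with an open assistant turn for generation.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("You are a helpful assistant.", "What is 2 + 2?")
print(prompt)
```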