---
license: apache-2.0
---

# Blue-Orchid-2x7b

GGUF: https://huggingface.co/nakodanei/Blue-Orchid-2x7b_GGUF

A roleplaying-focused Mixture-of-Experts (MoE) Mistral model.

One expert is a merge of mostly roleplaying (RP) models, and the other is a merge of mostly storywriting models, so the model should handle both tasks well. The base model is SanjiWatsuki/Kunoichi-DPO-v2-7B.

- Expert 1 is a merge of LimaRP, Limamono, Noromaid 0.4 DPO, and good-robot.
- Expert 2 is a merge of Erebus, Holodeck, Dans-AdventurousWinds-Mk2, Opus, Ashhwriter, and good-robot.

Prompt template (LimaRP):

    ### Instruction:
    {system prompt}

    ### Input:
    User: {prompt}

    ### Response:
    Character:

The Alpaca prompt template should work fine too.
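As a minimal sketch, the LimaRP template above can be assembled with plain string formatting before the prompt is passed to an inference backend. The function name, its parameters, and the default character name are illustrative assumptions, not part of this model's release:

```python
def build_limarp_prompt(system_prompt: str, user_message: str,
                        char_name: str = "Character") -> str:
    """Assemble a prompt in the LimaRP instruct format shown above.

    The model is expected to continue generating after the trailing
    "{char_name}:" line. Names here are illustrative, not official.
    """
    return (
        "### Instruction:\n"
        f"{system_prompt}\n"
        "\n"
        "### Input:\n"
        f"User: {user_message}\n"
        "\n"
        "### Response:\n"
        f"{char_name}:"
    )


# Example usage: the resulting string is what you would feed to the model.
prompt = build_limarp_prompt(
    system_prompt="You are a creative roleplay partner.",
    user_message="Hello there!",
)
print(prompt)
```

The string ends right after `Character:` so that the model's completion begins as that character's reply.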