---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- dare
- super mario merge
- pytorch
- mixtral
- merge
---
# mixtral megamerge 8x7b v2

The following models were merged with DARE using https://github.com/martyn/safetensors-merge-supermario
## Mergelist

- mistralai/Mixtral-8x7B-v0.1
- mistralai/Mixtral-8x7B-Instruct-v0.1
- cognitivecomputations/dolphin-2.6-mixtral-8x7b
- Brillibitg/Instruct_Mixtral-8x7B-v0.1_Dolly15K
- orangetin/OpenHermes-Mixtral-8x7B
- NeverSleep/Noromaid-v0.1-mixtral-8x7b-v3
## Merge command

```sh
python3 hf_merge.py to_merge_mixtral2.txt mixtral-2 -p 0.15 -lambda 1.95
```
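In DARE, `-p` is the probability of dropping each delta parameter and `-lambda` scales the surviving deltas. The following is a minimal sketch of that drop-and-rescale step on flat weight lists; `dare_merge` is a hypothetical helper for illustration, not the actual `hf_merge.py` implementation:

```python
import random

def dare_merge(base, finetuned, p=0.15, lam=1.95, seed=0):
    """Sketch of DARE: drop each delta entry with probability p,
    rescale survivors by 1/(1-p) to preserve the expected delta,
    scale by lambda, and add back onto the base weights."""
    rng = random.Random(seed)
    merged = []
    for b, f in zip(base, finetuned):
        delta = f - b
        if rng.random() < p:
            delta = 0.0              # dropped entry
        else:
            delta /= (1.0 - p)       # rescale to keep E[delta] unchanged
        merged.append(b + lam * delta)
    return merged
```

With `p=0` and `lam=1` this reduces to returning the fine-tuned weights unchanged, which is a quick sanity check on the rescaling logic.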
## Notes

- MoE gate tensors were filtered for compatibility, then averaged as `(tensor1 + tensor2) / 2`
- The merged model seems to generalize across prompting formats and sampling settings
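The gate-handling note above can be sketched as follows. `merge_gates` is a hypothetical helper (not the actual merge script's code) that keeps only gate tensors present in both checkpoints with matching sizes, then averages them element-wise:

```python
def merge_gates(gates1, gates2):
    """Sketch: filter MoE gate tensors for compatibility (same name,
    same size in both checkpoints), then average element-wise as
    (tensor1 + tensor2) / 2. Tensors are modeled as flat lists here."""
    merged = {}
    for name, t1 in gates1.items():
        t2 = gates2.get(name)
        if t2 is not None and len(t1) == len(t2):   # compatibility filter
            merged[name] = [(a + b) / 2 for a, b in zip(t1, t2)]
    return merged
```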