---
tags:
- moe
- merge
license: apache-2.0
---
# mhm-7-3
This is a merge of pre-trained language models created using mergekit.

A merged model based on Mistral, created with the `dare_ties` merge method using models from the top of the Open LLM Leaderboard. Seven models were mixed into one over three rounds of merging.

Just an experiment.
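For reference, here is a minimal sketch of what one round of such a `dare_ties` merge can look like as a mergekit configuration. The donor model names, densities, and weights below are hypothetical placeholders, not the actual recipe used for this model:

```yaml
# Hypothetical single-round dare_ties merge config for mergekit.
# Donor models and parameter values are illustrative placeholders.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model; no merge parameters needed
  - model: some-org/donor-model-a   # hypothetical donor
    parameters:
      density: 0.5   # fraction of delta weights kept (DARE drops the rest)
      weight: 0.5    # contribution of this model's deltas to the merge
  - model: some-org/donor-model-b   # hypothetical donor
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

A config like this is run with `mergekit-yaml config.yml ./output-model`. Merging seven models over three rounds would mean chaining such configs: merge a few models into an intermediate checkpoint, then feed that intermediate in as an input model to the next round.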