---
tags:
- moe
- merge
license: apache-2.0
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ey84O7VrsOnsE7Ra8prgH.jpeg)

# mhm-7-3

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
A Mistral-based merged model, built with the `dare_ties` merge method from models at the top of the Open LLM Leaderboard.

Seven models were mixed into one, across three rounds of merging.

Just an experiment.
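
The exact merge recipe isn't published here, but a mergekit `dare_ties` configuration generally looks like the sketch below. The model names, densities, and weights are placeholders for illustration, not the ones actually used for this merge.

```yaml
# Illustrative mergekit config for a dare_ties merge (not the actual recipe).
# Model names and parameter values below are hypothetical.
models:
  - model: mistralai/Mistral-7B-v0.1    # assumed Mistral base, used as the reference
  - model: example-org/model-a          # hypothetical leaderboard model
    parameters:
      density: 0.5   # fraction of delta weights kept after DARE pruning
      weight: 0.5    # this model's contribution to the merged deltas
  - model: example-org/model-b          # hypothetical leaderboard model
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

A config like this is run with `mergekit-yaml config.yml ./merged-model`; feeding the resulting model back in as an input to a later config is one way to carry out the three rounds of merging described above.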