---
tags:
- moe
- merge
license: apache-2.0
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ey84O7VrsOnsE7Ra8prgH.jpeg)
# mhm-7-3
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
Merged model based on Mistral, created with the `dare_ties` merge method using models from the top of the Open LLM Leaderboard. Seven models were mixed into one across three rounds of merging.

Just an experiment.
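The card does not list the exact source models or merge parameters, so as a rough illustration, a `dare_ties` merge in mergekit is driven by a YAML config along these lines (all model names and parameter values below are placeholders, not the ones actually used here):

```yaml
# Hypothetical mergekit config sketch for a dare_ties merge.
# Model names and densities/weights are placeholders.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model; contributes the reference weights
  - model: some-org/model-a          # placeholder source model
    parameters:
      density: 0.5                   # fraction of delta weights kept (DARE dropout)
      weight: 0.5                    # merge weight for this model's deltas
  - model: some-org/model-b          # placeholder source model
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

A config like this is typically run with `mergekit-yaml config.yml ./output-model`; repeating the process with the merged output as a new input would give the multi-round merging described above.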