---
tags:
  - moe
  - merge
license: apache-2.0
---


# mhm-7-3

This is a merge of pre-trained language models created using mergekit.

A merged model based on Mistral, created with the `dare_ties` merge method using models from the top of the Open LLM Leaderboard.

Seven models were mixed into one, across three rounds of merging.
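For reference, below is a minimal sketch of what a mergekit `dare_ties` configuration looks like. The constituent model names, densities, and weights are placeholders; the actual recipe used for this merge is not published in this card.

```yaml
# Hypothetical dare_ties config; the models and parameters here are
# placeholders, not the actual recipe used for mhm-7-3.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model: no parameters needed
  - model: example-org/model-a-7b    # placeholder constituent model
    parameters:
      density: 0.5                   # fraction of delta weights kept by DARE
      weight: 0.5                    # contribution to the merged weights
  - model: example-org/model-b-7b    # placeholder constituent model
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

A config like this is applied with `mergekit-yaml config.yml ./output-model`; feeding intermediate outputs back in as inputs is one way to get the multi-round merging described above.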

Just an experiment.