---
base_model:
- abacusai/Smaugv0.1
- NousResearch/Nous-Hermes-2-Yi-34B
- jondurbin/bagel-34b-v0.2
- 01-ai/Yi-34B-200K
tags:
- mergekit
- merge
---
# queen

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method
This model was merged with the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) method (`dare_ties`), using [01-ai/Yi-34B-200K](https://huggingface.co/01-ai/Yi-34B-200K) as the base model.
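DARE-TIES first sparsifies each fine-tune's "task vector" (its parameter delta from the base) by randomly dropping entries and rescaling the survivors, then resolves sign conflicts between the remaining deltas TIES-style before adding the result back onto the base. mergekit's implementation is considerably more involved; the toy single-tensor sketch below, with all names and simplifications my own, is only meant to illustrate how the `density` and `weight` values in the configuration further down enter the merge.

```python
import torch

def dare_ties_merge(base, tuned, weights, density=0.53, seed=0):
    """Toy single-tensor DARE-TIES merge (illustrative, not mergekit's code)."""
    gen = torch.Generator().manual_seed(seed)
    deltas = []
    for ft, w in zip(tuned, weights):
        delta = ft - base                          # task vector for this fine-tune
        keep = torch.rand(delta.shape, generator=gen) < density
        # DARE: drop entries with probability (1 - density), rescale survivors
        delta = torch.where(keep, delta / density, torch.zeros_like(delta))
        deltas.append(w * delta)                   # apply the YAML `weight`
    stacked = torch.stack(deltas)
    # TIES: elect a per-parameter sign from the summed deltas, then
    # keep only the contributions that agree with the elected sign.
    elected = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == elected
    merged_delta = torch.where(agree, stacked, torch.zeros_like(stacked)).sum(dim=0)
    return base + merged_delta


# Example with random stand-in tensors for the base and three fine-tunes:
base = torch.randn(4, 4)
tuned = [base + 0.1 * torch.randn(4, 4) for _ in range(3)]
merged = dare_ties_merge(base, tuned, weights=[0.3, 0.3, 0.4])
```

In these terms, `density: 0.53` means roughly 53% of each delta's entries survive the DARE drop, and the `weight` values (0.3/0.3/0.4) scale each model's contribution before the sign election.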
### Models Merged

The following models were included in the merge:

* [abacusai/Smaugv0.1](https://huggingface.co/abacusai/Smaugv0.1)
* [NousResearch/Nous-Hermes-2-Yi-34B](https://huggingface.co/NousResearch/Nous-Hermes-2-Yi-34B)
* [jondurbin/bagel-34b-v0.2](https://huggingface.co/jondurbin/bagel-34b-v0.2)

### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: 01-ai/Yi-34B-200K
    # No parameters necessary for base model
  - model: abacusai/Smaugv0.1
    parameters:
      density: 0.53
      weight: 0.3
  - model: jondurbin/bagel-34b-v0.2
    parameters:
      density: 0.53
      weight: 0.3
  - model: NousResearch/Nous-Hermes-2-Yi-34B
    parameters:
      density: 0.53
      weight: 0.4
merge_method: dare_ties
base_model: 01-ai/Yi-34B-200K
parameters:
  int8_mask: true
dtype: bfloat16
```
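To reproduce the merge, the configuration above can be saved to a file and passed to mergekit. Below is a sketch using mergekit's documented Python entry point; exact signatures and options may differ between mergekit versions, and both file paths are placeholders.

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "./config.yml"   # the YAML configuration above (placeholder path)
OUTPUT_PATH = "./queen"       # where to write the merged model (placeholder path)

# Parse the merge configuration from the YAML file
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the result to OUTPUT_PATH
run_merge(
    merge_config,
    OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
    ),
)
```

The command-line equivalent should be `mergekit-yaml ./config.yml ./queen`.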