---
base_model:
- bamec66557/VICIOUS_MESH-12B-ALPHA
- Infermatic/MN-12B-Inferor-v0.1
- Khetterman/DarkAtom-12B-v3
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
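mergekit can be driven from its `mergekit-yaml` CLI or from Python. Below is a minimal sketch assuming mergekit's documented Python API; `config.yaml` is assumed to hold the YAML shown under Configuration below, and `./merged-model` is a placeholder output path.

```python
# Minimal sketch: run this card's merge config through mergekit's Python API.
# Assumes `config.yaml` contains the YAML from the Configuration section below;
# the output path is a placeholder.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged-model",  # where to write the merged weights
    options=MergeOptions(
        cuda=False,           # set True to merge on GPU
        copy_tokenizer=True,  # copy the base model's tokenizer into the output
    ),
)
```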
## Merge Details

### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [Khetterman/DarkAtom-12B-v3](https://huggingface.co/Khetterman/DarkAtom-12B-v3) as the base.
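For intuition: TIES trims each fine-tuned model's task vector (its delta from the base) to the largest-magnitude entries, elects a per-parameter sign by total magnitude, then averages only the updates that agree with that sign. The toy NumPy sketch below is illustrative only, not mergekit's implementation; it uses a single `density` and `weight` for all models, mirroring the 0.5/0.5 values in the configuration below.

```python
# Toy illustration of the TIES steps (trim, elect sign, disjoint merge)
# on plain NumPy arrays. Illustrative only; not mergekit's implementation.
import numpy as np

def ties_merge(base, finetuned, density, weight):
    """Merge fine-tuned tensors into `base`, one parameter tensor at a time."""
    deltas = []
    for ft in finetuned:
        tau = ft - base                             # task vector
        k = max(1, int(round(density * tau.size)))  # entries to keep
        cutoff = np.sort(np.abs(tau), axis=None)[-k]
        trimmed = np.where(np.abs(tau) >= cutoff, tau, 0.0)  # trim small updates
        deltas.append(weight * trimmed)
    deltas = np.stack(deltas)
    sign = np.sign(deltas.sum(axis=0))              # elect sign per parameter
    agree = (np.sign(deltas) == sign) & (deltas != 0)
    count = np.maximum(agree.sum(axis=0), 1)        # avoid division by zero
    merged = np.where(agree, deltas, 0.0).sum(axis=0) / count
    return base + merged                            # disjoint mean on top of base

rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))
tuned = [base + rng.normal(scale=0.1, size=(4, 4)) for _ in range(2)]
print(ties_merge(base, tuned, density=0.5, weight=0.5))
```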
### Models Merged

The following models were included in the merge:

* [bamec66557/VICIOUS_MESH-12B-ALPHA](https://huggingface.co/bamec66557/VICIOUS_MESH-12B-ALPHA)
* [Infermatic/MN-12B-Inferor-v0.1](https://huggingface.co/Infermatic/MN-12B-Inferor-v0.1)

### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Khetterman/DarkAtom-12B-v3
    # no parameters necessary for base model
  - model: Infermatic/MN-12B-Inferor-v0.1
    parameters:
      density: 0.5
      weight: 0.5
  - model: bamec66557/VICIOUS_MESH-12B-ALPHA
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Khetterman/DarkAtom-12B-v3
parameters:
  normalize: false
  int8_mask: true
dtype: bfloat16
```
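Once merged (or downloaded), the model loads like any other transformers causal LM. The `./merged-model` path below is a placeholder for wherever this merge is stored or published.

```python
# Load the merged model for inference with transformers.
# "./merged-model" is a placeholder; substitute the local path or repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./merged-model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello,", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```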