---
base_model:
- Open-Orca/Mistral-7B-OpenOrca
- mlabonne/NeuralBeagle14-7B
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details

### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
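Conceptually, a linear merge takes a weighted average of the two models' parameters, tensor by tensor. The sketch below illustrates the idea in plain PyTorch under the assumption that both models share an architecture and parameter names; the helper is illustrative only and is not mergekit's actual implementation.

```python
# Illustrative sketch of a linear (weighted-average) merge.
# Assumes both state dicts come from models with identical architectures.
import torch

def linear_merge(state_dict_a, state_dict_b, weight_a=0.7, weight_b=0.3):
    """Return a new state dict where each tensor is weight_a * a + weight_b * b."""
    merged = {}
    for name, tensor_a in state_dict_a.items():
        tensor_b = state_dict_b[name]
        # Weights here sum to 1.0, matching the configuration below;
        # the result is cast to float16 as in the config's dtype setting.
        merged[name] = (weight_a * tensor_a + weight_b * tensor_b).to(torch.float16)
    return merged
```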
### Models Merged

The following models were included in the merge:

* [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      weight: 0.7
  - model: Open-Orca/Mistral-7B-OpenOrca
    parameters:
      weight: 0.3
merge_method: linear
dtype: float16
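
# The slerp configuration below is commented out and was not used for this model;
# it is kept here for reference only.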
# slices:
#   - sources:
#       - model: Open-Orca/Mistral-7B-OpenOrca
#         layer_range: [0, 32]
#       - model: mlabonne/NeuralBeagle14-7B
#         layer_range: [0, 32]
# merge_method: slerp
# base_model: Open-Orca/Mistral-7B-OpenOrca
# parameters:
#   t:
#     - filter: self_attn
#       value: [0, 0.5, 0.3, 0.7, 1]
#     - filter: mlp
#       value: [1, 0.5, 0.7, 0.3, 0]
#     - value: 0.5
# dtype: bfloat16
```
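
To reproduce the merge, save the configuration above as, say, `config.yaml` and run mergekit's CLI (`mergekit-yaml config.yaml ./output-dir`). The snippet below is a minimal sketch of loading the resulting model with Transformers; the local path is a placeholder for wherever the merge was written.

```python
# Minimal sketch of loading and sampling from the merged model.
# "./output-dir" is a placeholder path, not a published repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./output-dir")
model = AutoModelForCausalLM.from_pretrained(
    "./output-dir",
    torch_dtype="auto",   # the config above merges in float16
    device_map="auto",    # requires the accelerate package
)

prompt = "What is a model merge?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```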