---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
tags:
- merge
- mergekit
base_model:
- HuggingFaceH4/zephyr-7b-beta
---
# Merge Details
### Merge Method
This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) as the base model.
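Task arithmetic merges models by adding weighted parameter *deltas* (each model's weights minus the base model's weights) back onto the base. A minimal sketch of that idea with toy scalar "weights" in plain Python (the parameter names and values are illustrative, not taken from the actual models):

```python
def task_arithmetic(base, tasks):
    """Merge parameters as base + sum_i(w_i * (model_i - base)).

    base  -- dict mapping parameter name -> value
    tasks -- list of (model_dict, weight) pairs
    """
    merged = {}
    for name, b in base.items():
        delta = sum(w * (m[name] - b) for m, w in tasks)
        merged[name] = b + delta
    return merged

# Toy one-parameter example; real models have millions of tensors per layer.
base = {"w0": 1.0}
model_a = {"w0": 1.4}  # delta of +0.4 relative to the base
model_b = {"w0": 0.8}  # delta of -0.2 relative to the base

merged = task_arithmetic(base, [(model_a, 0.7), (model_b, 0.5)])
# 1.0 + 0.7 * 0.4 + 0.5 * (-0.2) = 1.18
```

Note that in this card's configuration the base model itself appears as a source with weight 0.7: since its delta against itself is zero, the merge is effectively the base plus 0.5 times the Hermes-2-Pro delta.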
### Models Merged
The following models were included in the merge:
* [NousResearch/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: task_arithmetic
base_model:
  model: HuggingFaceH4/zephyr-7b-beta
slices:
- sources:
  - layer_range: [0, 32]
    model: HuggingFaceH4/zephyr-7b-beta
    parameters:
      weight: 0.7
  - layer_range: [0, 32]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
    parameters:
      weight: 0.5
```
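A merge like this can be reproduced with the mergekit command-line tool. A sketch, assuming mergekit is installed from PyPI and the YAML above is saved as `config.yml` (both paths are illustrative):

```shell
# Install mergekit (assumes the current PyPI package name)
pip install mergekit

# Run the merge described by config.yml; ./merged-model is an
# illustrative output directory for the resulting weights
mergekit-yaml config.yml ./merged-model
```

The resulting directory can then be loaded with `transformers` like any other checkpoint, or uploaded to the Hugging Face Hub.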