---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
tags:
- merge
- mergekit
base_model:
- HuggingFaceH4/zephyr-7b-beta
---
# Merge Details
### Merge Method
This model was merged with the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, using [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) as the base model.
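Task arithmetic builds the merged weights as `base + Σᵢ wᵢ · (modelᵢ − base)`: each fine-tuned model contributes a scaled "task vector" relative to the base. A minimal NumPy sketch of the idea (illustrative only; the real merge is performed by mergekit over full transformer checkpoints):

```python
import numpy as np

def task_arithmetic_merge(base, models, weights):
    """Merge state dicts per tensor: base + sum_i w_i * (model_i - base)."""
    merged = {}
    for name, base_tensor in base.items():
        task_sum = np.zeros_like(base_tensor)
        for model, w in zip(models, weights):
            # Each fine-tuned model contributes a scaled task vector.
            task_sum += w * (model[name] - base_tensor)
        merged[name] = base_tensor + task_sum
    return merged

# Toy example with a single 2x2 "layer".
base = {"layer.weight": np.ones((2, 2))}
m1 = {"layer.weight": np.full((2, 2), 2.0)}  # stands in for the base-derived model
m2 = {"layer.weight": np.full((2, 2), 3.0)}  # stands in for the second model
merged = task_arithmetic_merge(base, [m1, m2], weights=[0.7, 0.5])
# Every entry: 1 + 0.7*(2-1) + 0.5*(3-1) = 2.7
```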
### Models Merged
The following models were included in the merge:
* [NousResearch/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: task_arithmetic
base_model:
  model: HuggingFaceH4/zephyr-7b-beta
slices:
- sources:
  - layer_range: [0, 32]
    model: HuggingFaceH4/zephyr-7b-beta
    parameters:
      weight: 0.7
  - layer_range: [0, 32]
    model: NousResearch/Hermes-2-Pro-Mistral-7B
    parameters:
      weight: 0.5
```
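To reproduce the merge, the configuration above can be saved to a file (named `config.yaml` here for illustration) and passed to mergekit's `mergekit-yaml` entry point, which writes the merged checkpoint to the given output directory:

```shell
pip install mergekit
mergekit-yaml config.yaml ./Zephyr-Hermes-7B
```

Note that this downloads both source checkpoints, so substantial disk space and RAM are required for 7B-parameter models.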