---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- mlabonne/Monarch-7B
- paulml/OGNO-7B
- AbacusResearch/haLLAwa3
---
# haLLawa4-7b
haLLawa4-7b is a DARE-TIES merge of the following models using [mergekit](https://github.com/cg123/mergekit), with [eren23/ogno-monarch-jaskier-merge-7b](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b) as the base model:
* [mlabonne/Monarch-7B](https://huggingface.co/mlabonne/Monarch-7B)
* [paulml/OGNO-7B](https://huggingface.co/paulml/OGNO-7B)
* [AbacusResearch/haLLAwa3](https://huggingface.co/AbacusResearch/haLLAwa3)
## 🧩 Configuration
```yaml
models:
  - model: eren23/ogno-monarch-jaskier-merge-7b
    # No parameters necessary for base model
  - model: mlabonne/Monarch-7B
    # Emphasize the beginning of Vicuna-format models
    parameters:
      weight: 0.5
      density: 0.59
  - model: paulml/OGNO-7B
    parameters:
      weight: 0.2
      density: 0.55
    # Vicuna format
  - model: AbacusResearch/haLLAwa3
    parameters:
      weight: 0.3
      density: 0.55
merge_method: dare_ties
base_model: eren23/ogno-monarch-jaskier-merge-7b
parameters:
  int8_mask: true
dtype: bfloat16
random_seed: 0
```
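Note that the three non-base models' `weight` values (0.5 + 0.2 + 0.3) sum to 1.0. A minimal sketch of an optional sanity check before handing the file to mergekit's `mergekit-yaml` command, assuming PyYAML is available (the inline `CONFIG` string and variable names here are illustrative, not part of the card):

```python
import yaml  # third-party: pip install pyyaml

# The merge configuration from this card, inlined as a string for
# illustration; in practice it would live in config.yaml.
CONFIG = """
models:
  - model: eren23/ogno-monarch-jaskier-merge-7b
  - model: mlabonne/Monarch-7B
    parameters: {weight: 0.5, density: 0.59}
  - model: paulml/OGNO-7B
    parameters: {weight: 0.2, density: 0.55}
  - model: AbacusResearch/haLLAwa3
    parameters: {weight: 0.3, density: 0.55}
merge_method: dare_ties
base_model: eren23/ogno-monarch-jaskier-merge-7b
dtype: bfloat16
"""

config = yaml.safe_load(CONFIG)

# Collect the DARE-TIES weights of the non-base models.
weights = [m["parameters"]["weight"]
           for m in config["models"] if "parameters" in m]

# The weights should sum to ~1.0 so the merged deltas stay on scale.
assert abs(sum(weights) - 1.0) < 1e-9, "model weights should sum to ~1"
print("merge_method:", config["merge_method"], "weights:", weights)
```

With the config validated, the merge itself is produced by mergekit's CLI, e.g. `mergekit-yaml config.yaml ./haLLawa4-7b`.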