---
base_model:
- iRyanBell/ARC1
- HaitameLaf/Llama-3-8B-StoryGenerator
- MrRobotoAI/llama3-8B-Special-Dark-v2.0
- iRyanBell/ARC1-II
- lemon07r/Llama-3-RedMagic2-8B
- grimjim/Llama-3-Perky-Pat-Instruct-8B
- HPAI-BSC/Llama3-Aloe-8B-Alpha
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [MrRobotoAI/llama3-8B-Special-Dark-v2.0](https://huggingface.co/MrRobotoAI/llama3-8B-Special-Dark-v2.0) as the base model. TIES resolves sign disagreements between the task vectors of the merged models, while DARE randomly drops a fraction of each task vector and rescales what remains, reducing interference between the contributing models.

### Models Merged

The following models were included in the merge:

* [iRyanBell/ARC1](https://huggingface.co/iRyanBell/ARC1)
* [HaitameLaf/Llama-3-8B-StoryGenerator](https://huggingface.co/HaitameLaf/Llama-3-8B-StoryGenerator)
* [iRyanBell/ARC1-II](https://huggingface.co/iRyanBell/ARC1-II)
* [lemon07r/Llama-3-RedMagic2-8B](https://huggingface.co/lemon07r/Llama-3-RedMagic2-8B)
* [grimjim/Llama-3-Perky-Pat-Instruct-8B](https://huggingface.co/grimjim/Llama-3-Perky-Pat-Instruct-8B)
* [HPAI-BSC/Llama3-Aloe-8B-Alpha](https://huggingface.co/HPAI-BSC/Llama3-Aloe-8B-Alpha)

### Configuration

The following YAML configuration was used to produce this model. Each of the seven models contributes an equal weight of 0.1429 (≈ 1/7), and `density: 0.9` retains 90% of each model's delta parameters during DARE pruning:

```yaml
models:
  - model: lemon07r/Llama-3-RedMagic2-8B
    parameters:
      weight: 0.1429
      density: 0.9
  - model: grimjim/Llama-3-Perky-Pat-Instruct-8B
    parameters:
      weight: 0.1429
      density: 0.9
  - model: HaitameLaf/Llama-3-8B-StoryGenerator
    parameters:
      weight: 0.1429
      density: 0.9
  - model: HPAI-BSC/Llama3-Aloe-8B-Alpha
    parameters:
      weight: 0.1429
      density: 0.9
  - model: iRyanBell/ARC1
    parameters:
      weight: 0.1429
      density: 0.9
  - model: iRyanBell/ARC1-II
    parameters:
      weight: 0.1429
      density: 0.9
  - model: MrRobotoAI/llama3-8B-Special-Dark-v2.0
    parameters:
      weight: 0.1429
      density: 0.9
merge_method: dare_ties
base_model: MrRobotoAI/llama3-8B-Special-Dark-v2.0
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
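
## Usage

To reproduce the merge, save the configuration above to a file (e.g. `config.yaml`) and run the mergekit CLI: `mergekit-yaml config.yaml ./output-model-directory`.

Below is a minimal sketch of loading the merged model for text generation with Transformers. The repository id is a placeholder, since this card does not name the published repo, and `device_map="auto"` assumes the `accelerate` package is installed:

```python
# Minimal usage sketch; "your-username/merged-llama3-8b" is a placeholder
# repository id, not the actual location of this model's weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/merged-llama3-8b"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's `dtype: float16`
    device_map="auto",
)

prompt = "Write the opening paragraph of a short story about a lighthouse keeper."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the merge includes Llama 3 instruct-tuned components, applying the chat template via `tokenizer.apply_chat_template` may give better results than raw prompting.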