---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- BeaverAI/mistral-doryV2-12b
- intervitens/mini-magnum-12b-v1.1
- grimjim/mistralai-Mistral-Nemo-Base-2407
- not-for-all-audiences
---

# NemoMix-12B-DellaV1b

NemoMix-12B-DellaV1b is an *experimental* merge of the following models, built with the [della_linear](https://arxiv.org/abs/2406.11617) method via [mergekit](https://github.com/cg123/mergekit):

* [BeaverAI/mistral-doryV2-12b](https://huggingface.co/BeaverAI/mistral-doryV2-12b)
* [intervitens/mini-magnum-12b-v1.1](https://huggingface.co/intervitens/mini-magnum-12b-v1.1)
* [grimjim/mistralai-Mistral-Nemo-Base-2407](https://huggingface.co/grimjim/mistralai-Mistral-Nemo-Base-2407)

This merge works well, but is very horny. Extremely NSFW.

## 🧩 Configuration

```yaml
models:
  - model: BeaverAI/mistral-doryV2-12b
    parameters:
      weight: 0.30
      density: 0.42
  - model: intervitens/mini-magnum-12b-v1.1
    parameters:
      weight: 0.35
      density: 0.66
  - model: grimjim/mistralai-Mistral-Nemo-Base-2407
    parameters:
      weight: 0.35
      density: 0.78
merge_method: della_linear
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
parameters:
  int8_mask: true
  normalize: true
  epsilon: 0.1
  lambda: 1.0
  density: 0.7
dtype: bfloat16
```
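## Reproducing the merge

Assuming mergekit is installed, a configuration like the one above can be run with the `mergekit-yaml` entrypoint (the output directory name here is only illustrative):

```shell
pip install mergekit
mergekit-yaml config.yaml ./NemoMix-12B-DellaV1b --cuda
```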
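Conceptually, `della_linear` prunes each fine-tune's delta from the base model (the `density` parameter controls how much of each delta survives) and then takes a weighted linear combination of the pruned deltas, with `normalize: true` rescaling the weights to sum to 1. A minimal NumPy sketch of that idea — a simplified hard top-k cut by magnitude, not mergekit's actual implementation, which assigns magnitude-proportional drop probabilities:

```python
import numpy as np

def della_linear_sketch(base, models, weights, densities, normalize=True):
    """Simplified sketch of density-pruned linear delta merging."""
    if normalize:
        total = sum(weights)
        weights = [w / total for w in weights]
    merged = base.astype(np.float64).copy()
    for model, w, d in zip(models, weights, densities):
        delta = model - base
        # keep only the top `density` fraction of delta entries by magnitude
        k = int(np.ceil(d * delta.size))
        thresh = np.sort(np.abs(delta).ravel())[-k] if k > 0 else np.inf
        pruned = np.where(np.abs(delta) >= thresh, delta, 0.0)
        merged += w * pruned
    return merged

# toy example: one "fine-tune", density 0.5 keeps the two largest delta entries
base = np.zeros(4)
tuned = np.array([1.0, 2.0, 3.0, 4.0])
print(della_linear_sketch(base, [tuned], [1.0], [0.5]))  # → [0. 0. 3. 4.]
```

In the real config, each model gets its own `density` (0.42, 0.66, 0.78), so sparser pruning is applied to dory's delta than to the base-model self-delta.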