---
base_model:
- AXCXEPT/EZO-Humanities-9B-gemma-2-it
- IlyaGusev/gemma-2-9b-it-abliterated
- AXCXEPT/EZO-Common-9B-gemma-2-it
- lemon07r/Gemma-2-Ataraxy-v4d-9B
library_name: transformers
tags:
- mergekit
- merge
---
![image/png](https://huggingface.co/yamatazen/Irida-SCE-9B/resolve/main/Irida-SCE-9B.png?download=true)

# Irida-SCE-9B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [IlyaGusev/gemma-2-9b-it-abliterated](https://huggingface.co/IlyaGusev/gemma-2-9b-it-abliterated) as the base model.

### Models Merged

The following models were included in the merge:

* [AXCXEPT/EZO-Humanities-9B-gemma-2-it](https://huggingface.co/AXCXEPT/EZO-Humanities-9B-gemma-2-it)
* [AXCXEPT/EZO-Common-9B-gemma-2-it](https://huggingface.co/AXCXEPT/EZO-Common-9B-gemma-2-it)
* [lemon07r/Gemma-2-Ataraxy-v4d-9B](https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4d-9B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: IlyaGusev/gemma-2-9b-it-abliterated
models:
  - model: lemon07r/Gemma-2-Ataraxy-v4d-9B
  - model: AXCXEPT/EZO-Common-9B-gemma-2-it
  - model: AXCXEPT/EZO-Humanities-9B-gemma-2-it
merge_method: sce
dtype: bfloat16
parameters:
  normalize: true
  select_topk: 0.5
```

To reproduce the merge, this configuration can be saved to a file and passed to mergekit's CLI (for example, `mergekit-yaml config.yaml ./Irida-SCE-9B`).
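
## Usage

A minimal usage sketch for loading the merged model with 🤗 Transformers. It assumes the weights are published under the repository id `yamatazen/Irida-SCE-9B` (as suggested by the image URL above) and that the tokenizer carries the standard Gemma-2 chat template; adjust the id and prompt to your setup.

```python
# Minimal sketch: load the merged model and run one chat turn.
# Assumes the repo id is yamatazen/Irida-SCE-9B (hypothetical here; taken from the card's image URL).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yamatazen/Irida-SCE-9B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

# Gemma-2 instruct models expect the chat template stored in the tokenizer.
messages = [{"role": "user", "content": "Briefly explain what a model merge is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```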