---
base_model:
- OpenPipe/mistral-ft-optimized-1227
- DiscoResearch/DiscoLM_German_7b_v1
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the SLERP merge method, with [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227) as the base model.

### Models Merged

The following models were included in the merge:
* [DiscoResearch/DiscoLM_German_7b_v1](https://huggingface.co/DiscoResearch/DiscoLM_German_7b_v1)
* [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227)

### Configuration

The following YAML configuration was used to produce this model; the settings are taken from [oshizo/japanese-e5-mistral-7b_slerp](https://huggingface.co/oshizo/japanese-e5-mistral-7b_slerp):

```yaml
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1227
        layer_range: [0, 32]
      - model: DiscoResearch/DiscoLM_German_7b_v1
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1227
parameters:
  t:
    - value: [0.5, 0.9]
dtype: bfloat16
```

The interpolation factor `t` controls how far each merged tensor moves from the base model (`t = 0`) toward the other model (`t = 1`). A list value defines a gradient across layers, so here `t` ramps from 0.5 in the earliest layers to 0.9 in the deepest, weighting DiscoLM_German more heavily as depth increases.
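For reference, SLERP (spherical linear interpolation) blends each pair of corresponding weight tensors along the great-circle arc between them rather than along a straight line, which preserves the scale and direction of the weights better than plain averaging. Treating the base tensor and the other tensor as flattened vectors \\(w_0\\) and \\(w_1\\) separated by angle \\(\theta\\), the merged tensor at interpolation factor \\(t\\) is:

$$
\mathrm{slerp}(w_0, w_1; t) = \frac{\sin\big((1-t)\,\theta\big)}{\sin\theta}\,w_0 + \frac{\sin(t\,\theta)}{\sin\theta}\,w_1
$$

When the two vectors are nearly colinear, implementations typically fall back to plain linear interpolation to avoid dividing by a vanishing \\(\sin\theta\\).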
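To reproduce the merge, mergekit can be driven from Python as well as from its `mergekit-yaml` command line. The sketch below follows the usage pattern from the mergekit README and assumes the configuration above has been saved as `merge_config.yaml`; the output path is a placeholder:

```python
# Minimal reproduction sketch following the mergekit README; assumes
# `pip install mergekit` and that the YAML above is saved as merge_config.yaml.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("merge_config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./merged-model",  # placeholder output directory
    options=MergeOptions(
        cuda=False,             # set True to run the merge on GPU
        copy_tokenizer=True,    # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```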
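The result is a standard Mistral-architecture causal language model, so it loads with the usual `transformers` APIs. A minimal usage sketch (the model id below is a placeholder for wherever this merge is hosted):

```python
# Minimal usage sketch with standard transformers APIs; the model id
# is a placeholder, not the actual repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/this-merge"  # placeholder: local path or Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

inputs = tokenizer(
    "Was ist die Hauptstadt von Deutschland?", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```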