---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# nemo-12b-rp-merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the della_linear merge method, with /www/mistralai/Mistral-Nemo-Base-2407 as the base model.

### Models Merged

The following models were included in the merge:
* /www/nemo-12b-rp/checkpoint-154
* /www/mistralai/Mistral-Nemo-Instruct-2407

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /www/nemo-12b-rp/checkpoint-154
    parameters:
      weight: 0.3
      density: 0.5
  - model: /www/mistralai/Mistral-Nemo-Instruct-2407
    parameters:
      weight: 0.7
      density: 0.8
merge_method: della_linear
base_model: /www/mistralai/Mistral-Nemo-Base-2407
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
```
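To reproduce the merge, this configuration can be saved as `config.yaml` and passed to mergekit's command-line entry point, e.g. `mergekit-yaml config.yaml ./nemo-12b-rp-merge`. In `della_linear`, `weight` scales each model's contribution, `density` is roughly the fraction of each model's delta parameters retained after pruning, `epsilon` bounds how far the per-parameter drop probability may deviate from `1 - density`, and `lambda` rescales the merged deltas.

## Usage

A minimal sketch for loading and sampling from the merged model with transformers. The local path `./nemo-12b-rp-merge` is a placeholder for wherever the merge output was written; adjust it to your environment.

```python
# Minimal usage sketch: load the merged model and generate a short completion.
# "./nemo-12b-rp-merge" is a placeholder path for the merge output directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./nemo-12b-rp-merge"  # hypothetical output directory
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was written in
    device_map="auto",
)

# Plain-text prompt; the merged model's preferred chat template (if any) is
# not specified here, so no instruction formatting is assumed.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```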