---
base_model:
- schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
- Blackroot/Mirai-3.0-70B
- Sao10K/L3.3-70B-Euryale-v2.3
- flammenai/Llama3.1-Flammades-70B
- TsinghuaC3I/Llama-3-70B-UltraMedical
library_name: transformers
tags:
- mergekit
- merge
---
# Flamlama_della_70B

This model produces unique prose, but is virtually unusable without babysitting each generation. It was created explicitly to add flavor and ability to a base model.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the della merge method, with schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP as the base model.

### Models Merged

The following models were included in the merge:
* Blackroot/Mirai-3.0-70B
* flammenai/Llama3.1-Flammades-70B
* TsinghuaC3I/Llama-3-70B-UltraMedical
* Sao10K/L3.3-70B-Euryale-v2.3

### Configuration

The following YAML configuration was used to produce this model:

```yaml
name: Flam_della_70B
models:
  - model: schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
  - model: Blackroot/Mirai-3.0-70B
    parameters:
      density: 0.8
      weight: 0.8
  - model: Sao10K/L3.3-70B-Euryale-v2.3
    parameters:
      density: 0.8
      weight: 0.8
  - model: flammenai/Llama3.1-Flammades-70B
    parameters:
      density: 0.8
      weight: 0.8
  - model: TsinghuaC3I/Llama-3-70B-UltraMedical
    parameters:
      density: 0.8
      weight: 0.8
merge_method: della
base_model: schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
tokenizer_source: schonsense/Llama-3.3-70B-Inst-Ablit-Flammades-SLERP
parameters:
  normalize: true
  int8_mask: true
  lambda: 1.0
  epsilon: 0.1
dtype: float16
```
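For reference, a configuration like this one is applied with the mergekit CLI. The following is a sketch, not the exact command used here: it assumes mergekit is installed (`pip install mergekit`) and that the YAML has been saved as `flam_della.yaml` (an illustrative filename). Note that merging five 70B models requires several hundred GB of disk space and considerable memory.

```shell
# Sketch of a mergekit invocation (assumed setup, not the author's exact command).
# flam_della.yaml is a hypothetical filename for the config above;
# --lazy-unpickle reduces peak memory while loading source checkpoints.
mergekit-yaml flam_della.yaml ./Flamlama_della_70B --lazy-unpickle
```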