---
base_model:
- ContactDoctor/Bio-Medical-Llama-3-8B
- lightblue/suzume-llama-3-8B-japanese
library_name: transformers
tags:
- mergekit
- merge
---
# final_model

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [lightblue/suzume-llama-3-8B-japanese](https://huggingface.co/lightblue/suzume-llama-3-8B-japanese) as the base model.

### Models Merged

The following models were included in the merge:

* [ContactDoctor/Bio-Medical-Llama-3-8B](https://huggingface.co/ContactDoctor/Bio-Medical-Llama-3-8B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: lightblue/suzume-llama-3-8B-japanese
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 8]
    model: ContactDoctor/Bio-Medical-Llama-3-8B
    parameters:
      density: 1.0
      weight: 0.6133219045070127
  - layer_range: [0, 8]
    model: lightblue/suzume-llama-3-8B-japanese
    parameters:
      density: 0.685860266951033
      weight: 0.5895381594604311
- sources:
  - layer_range: [8, 16]
    model: ContactDoctor/Bio-Medical-Llama-3-8B
    parameters:
      density: 0.7392837955301343
      weight: 0.3228829047267915
  - layer_range: [8, 16]
    model: lightblue/suzume-llama-3-8B-japanese
    parameters:
      density: 1.0
      weight: 0.6225596018347737
- sources:
  - layer_range: [16, 24]
    model: ContactDoctor/Bio-Medical-Llama-3-8B
    parameters:
      density: 1.0
      weight: 0.6675711396324198
  - layer_range: [16, 24]
    model: lightblue/suzume-llama-3-8B-japanese
    parameters:
      density: 1.0
      weight: 0.507981935427293
- sources:
  - layer_range: [24, 32]
    model: ContactDoctor/Bio-Medical-Llama-3-8B
    parameters:
      density: 0.7479105312794881
      weight: 0.6307368863287528
  - layer_range: [24, 32]
    model: lightblue/suzume-llama-3-8B-japanese
    parameters:
      density: 0.7322891014425874
      weight: 0.633799814811044
```

The merge can be reproduced by saving this configuration as `config.yaml` and running `mergekit-yaml config.yaml ./output`.
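## Usage

A minimal usage sketch with 🤗 transformers. The repo id below is a placeholder (substitute the actual repo id or a local path to the merge output), and it assumes the merged model inherits the Llama-3 instruct chat template from its parent models:

```python
# Minimal sketch: loading the merged checkpoint with transformers.
# "your-username/final_model" is a hypothetical repo id -- replace it with
# the actual repo id or a local path to the merge output directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/final_model"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

# Both parents are Llama-3 instruct fine-tunes, so the Llama-3 chat
# template is assumed to apply to the merge as well.
messages = [{"role": "user", "content": "糖尿病の初期症状について教えてください。"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```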