---
base_model:
- sethuiyer/Medichat-Llama3-8B
- mlabonne/ChimeraLlama-3-8B-v3
- johnsnowlabs/JSL-MedLlama-3-8B-v2.0
library_name: transformers
tags:
- mergekit
- merge
---
# medLlama-3-8B_DARE
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
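Once the merged weights are published on the Hugging Face Hub, the model loads like any other Llama 3 checkpoint. Below is a minimal sketch using `transformers`; the repo id is a placeholder for wherever this merge is hosted.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: substitute the actual Hub path of this merge.
model_id = "your-username/medLlama-3-8B_DARE"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "What are common symptoms of iron-deficiency anemia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```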
## Merge Details

### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [mlabonne/ChimeraLlama-3-8B-v3](https://huggingface.co/mlabonne/ChimeraLlama-3-8B-v3) as the base.
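In brief, DARE randomly drops a fraction of each fine-tuned model's delta parameters (here `density: 0.53` keeps 53%) and rescales the survivors, and TIES then resolves sign conflicts among the remaining deltas before summing them onto the base. The following is a minimal, single-tensor sketch of that idea, not mergekit's actual implementation (which also handles full state dicts, normalization, and int8 masking):

```python
import torch

def dare_ties_merge(base, deltas, weights, density=0.53, seed=0):
    """Sketch of DARE drop/rescale plus TIES sign election for one tensor.

    base    -- a parameter tensor from the base model
    deltas  -- list of (fine-tuned - base) task vectors, one per donor model
    weights -- per-model scalar weights (the `weight` values in the config)
    density -- fraction of delta entries kept (the `density` value in the config)
    """
    g = torch.Generator().manual_seed(seed)
    sparsified = []
    for d in deltas:
        # DARE: keep each delta entry with probability `density`...
        mask = (torch.rand(d.shape, generator=g) < density).to(d.dtype)
        # ...and rescale survivors by 1/density so the expected delta is unchanged.
        sparsified.append(d * mask / density)

    stacked = torch.stack([w * d for w, d in zip(weights, sparsified)])
    # TIES: elect a per-entry sign from the summed deltas, then discard
    # entries whose sign disagrees with the elected one before summing.
    elected = torch.sign(stacked.sum(dim=0))
    agree = (torch.sign(stacked) == elected).to(stacked.dtype)
    return base + (stacked * agree).sum(dim=0)

# Toy demo on random tensors standing in for a single weight matrix.
base = torch.zeros(4, 4)
deltas = [torch.randn(4, 4), torch.randn(4, 4)]
merged = dare_ties_merge(base, deltas, weights=[0.5, 0.5])
```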
### Models Merged

The following models were included in the merge:
* [sethuiyer/Medichat-Llama3-8B](https://huggingface.co/sethuiyer/Medichat-Llama3-8B)
* [johnsnowlabs/JSL-MedLlama-3-8B-v2.0](https://huggingface.co/johnsnowlabs/JSL-MedLlama-3-8B-v2.0)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: mlabonne/ChimeraLlama-3-8B-v3
    # No parameters necessary for base model
  - model: sethuiyer/Medichat-Llama3-8B
    parameters:
      density: 0.53
      weight: 0.5
  - model: johnsnowlabs/JSL-MedLlama-3-8B-v2.0
    parameters:
      density: 0.53
      weight: 0.5
merge_method: dare_ties
base_model: mlabonne/ChimeraLlama-3-8B-v3
parameters:
  int8_mask: true
dtype: float16
```
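To reproduce the merge, save the configuration above to a file and run it through mergekit, e.g. `mergekit-yaml medllama_dare.yaml ./medLlama-3-8B_DARE`. The sketch below does the same through mergekit's Python API as documented in its README; the config path and output directory are placeholders:

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above (file name is a placeholder).
with open("medllama_dare.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the DARE-TIES merge and write the result to disk.
run_merge(
    merge_config,
    out_path="./medLlama-3-8B_DARE",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
    ),
)
```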