llama3-5.4b-instruct-unhealed / mergekit_config.yml
# Passthrough merge: stack the layer slices below, in order, into a single new model.
# Keeps 20 of Meta-Llama-3-8B-Instruct's 32 transformer layers (0-15, 20, 29-31;
# layer_range upper bounds are exclusive), yielding roughly 5.4B parameters.
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 16]   # layers 0-15
    model: meta-llama/Meta-Llama-3-8B-Instruct
- sources:
  - layer_range: [20, 21]  # layer 20 only
    model: meta-llama/Meta-Llama-3-8B-Instruct
- sources:
  - layer_range: [29, 32]  # layers 29-31
    model: meta-llama/Meta-Llama-3-8B-Instruct
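
A minimal sketch of applying this config with mergekit's Python entry points (https://github.com/arcee-ai/mergekit), assuming the MergeConfiguration / run_merge API and the placeholder paths shown here; the same merge can also be run from the mergekit-yaml command line.

# sketch.py -- reproduce the merge described by mergekit_config.yml (paths are placeholders)
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load and validate the YAML config above.
with open("mergekit_config.yml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Write the sliced model to a local directory (name is a placeholder).
run_merge(
    merge_config,
    out_path="./llama3-5.4b-instruct-unhealed",
    options=MergeOptions(
        copy_tokenizer=True,  # carry the tokenizer over from the source model
        lazy_unpickle=True,   # stream shards to keep peak RAM down
    ),
)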