Llama-3-Steerpike-v1-OAS-8B / mergekit_config.yml
base_model: mlabonne/NeuralDaredevil-8B-abliterated
dtype: bfloat16
merge_method: task_arithmetic
slices:
- sources:
  - layer_range: [0, 32]
    model: mlabonne/NeuralDaredevil-8B-abliterated
  - layer_range: [0, 32]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      weight: 0.5
  - layer_range: [0, 32]
    model: Hastagaras/Halu-OAS-8B-Llama3
    parameters:
      weight: 0.2
  - layer_range: [0, 32]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      weight: 0.03
  - layer_range: [0, 32]
    model: grimjim/llama-3-aaditya-OpenBioLLM-8B
    parameters:
      weight: 0.1
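
With merge_method: task_arithmetic, mergekit forms each contributing model's task vector (its weights minus the base model's) and adds the weighted sum back onto the base, roughly base + 0.5*(Lumimaid - base) + 0.2*(Halu - base) + 0.03*(Soliloquy - base) + 0.1*(OpenBioLLM - base); the base model's own entry contributes a zero task vector, so it needs no weight.

Below is a minimal sketch of executing this config programmatically, following the Python API shown in mergekit's README (https://github.com/arcee-ai/mergekit); the output directory name is a hypothetical choice, and the CLI equivalent would be mergekit-yaml mergekit_config.yml ./Llama-3-Steerpike-v1-OAS-8B.

import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse this file into a validated merge configuration.
with open("mergekit_config.yml", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the task-arithmetic merge and write the merged model to disk.
run_merge(
    merge_config,
    "./Llama-3-Steerpike-v1-OAS-8B",  # hypothetical output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when one is available
        copy_tokenizer=True,             # carry the base model's tokenizer over
    ),
)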