Zebrafish-linear-7B / mergekit_config.yml
models:
  - layer_range: [0, 40]
    model: liminerity/M7-7b
    parameters:
      weight: 0.5
  - layer_range: [0, 40]
    model: rwitz/experiment26-truthy-iter-0
    parameters:
      weight: 0.5
merge_method: task_arithmetic
base_model: liminerity/M7-7b
dtype: bfloat16
random_seed: 0
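
For context, a minimal sketch (not part of the original file) of what the task_arithmetic merge method declared above computes: each merge input contributes a weighted "task vector" (its difference from base_model), and those weighted task vectors are added back onto the base weights. The tensors below are toy stand-ins for a single weight matrix, not real model weights, and the helper name is purely illustrative.

import torch

def task_arithmetic(base, finetuned, weights):
    # merged = base + sum_i weight_i * (finetuned_i - base)
    merged = base.clone()
    for weight, model in zip(weights, finetuned):
        merged += weight * (model - base)
    return merged

# Toy stand-ins for one weight matrix from each model named in the config.
base = torch.randn(4, 4, dtype=torch.bfloat16)      # base_model: liminerity/M7-7b
model_a = base.clone()                              # first input is the base itself, so its task vector is zero
model_b = base + 0.1 * torch.randn(4, 4, dtype=torch.bfloat16)  # rwitz/experiment26-truthy-iter-0
merged = task_arithmetic(base, [model_a, model_b], weights=[0.5, 0.5])
print(merged.dtype, merged.shape)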