Proto-Harpy-Avian-v0.1-7B / mergekit_config.yml
models:
  - model: C:/Users/Jacoby/Downloads/text-generation-webui-main/models/BAAI_Infinity-Instruct-7M-Gen-mistral-7B
  - model: C:/Users/Jacoby/Downloads/text-generation-webui-main/models/SanjiWatsuki_Kunoichi-7B
merge_method: slerp
base_model: C:/Users/Jacoby/Downloads/text-generation-webui-main/models/SanjiWatsuki_Kunoichi-7B
parameters:
  t:
    - value: [0, 0, 0.2, 0.3, 0.4, 0.5, 0.4, 0.3, 0.2, 0, 0] # t=0 at the ends keeps the first and last layers identical to the base model (Kunoichi), which is key for good results
  embed_slerp: true # Important: the merge will fail without this
dtype: float16
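For reference, a minimal sketch of the spherical linear interpolation (SLERP) that the `slerp` merge method applies per weight tensor, where `t` blends from the base model (`t=0`) toward the other model (`t=1`). This is an illustrative NumPy implementation, not mergekit's actual code; the function name and the lerp fallback for near-parallel tensors are assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0 (the base model's tensor), t=1 returns v1.
    Illustrative sketch only -- mergekit's real implementation differs in detail.
    """
    # Measure the angle between the two tensors via their flattened unit vectors
    u0 = v0.ravel() / (np.linalg.norm(v0) + eps)
    u1 = v1.ravel() / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    theta = np.arccos(dot)

    # Nearly parallel tensors: fall back to plain linear interpolation
    if theta < eps:
        return (1 - t) * v0 + t * v1

    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

The eleven entries of the `t` list above assign one interpolation weight per group of transformer layers, so the middle layers mix the two models most strongly (`t=0.5`) while the outermost layer groups stay pure base model.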