---
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
base_model:
  - jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0
  - jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.8
  - jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9
  - jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1
  - fblgit/cybertron-v4-qw7B-UNAMGS
  - sethuiyer/Qwen2.5-7B-Anvita
  - Qwen/Qwen2.5-7B-Instruct
language:
  - en
---

![Kythera](Kythera.webp)

KytheraMix-7B is crafted using semi-automated merges driven by YAML templates. As with AgoraMix, two DELLA merge trees converge: one for instruction following, and one for reasoning. A SLERP merge blends the two with a layer-wise gradient, and a TIES merge normalizes the weights.
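For illustration, the SLERP stage described above might look roughly like the following mergekit configuration. This is a hedged sketch, not the actual recipe: the branch model names (`instruct-branch`, `reasoning-branch`) stand in for the two DELLA merge trees, and the gradient values for `t` are placeholders.

```yaml
# Hypothetical sketch of the SLERP blending stage; model names and
# gradient values are illustrative, not the recipe used for this model.
slices:
  - sources:
      - model: instruct-branch   # DELLA tree for instruction following (placeholder name)
        layer_range: [0, 28]
      - model: reasoning-branch  # DELLA tree for reasoning (placeholder name)
        layer_range: [0, 28]
merge_method: slerp
base_model: instruct-branch
parameters:
  t:
    # A per-layer gradient: early layers lean toward the instruction
    # branch, later layers toward the reasoning branch.
    - value: [0.0, 0.25, 0.5, 0.75, 1.0]
dtype: bfloat16
```

With a list of values for `t`, mergekit interpolates the blend factor across the layer range, which is what gives the merge its gradient character rather than a uniform 50/50 mix.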

## Ancestor Models