---
base_model:
- smelborp/MixtralOrochi8x7B
library_name: transformers
tags:
- mergekit
- merge
---

# maid-yuzu-v5-mix
This is a merge of pre-trained language models created using mergekit.
This model was created because I was curious whether an 8x7B model that a user had assembled at random could be merged successfully with other existing 8x7B models.
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
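For intuition, SLERP interpolates between two models' weight tensors along the arc joining them rather than along a straight line, which tends to preserve each tensor's geometry better than plain averaging; with `t: 0.5`, as in the configuration below, the result sits halfway between the two parents. Below is a minimal sketch of the idea in PyTorch; it is an illustration under simplifying assumptions, not mergekit's actual implementation.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (sketch)."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    # Angle between the two flattened weight vectors.
    cos_omega = torch.clamp(
        (a_flat @ b_flat) / (a_flat.norm() * b_flat.norm() + eps), -1.0, 1.0
    )
    omega = torch.arccos(cos_omega)
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        mixed = (1 - t) * a_flat + t * b_flat
    else:
        # Interpolate along the arc between the two vectors.
        mixed = (
            torch.sin((1 - t) * omega) * a_flat + torch.sin(t * omega) * b_flat
        ) / sin_omega
    return mixed.reshape(a.shape).to(a.dtype)
```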
### Models Merged
The following models were included in the merge:
- ../maid-yuzu-v5
- [smelborp/MixtralOrochi8x7B](https://huggingface.co/smelborp/MixtralOrochi8x7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model:
  model:
    path: ../maid-yuzu-v5
dtype: bfloat16
merge_method: slerp
parameters:
  t:
  - value: 0.5
slices:
- sources:
  - layer_range: [0, 32]
    model:
      model:
        path: smelborp/MixtralOrochi8x7B
  - layer_range: [0, 32]
    model:
      model:
        path: ../maid-yuzu-v5
```
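Assuming mergekit is installed (it is published on PyPI as `mergekit`), a configuration like this is typically applied with the `mergekit-yaml` command, e.g. `mergekit-yaml config.yaml ./output-model` (the file and output names here are illustrative). Note that `../maid-yuzu-v5` is a relative local path, so that merge input must exist on disk for the configuration to be reproducible.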