---
base_model:
- Nisk36/finetuned-lmsys_vicuna-7b-v1.5
- Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
library_name: transformers
tags:
- mergekit
- merge
---

# final_model
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
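Under the linear method, every parameter tensor of the merged model is a weighted average of the corresponding tensors from the source models; the `normalize: 1.0` setting in the configuration below rescales each slice's weights to sum to 1 before averaging, and a negative weight subtracts that model's contribution. A minimal sketch of the operation (illustrative only, not mergekit's actual implementation):

```python
# Illustrative sketch of a linear (weighted-average) merge; not mergekit's actual code.
import torch

def linear_merge(tensors, weights, normalize=True):
    """Return the weighted average of corresponding parameter tensors."""
    if normalize:
        total = sum(weights)  # with normalize: 1.0, weights are rescaled to sum to 1
        weights = [w / total for w in weights]
    merged = torch.zeros_like(tensors[0])
    for t, w in zip(tensors, weights):
        merged = merged + w * t
    return merged

# Example: blend one tensor from each source model with the first slice's weights below.
a = torch.randn(4, 4)  # stand-in for a vicuna-7b parameter tensor
b = torch.randn(4, 4)  # stand-in for the ELYZA parameter tensor
merged = linear_merge([a, b], [0.6235769265047518, 0.7274442555681364])
```

Because the configuration assigns different weights to each 4-layer slice, each block of transformer layers blends the two source models in a different ratio.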
### Models Merged

The following models were included in the merge:

* Nisk36/finetuned-lmsys_vicuna-7b-v1.5
* Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
### Configuration

The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: linear
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 4]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.6235769265047518
  - layer_range: [0, 4]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.7274442555681364
- sources:
  - layer_range: [4, 8]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.5271398694239577
  - layer_range: [4, 8]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.3489250438855029
- sources:
  - layer_range: [8, 12]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.15496421762028023
  - layer_range: [8, 12]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.541330668871115
- sources:
  - layer_range: [12, 16]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.5267269624685371
  - layer_range: [12, 16]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.8265113027826562
- sources:
  - layer_range: [16, 20]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.6599861585345389
  - layer_range: [16, 20]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: -0.249060520039947
- sources:
  - layer_range: [20, 24]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.7761318532349375
  - layer_range: [20, 24]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.7040995904551324
- sources:
  - layer_range: [24, 28]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: 0.40152017541360374
  - layer_range: [24, 28]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.767141768059921
- sources:
  - layer_range: [28, 32]
    model: Nisk36/finetuned-lmsys_vicuna-7b-v1.5
    parameters:
      weight: -0.004536646708608122
  - layer_range: [28, 32]
    model: Nisk36/FT_elyza_ELYZA-japanese-Llama-2-7b-instruct
    parameters:
      weight: 0.8295357241419378
```
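To reproduce the merge, save the configuration above to a file (e.g. `config.yaml`, a name used here only for illustration) and run mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml config.yaml ./final_model`. The output directory is an ordinary transformers checkpoint; a minimal loading sketch (the local path below is an assumption):

```python
# Minimal usage sketch; "./final_model" is assumed to be the merge output directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./final_model"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.bfloat16)

# Both source models are Llama-2-7b derivatives, one tuned for Japanese,
# so a Japanese prompt makes a reasonable smoke test.
inputs = tokenizer("こんにちは。自己紹介をしてください。", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```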