Saiga merges
This is a merge of pre-trained language models created using mergekit.
This model was merged using the Model Breadcrumbs with TIES merge method, with IlyaGusev/saiga_nemo_12b as the base.
The following models were included in the merge:

- Aleteian/Saiga-Unleashed
- Aleteian/NeverendingStory
- LatitudeGames/Wayfarer-12B
- nbeerbower/Lyra-Gutenberg-mistral-nemo-12B
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Aleteian/Saiga-Unleashed
    parameters:
      density: 0.95
      weight: 0.4
      gamma: 0.01
  - model: Aleteian/NeverendingStory
    parameters:
      density: 0.95
      weight: 0.4
      gamma: 0.01
  - model: LatitudeGames/Wayfarer-12B
    parameters:
      density: 0.9
      weight: 0.1
      gamma: 0.01
  - model: nbeerbower/Lyra-Gutenberg-mistral-nemo-12B
    parameters:
      density: 0.9
      weight: 0.1
      gamma: 0.01
merge_method: breadcrumbs_ties
base_model: IlyaGusev/saiga_nemo_12b
dtype: bfloat16
tokenizer_source: "union"
chat_template: "auto"
```
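To reproduce the merge, the configuration above can be saved as `config.yaml` and passed to mergekit's `mergekit-yaml config.yaml ./output-dir` command. Because the config sets `tokenizer_source: "union"` and `chat_template: "auto"`, the merged checkpoint carries its own tokenizer and chat template, so it can be used like any other Mistral-Nemo-based causal LM. Below is a minimal inference sketch with 🤗 Transformers; the repo id `your-namespace/saiga-merge` is a placeholder for wherever the merge output is published, not an actual repository.

```python
# Minimal inference sketch for the merged model.
# "your-namespace/saiga-merge" is a placeholder id, not a real repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/saiga-merge"  # placeholder for the published merge

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the merge config
    device_map="auto",
)

# `chat_template: "auto"` means the tokenizer ships a chat template,
# so apply_chat_template can build the prompt directly.
messages = [{"role": "user", "content": "Tell me a short story about a wanderer."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```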