# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the DARE TIES merge method, with mistralai/Mistral-7B-Instruct-v0.2 as the base model.
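DARE TIES randomly prunes each fine-tuned model's weight deltas, rescales the surviving entries, and resolves sign conflicts between models by majority vote before adding the merged delta back to the base. For intuition, here is a toy sketch of that idea for a single tensor; it is an illustration only, not mergekit's implementation, and the function name and signature are made up:

```python
import torch

def dare_ties_merge(base, deltas, weights, density=0.53):
    """Illustrative DARE-TIES merge of a single weight tensor.

    base    -- the base model's tensor
    deltas  -- list of (fine-tuned - base) tensors, one per model
    weights -- per-model mixing weights (0.4 / 0.3 / 0.3 below)
    """
    pruned = []
    for delta in deltas:
        # DARE: randomly drop (1 - density) of each delta's entries and
        # rescale the survivors so the expected delta is unchanged.
        keep = torch.rand_like(delta) < density
        pruned.append(delta * keep / density)

    # TIES: elect a per-parameter sign from the weighted sum of deltas,
    # then keep only the entries whose sign agrees with the election.
    stacked = torch.stack([w * d for w, d in zip(weights, pruned)])
    elected_sign = torch.sign(stacked.sum(dim=0))
    merged_delta = (stacked * (torch.sign(stacked) == elected_sign)).sum(dim=0)

    return base + merged_delta
```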
### Models Merged
The following models were included in the merge:
- mlabonne/NeuralBeagle14-7B
- Nexusflow/Starling-LM-7B-beta
- CorticalStack/pastiche-crown-clown-7b-dare-dpo
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: mistralai/Mistral-7B-Instruct-v0.2
dtype: bfloat16
merge_method: dare_ties
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
  - model: Nexusflow/Starling-LM-7B-beta
    parameters:
      density: '0.53'
      weight: '0.4'
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: '0.53'
      weight: '0.3'
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    parameters:
      density: '0.53'
      weight: '0.3'
parameters:
  int8_mask: true
```
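To reproduce the merge, the YAML above can be saved (e.g. as `config.yaml`) and passed to mergekit's `mergekit-yaml config.yaml ./output-dir` entry point. A minimal sketch of loading the finished merge with transformers follows, assuming the standard Mistral-Instruct chat template is inherited from the base model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jambroz/FNCARL-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Build a prompt with the chat template and generate a short reply.
messages = [{"role": "user", "content": "Hello! What can you do?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```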