# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with IntervitensInc/Mistral-Nemo-Base-2407-chatml as the base model.
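
DARE TIES combines two ideas: DARE randomly drops a fraction of each model's delta (its difference from the base) and rescales the surviving entries, while TIES resolves sign conflicts between models before the deltas are summed back onto the base. The sketch below only illustrates that idea on plain tensors; it is not mergekit's implementation, and the function name and simplified sign-election scheme are assumptions.

```python
# Illustrative sketch of the DARE-TIES idea on plain tensors (not mergekit's code).
import torch

def dare_ties_merge(base, finetuned, densities, weights, seed=0):
    """Merge fine-tuned tensors onto a base tensor.

    base:       base-model parameter tensor
    finetuned:  list of fine-tuned tensors with the same shape as base
    densities:  per-model fraction of delta entries to keep (DARE)
    weights:    per-model merge weights
    """
    torch.manual_seed(seed)
    pruned = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                           # task vector
        mask = torch.rand_like(delta) < density     # DARE: random drop
        delta = delta * mask / density              # rescale survivors
        pruned.append(weight * delta)

    stacked = torch.stack(pruned)
    # TIES: elect a sign per parameter and keep only agreeing deltas
    elected_sign = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == elected_sign
    kept = stacked * agree
    counts = agree.sum(dim=0).clamp(min=1)
    return base + kept.sum(dim=0) / counts

# toy usage with two "fine-tuned" vectors
base = torch.zeros(4)
ft_a = torch.tensor([0.5, -0.2, 0.1, 0.0])
ft_b = torch.tensor([0.3, 0.4, -0.1, 0.2])
print(dare_ties_merge(base, [ft_a, ft_b], densities=[0.9, 0.6], weights=[1.0, 0.8]))
```

In the YAML configuration below, each model's `density` and `weight` parameters play these roles.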

### Models Merged

The following models were included in the merge:

* Delta-Vector/Francois-PE-V2-Huali-12B
* DoppelReflEx/MN-12B-Mimicore-GreenSnake

### Configuration

The following YAML configuration was used to produce this model:


```yaml
models:
  - model: Delta-Vector/Francois-PE-V2-Huali-12B
    parameters:
      density: 0.9
      weight: 1
  - model: DoppelReflEx/MN-12B-Mimicore-GreenSnake
    parameters:
      density: 0.6
      weight: 0.8
merge_method: dare_ties
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
tokenizer_source: base
parameters:
  rescale: true
dtype: bfloat16
```
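
For reference, a minimal sketch of loading the resulting checkpoint with transformers. The repository id is the one this card belongs to (DoppelReflEx/Lilithcore-v2-test); the chat-template usage assumes the ChatML template from the base tokenizer is present.

```python
# Minimal sketch: load the merged checkpoint with transformers.
# Assumes the repo id DoppelReflEx/Lilithcore-v2-test and a GPU are available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "DoppelReflEx/Lilithcore-v2-test"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

# The tokenizer comes from the ChatML base, so its chat template can be applied directly.
messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```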