# mistral-7b-merged-passthrough

mistral-7b-merged-passthrough is a [mergekit](https://github.com/arcee-ai/mergekit) passthrough merge of the following models:

* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)

## 🧩 Configuration

```yaml
slices:
  - sources:
    - model: OpenPipe/mistral-ft-optimized-1218
      layer_range: [0, 32]
  - sources:
    - model: mlabonne/NeuralHermes-2.5-Mistral-7B
      layer_range: [24, 32]
merge_method: passthrough
dtype: bfloat16
```
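The passthrough method concatenates layer slices verbatim, with no parameter averaging: the result stacks all 32 layers of OpenPipe/mistral-ft-optimized-1218 followed by layers 24–31 of mlabonne/NeuralHermes-2.5-Mistral-7B, for 40 transformer layers in total. The 8 duplicated layers are why the merge comes to 8.99B parameters rather than the usual ~7.2B of a Mistral-7B model.

Below is a sketch of reproducing the merge with mergekit's Python API, assuming the configuration above is saved as `config.yaml`; the module paths and option names follow recent mergekit releases and may differ in older versions (the `mergekit-yaml` CLI is the equivalent one-liner).

```python
# Hedged sketch: reproduce the passthrough merge via mergekit's Python API.
# Assumes `pip install mergekit` and the YAML above saved as config.yaml.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./mistral-7b-merged-passthrough",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # copy the base model's tokenizer
    ),
)
```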
**Model size:** 8.99B params (Safetensors) · **Tensor type:** BF16
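
## 💻 Usage

A minimal inference sketch with 🤗 Transformers; the prompt and generation settings are illustrative, and `device_map="auto"` assumes `accelerate` is installed:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mychen76/mistral-7b-merged-passthrough"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

prompt = "What does a passthrough layer merge do?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```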