FMixIA-7B-TIES-1

A merge of eren23/OGNO-7b-dpo-truthful and Kquant03/NeuralTrix-7B-dpo-laser, built with the TIES method (TIES-Merging: Trim, Elect Sign & Merge) using mergekit.

Model Details

Model size: 7.24B parameters
Tensor type: FP16 (safetensors)

Configuration

models:
  - model: eren23/OGNO-7b-dpo-truthful
  - model: Kquant03/NeuralTrix-7B-dpo-laser
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: eren23/OGNO-7b-dpo-truthful
parameters:
  normalize: true
dtype: float16
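
Assuming mergekit is installed (pip install mergekit), saving the configuration above as config.yml (an illustrative filename) should let the merge be reproduced with mergekit's mergekit-yaml entry point:

mergekit-yaml config.yml ./FMixIA-7B-TIES-1 --cuda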

Usage

This model can be used with the standard transformers library:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load in float16 to match the merged checkpoint's dtype
model = AutoModelForCausalLM.from_pretrained(
    "Ro-xe/FMixIA-7B-TIES-1", torch_dtype=torch.float16
)
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/FMixIA-7B-TIES-1")
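
For a quick check, generation follows the standard transformers API; the prompt and decoding settings below are illustrative:

inputs = tokenizer("What does TIES merging do?", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))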