# FMixIA-7B-TIES-1
A merged model created with [mergekit](https://github.com/arcee-ai/mergekit) using the TIES merge method (TrIm, Elect Sign & Merge), which trims low-magnitude parameter deltas and resolves sign conflicts between models before merging.
## Model Details
- Base Model: eren23/OGNO-7b-dpo-truthful
- Merged With: Kquant03/NeuralTrix-7B-dpo-laser
- Merge Method: ties
## Configuration

The following mergekit YAML configuration was used to produce this model:
```yaml
models:
  - model: eren23/OGNO-7b-dpo-truthful
  - model: Kquant03/NeuralTrix-7B-dpo-laser
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: eren23/OGNO-7b-dpo-truthful
parameters:
  normalize: true
dtype: float16
```
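The merge can be reproduced by saving this configuration to a file and running it through mergekit. Below is a minimal sketch using mergekit's Python API; the file name `config.yml` and output directory `./merged` are placeholders, and the same merge can also be run from the command line with `mergekit-yaml config.yml ./merged`.

```python
# Minimal sketch: reproduce the merge with mergekit's Python API.
# Assumes `pip install mergekit`; config.yml holds the YAML shown above.
import torch
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration into a mergekit MergeConfiguration.
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the TIES merge and write the merged weights to ./merged.
run_merge(
    merge_config,
    out_path="./merged",                 # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # carry over the base tokenizer
    ),
)
```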
## Usage
This model can be used with the standard Hugging Face `transformers` library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# The merge was exported in float16 (see dtype above); pass
# torch_dtype=torch.float16 to from_pretrained to load at half precision.
model = AutoModelForCausalLM.from_pretrained("Ro-xe/FMixIA-7B-TIES-1")
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/FMixIA-7B-TIES-1")
```
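A quick generation check can then be run as follows (the prompt below is a placeholder):

```python
# Simple generation smoke test; the prompt is a placeholder.
inputs = tokenizer("What does TIES merging do?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```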