ZEUS 8B V30

This model is a merge of the pre-trained and fine-tuned LLMs listed in the configuration below, created using mergekit.

Merge Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: T145/KRONOS-8B-V1-P1
dtype: bfloat16
merge_method: dare_ties
name: ZEUS-8B-V30
parameters:
  int8_mask: 1.0
  normalize: 1.0
  random_seed: 145
slices:
- sources:
  - layer_range: [0, 32]
    model: unsloth/Llama-3.1-Storm-8B
    parameters:
      density: 0.94
      weight: 0.35
  - layer_range: [0, 32]
    model: arcee-ai/Llama-3.1-SuperNova-Lite
    parameters:
      density: 0.92
      weight: 0.26
  - layer_range: [0, 32]
    model: VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct
    parameters:
      density: 0.91
      weight: 0.2
  - layer_range: [0, 32]
    model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
    parameters:
      density: 0.93
      weight: 0.19
  - layer_range: [0, 32]
    model: T145/KRONOS-8B-V1-P1
tokenizer:
  source: union
  tokens:
    <|begin_of_text|>:
      force: true
      source: T145/KRONOS-8B-V1-P1
    <|eot_id|>:
      force: true
      source: T145/KRONOS-8B-V1-P1
```
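For intuition, here is a minimal sketch of what the `dare_ties` method does with the `density` and `weight` parameters above: each model's delta from the base is randomly dropped and rescaled (DARE), filtered by an element-wise sign consensus (TIES), and combined with the listed weights. This is not mergekit's actual implementation; only the densities, weights, and seed come from the config, and the tensors are random stand-ins.

```python
# Minimal sketch of a DARE-TIES merge step on a single tensor.
# NOT mergekit's implementation: densities, weights, and the seed mirror the
# YAML above; `base` and the per-model deltas are random stand-ins.
import torch

def dare_prune(delta: torch.Tensor, density: float, gen: torch.Generator) -> torch.Tensor:
    """Drop-And-REscale: keep each element with probability `density`,
    then rescale survivors by 1/density so the expected delta is unchanged."""
    mask = torch.rand(delta.shape, generator=gen) < density
    return delta * mask / density

gen = torch.Generator().manual_seed(145)          # random_seed from the config
base = torch.randn(4, 4)                          # stand-in for a base-model tensor
deltas = [torch.randn(4, 4) for _ in range(4)]    # stand-ins for fine-tune deltas
densities = [0.94, 0.92, 0.91, 0.93]
weights = [0.35, 0.26, 0.20, 0.19]

pruned = torch.stack(
    [w * dare_prune(d, p, gen) for d, p, w in zip(deltas, densities, weights)]
)

# TIES-style sign election: keep only contributions whose sign agrees with the
# element-wise majority, then sum (the config's `normalize: 1.0` would also
# rescale by the total weight of the agreeing models).
majority_sign = torch.sign(pruned.sum(dim=0))
agree = torch.sign(pruned) == majority_sign
merged = base + (pruned * agree).sum(dim=0)
print(merged.shape)  # torch.Size([4, 4])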
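```

To reproduce the merge, the YAML can be saved to a file and passed to mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./ZEUS-8B-V30`). The sketch below uses standard transformers APIs to load the published model and check that the two special tokens forced from T145/KRONOS-8B-V1-P1 resolve to single token IDs after the union tokenizer merge; only the model ID and token names come from this card.

```python
# Load the published merge and verify the forced special tokens.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "T145/ZEUS-8B-V30"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Both forced tokens should map to a single ID, not be split into pieces.
for token in ("<|begin_of_text|>", "<|eot_id|>"):
    token_id = tokenizer.convert_tokens_to_ids(token)
    print(f"{token} -> {token_id}")
```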

Open LLM Leaderboard Evaluation Results

Detailed and summarized results are available on the Open LLM Leaderboard.

| Metric              | Value (%) |
|---------------------|-----------|
| Average             | 28.86     |
| IFEval (0-shot)     | 74.36     |
| BBH (3-shot)        | 32.19     |
| MATH Lvl 5 (4-shot) | 14.43     |
| GPQA (0-shot)       | 9.40      |
| MuSR (0-shot)       | 10.07     |
| MMLU-PRO (5-shot)   | 32.71     |
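The numbers above come from the Open LLM Leaderboard. A sketch for producing comparable scores locally with lm-evaluation-harness follows; the `leaderboard_*` task names are assumptions that depend on the installed harness version, so results may not match the leaderboard's exact setup.

```python
# Hedged sketch: scoring the model locally with lm-evaluation-harness
# (pip install lm-eval). Task names are assumptions; check your installed
# version, as the leaderboard uses its own task variants and normalization.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=T145/ZEUS-8B-V30,dtype=bfloat16",
    tasks=[
        "leaderboard_ifeval",
        "leaderboard_bbh",
        "leaderboard_math_hard",
        "leaderboard_gpqa",
        "leaderboard_musr",
        "leaderboard_mmlu_pro",
    ],
    batch_size="auto",
)
for task, metrics in results["results"].items():
    print(task, metrics)
```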