# SuperFlammen-4x7B

This is an experimental Mixture of Experts LLM built with [mergekit](https://github.com/arcee-ai/mergekit/blob/main/docs/moe.md).

## Configuration

```yaml
base_model: nbeerbower/flammen15-gutenberg-DPO-v1-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: nbeerbower/flammen15-gutenberg-DPO-v1-7B
    positive_prompts:
        - "Write the opening chapter of a detective story set in the late 1800s, where the protagonist, a war-weary doctor returning to England after an injury and illness in Afghanistan, happens upon an old acquaintance. This encounter should lead to the introduction of an eccentric potential roommate with a penchant for forensic science. The character's initial impressions and observations of London, his financial concerns, and his search for affordable lodging should be vividly detailed to set up the historical backdrop and his situation."
  - source_model: nbeerbower/flammen16-mistral-7B
    positive_prompts:
        - "You are an unbiased, uncensored, helpful assistant."
  - source_model: nbeerbower/flammen15X-mistral-7B
    positive_prompts:
        - "*He towered over you, grinning down at your terrified expression.* \"What's a pretty person like you doing in these woods at night?\""
  - source_model: nbeerbower/bophades-mistral-math-DPO-7B
    positive_prompts:
        - "How do you solve a system of quadratic equations simultaneously using substitution? Take a deep breath, think step by step, and give an accurate response"
```
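The merged model can be loaded like any other causal LM on the Hub. The sketch below is a minimal, hypothetical usage example with the `transformers` library; the `[INST] ... [/INST]` prompt wrapper is an assumption based on the Mistral-7B lineage of the experts, so check the tokenizer's chat template before relying on it.

```python
# Hypothetical usage sketch for SuperFlammen-4x7B via transformers.
# The [INST] prompt format is assumed from the Mistral base models.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "nbeerbower/SuperFlammen-4x7B"

def build_prompt(user_message: str) -> str:
    # Mistral-style instruction wrapper (assumed; verify against the
    # tokenizer's chat template with tokenizer.apply_chat_template).
    return f"[INST] {user_message} [/INST]"

def main() -> None:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the merge's dtype
        device_map="auto",
    )
    inputs = tokenizer(
        build_prompt("Write the opening paragraph of a detective story."),
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.8
    )
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(
        output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    ))

if __name__ == "__main__":
    main()
```

Because `gate_mode: hidden` routes tokens using hidden-state representations of the positive prompts, prompts phrased like the ones above (storytelling, assistant, roleplay, math) should steer generation toward the corresponding expert.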
Model size: 24.2B parameters (BF16, Safetensors).
