GGUF quantizations: thanks to HumanBoiii


Mythorica is an RP model designed for vivid storytelling, engaging dialogue, and immersive world-building. Inspired by the fusion of fantasy and realism, it excels at crafting intricate narratives and breathing life into characters, making it a versatile choice for writers and roleplayers.
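
A minimal inference sketch (not part of the original card, provided as an assumption): it uses the Hugging Face transformers text-generation pipeline, which applies the Llama 3 chat template stored with the model; the system and user prompts are illustrative placeholders.

import torch
from transformers import pipeline

# Load the model with the standard text-generation pipeline.
# device_map="auto" assumes accelerate is installed.
generator = pipeline(
    "text-generation",
    model="Arkana08/Mythorica-L3-8B",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Chat-style messages; the pipeline formats them with the model's chat template.
messages = [
    {"role": "system", "content": "You are a vivid fantasy storyteller."},
    {"role": "user", "content": "Describe a harbor city at dusk in three sentences."},
]

output = generator(messages, max_new_tokens=256, do_sample=True, temperature=0.8)
# The last message in generated_text is the assistant's reply.
print(output[0]["generated_text"][-1]["content"])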


Merge Method

This is a merge of pre-trained language models created using mergekit.

This model was merged using the DARE TIES merge method, with ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9 as the base model.

Models Merged

The following models were included in the merge:

Arkana08/LexiMaid-L3-8B
Sao10K/L3-8B-Chara-v1-Alpha

Configuration

The following YAML configuration was used to produce Mythorica:

models:
  - model: ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9
    parameters:
      weight: 0.5
      density: 0.8
  - model: Arkana08/LexiMaid-L3-8B
    parameters:
      weight: 0.3
      density: 0.7
  - model: Sao10K/L3-8B-Chara-v1-Alpha
    parameters:
      weight: 0.2
      density: 0.75
merge_method: dare_ties
base_model: ChaoticNeutrals/Hathor_Tahsin-L3-8B-v0.9
parameters:
  int8_mask: true
dtype: bfloat16
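
To reproduce the merge, the configuration above can be passed to mergekit. The snippet below is a sketch based on the Python API shown in the mergekit README (MergeConfiguration, MergeOptions, run_merge); the file name mythorica.yaml and the output directory are placeholders, and the equivalent mergekit-yaml command-line tool can be used instead.

import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML merge configuration shown above (saved as mythorica.yaml).
with open("mythorica.yaml", "r", encoding="utf-8") as f:
    config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the DARE TIES merge and write the result to the output directory.
run_merge(
    config,
    out_path="./Mythorica-L3-8B",
    options=MergeOptions(
        cuda=True,            # merge on GPU if available
        copy_tokenizer=True,  # copy the base model's tokenizer into the output
    ),
)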

Credits

Thanks to ChaoticNeutrals, Arkana08, and Sao10K, the creators of the models used in this merge.
