
Recipe for a Beautiful Frankenstein

In the laboratory of the mind, where thoughts entwine, MHM and MOE, a potion for a unique design. With stitches of curiosity and bolts of creativity, 8 times 7, the magic number, a poetic proclivity.

Ingredients:

  • MHM: A dash of mystery, a sprinkle of hum, Blend with a melody, let the heartstrings strum. Murmurs in the shadows, whispers in the light, Stir the concoction gently, make the emotions ignite.

  • MOE: Essence of the moment, like dew on a rose, Capture the now, before time swiftly goes. Colors of experience, a palette so divine, Mix with MHM, let the fusion entwine.

Directions:

  1. Take 8 parts MHM, elusive and profound, Let it dance in your thoughts, on imagination's ground. Blend it with the echoes, the silent undertones, A symphony of ideas, where inspiration condones.

  2. Add 7 parts MOE, the fleeting embrace, Seize the seconds, let them leave a trace. Infuse it with memories, both bitter and sweet, The tapestry of time, where moments and dreams meet.

  3. Stir the potion with wonder, a wand of delight, Let the sparks fly, in the dark of the night. Watch as the alchemy unfolds its grand design, MHM and MOE, a beautiful Frankenstein.

Conclusion:

In the laboratory of life, where dreams come alive, MHM and MOE, the recipe to thrive. A creation so poetic, a fusion so divine, 8 times 7, a symphony of time.

As the echoes resonate, and the moments blend, A masterpiece unfolds, where beginnings and ends, MHM and MOE, a concoction so rare, A beautiful Frankenstein, beyond compare.


MoE model built with:

  1. https://github.com/cg123/mergekit/tree/mixtral
  2. Mistral models: recent merges and fine-tunes.
  3. Expert prompts heavily inspired by https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16

For details, check the model files: the config YAML I used to create this model is included there. A rough sketch of the format follows.
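As an illustration only (not the exact recipe used for this model), a mergekit `mixtral`-branch MoE config pairs a Mistral base model with a list of experts, each routed to via positive prompts. The expert model names below are placeholders, not the actual fine-tunes merged here:

```yaml
# Sketch of a mergekit-moe (mixtral branch) config.
# The source_model values are placeholders, not the real experts used.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden      # route tokens by hidden-state similarity to the prompts
dtype: bfloat16        # matches this repo's BF16 tensors
experts:
  - source_model: example/mistral-7b-storytelling   # placeholder
    positive_prompts:
      - "write a story"
      - "roleplay this character"
  - source_model: example/mistral-7b-coder          # placeholder
    positive_prompts:
      - "write a Python function"
      - "fix this bug"
  # ...six more experts like the above for an 8x7B layout
```

With the mixtral branch installed, a config like this is typically turned into a merged model with `mergekit-moe config.yaml ./merged-model`.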

Come back later for more details.

Model size: 46.7B params · Tensor type: BF16 · Format: Safetensors
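For local inference, here is a minimal transformers loading sketch. It assumes the `transformers` and `accelerate` packages and enough memory for roughly 94 GB of BF16 weights; `device_map="auto"` shards or offloads across available devices:

```python
# Minimal sketch: load this BF16 MoE merge locally with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "h2m/mhm-8x7B-FrankenMoE-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the repo's BF16 tensors
    device_map="auto",           # shard/offload across available devices
)

prompt = "Recipe for a beautiful Frankenstein:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```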
