---
license: apache-2.0
base_model:
- unsloth/Mistral-Small-24B-Base-2501
---

# MS-24B-Instruct-Mullein-v0
V0 note from Severian: This instruct variant is tamer and less unhinged than the base version; it loses some ability to characterize NPCs but further improves character/scenario portrayal, a tradeoff of sorts. We couldn't actually decide which to put out, because both are fun and good in their own way.
Let us know what you think; we're looking forward to seeing people test it.
## Big Thanks

- The folks in the trashpanda and ArliAI discords for testing (in no particular order)
- The Allura folks for their [Sugarquill 10k dataset](https://huggingface.co/datasets/allura-org/sugarquill-10k) (which I lightly cleaned for stuff like Unicode quotes)
- fizz for her [floyd-instruct](https://huggingface.co/datasets/estrogen/floyd-instruct), [woke-identity](https://huggingface.co/datasets/estrogen/woke-identity), and [benchmaxxing (lol)](https://huggingface.co/datasets/estrogen/gpqa-benchmaxxing) datasets
- Gryphe for their [Sonnet3.5 RP](https://huggingface.co/datasets/Gryphe/Sonnet3.5-Charcard-Roleplay?not-for-all-audiences=true) and [4o WP](https://huggingface.co/datasets/Gryphe/ChatGPT-4o-Writing-Prompts) datasets, which I heavily filtered for slop
- kalo's [Opus-22k](https://huggingface.co/datasets/anthracite-org/kalo-opus-instruct-22k-no-refusal) dataset, which was usable basically OOTB
- Norquinal for their [OpenCAI](https://huggingface.co/datasets/Norquinal/OpenCAI) dataset
- Dampfinchen for their [Creative Writing Multiturn](https://huggingface.co/datasets/Dampfinchen/Creative_Writing_Multiturn) dataset
- The Recursal folks for their [SCP wiki](https://huggingface.co/datasets/recursal/SCP-RECURSAL) dataset

(We also used some other private datasets of our own.)

## Reviews

> Base is more unhinged but I see more slops. Would be interesting to see if a merge can balance it out in a good way
>
> Instruct gives me more swipes that I like, it's less horny but it can definitely cook during actual smut
>
> I still like instruct more I think, but I appreciate how unhinged base model can be lol

— OMGWTFBBQ

> Hard to send with one hand. What did you feed this model?

— Myscell

> It spoke to my body and soul.

— Raihanbook

> my cock twitched in interest, 10/10 model

— AIELO

> Reroll varies the response by a lot. It's giving Starcannon.

— Sam

> Tried the base version with my card. It's just a narrative card and the model makes the character portray right, it also mentions my persona detail often.

— Azula

## Just us having fun, don't mind it

Big thanks to the folks in the trashpanda-org discord for testing and sending over some logs!

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [unsloth/Mistral-Small-24B-Instruct-2501](https://huggingface.co/unsloth/Mistral-Small-24B-Instruct-2501) as the base.

### Models Merged

The following models were included in the merge:

* [trashpanda-org/MS-24B-Mullein-v0](https://huggingface.co/trashpanda-org/MS-24B-Mullein-v0)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: unsloth/Mistral-Small-24B-Instruct-2501
  - model: trashpanda-org/MS-24B-Mullein-v0
    parameters:
      density: 1
      weight: 1
merge_method: ties
base_model: unsloth/Mistral-Small-24B-Instruct-2501
parameters:
  normalize: true
dtype: bfloat16
```
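As a rough sketch, a config like the one above can be run with [mergekit](https://github.com/arcee-ai/mergekit) (the tool this kind of YAML is written for); the config filename and output directory below are placeholders, not something we ship:

```shell
# Install mergekit, the toolkit that implements TIES and friends
pip install mergekit

# Save the YAML config above as mullein-ties.yml (placeholder name),
# then run the merge. Output weights land in the given directory;
# --cuda performs the merge on GPU if one is available.
mergekit-yaml mullein-ties.yml ./merged-model --cuda
```

Note that merging two 24B models requires substantial disk space and RAM, so expect the run to take a while on consumer hardware.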