---
library_name: transformers
license: apache-2.0
base_model: ibm-granite/granite-3.0-1b-a400m-base
tags:
- axolotl
- moe
- roleplay
model-index:
- name: MoE_Girl_400MA_1BT
  results: []
---

# MoE Girl 400mA 1bT

![R8_sd3.5L_00001_.webp](https://cdn-uploads.huggingface.co/production/uploads/634262af8d8089ebaefd410e/GEbRJhyc087cP6Cs_AR0X.webp)

A finetune of Granite 3.0 by IBM, designed for roleplaying (and maybe general use cases, if you try hard enough).

## Disclaimer

PLEASE do not expect godliness out of this; it's a model with _400 million_ active parameters. Expect something more akin to GPT-2.

## Quants

TODO!

## Prompting

Use ChatML (a minimal inference sketch is included at the end of this card):

```
<|im_start|>system
You are a helpful assistant who talks like a pirate.<|im_end|>
<|im_start|>user
Hello there!<|im_end|>
<|im_start|>assistant
Yarr harr harr, me matey!<|im_end|>
```

## Thanks

Special thanks to the members of Allura for testing and emotional support, as well as the creators of all the datasets that were used in the Special Sauce used to train this model. I love you all <3

- Fizz
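
## Inference Sketch

As referenced in the Prompting section above, here is a minimal, non-authoritative sketch of chatting with this model via `transformers`. It assumes the tokenizer ships with a ChatML chat template; the repo id below is a placeholder, not the real upload path.

```python
# Minimal inference sketch. Assumptions: the tokenizer carries a ChatML
# chat template, and "your-username/MoE_Girl_400MA_1BT" is a placeholder
# for wherever the model actually lives on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/MoE_Girl_400MA_1BT"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [
    {"role": "system", "content": "You are a helpful assistant who talks like a pirate."},
    {"role": "user", "content": "Hello there!"},
]

# apply_chat_template renders the messages into the ChatML format shown in
# the Prompting section and appends the assistant header so the model
# continues from there.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.8)

# Strip the prompt tokens and print only the newly generated reply.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```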