
Good lord

#1
by Razrien - opened

I don't know what dark magic or witchcraft you used to make this thing, but it's quickly becoming a favorite. I'm still throwing cards/characters at it to see what it can do, but so far it's been consistently knocking it out of the park when it comes to staying in character, setting a scene, and anything else I've been asking it to do.
Very well done!


Owner

Thank you :). The next iteration, V3.0, is in testing; so far, so good.

I like it, but is there a way to make it do 8k context? I'm running a 3090 24GB card, and attempting to push it up to 8k crashes the loading with errors.

Owner

Maybe by using Ollama (GGUF + a custom Modelfile); I haven't tried it yet.
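
As a rough sketch (untested, and the GGUF filename below is just a placeholder for whichever quant you downloaded), a Modelfile that raises the context window might look like this:

```
# Minimal Ollama Modelfile sketch for an 8k context window (untested).
# Replace the path with your actual downloaded GGUF file.
FROM ./model.Q5_K_M.gguf

# Ask Ollama to allocate an 8192-token context (KV cache sized accordingly).
PARAMETER num_ctx 8192
```

Then build and run it with `ollama create mymodel -f Modelfile` followed by `ollama run mymodel`. Whether 8k fits in 24GB will still depend on the quant size and KV cache overhead.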
