About

ape fiction: the "ape" stands for Algorithmic Pattern Emulation. I finetuned this model to generate fiction.
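Since this was finetuned from a base (non-instruct) model, plain text continuation is the natural way to prompt it. Here is a minimal sketch using the transformers library; the prompt and sampling settings are illustrative, not recommendations from this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "leftyfeep/ape-fiction"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

# Base-model style prompting: give it an opening line and let it continue.
prompt = "The storm had been building over the moor all afternoon, and"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```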

Finetuned from Mistral Nemo Base using the fullfictions-85kmax dataset.

This run uses roughly the largest-context entries in the dataset that I've been able to train on without out-of-memory (OOM) errors.
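To give a sense of that constraint in practice, here is a small sketch of checking tokenized entry lengths before training. Both repository paths and the "text" column name are assumptions for illustration, not taken from this card.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Both paths below are assumptions, not the exact ones used.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-Nemo-Base-2407")
dataset = load_dataset("leftyfeep/fullfictions-85kmax", split="train")

# Token count per entry shows which entries fit a given max_seq_length budget.
lengths = sorted(len(tokenizer(row["text"]).input_ids) for row in dataset)
print("median:", lengths[len(lengths) // 2], "max:", lengths[-1])
```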

I used unsloth to do the finetuning on a rented H100 GPU for 2 epochs. Thanks to everybody who made this possible: the unsloth brothers, the folks behind KoboldCPP, the team behind Mistral Nemo, the organizers and volunteers at Gutenberg.org... and probably more. Thanks, everybody.
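For anyone curious what a run like this can look like, below is a minimal Unsloth LoRA sketch. Only the base model and the 2 epochs come from the description above; the checkpoint name, dataset path, sequence length, LoRA rank, and batch settings are assumptions for illustration.

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Checkpoint and dataset paths are assumptions, not the exact ones used.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Mistral-Nemo-Base-2407",
    max_seq_length=32768,  # illustrative; chosen to fit the GPU without OOM
    load_in_4bit=True,     # QLoRA-style memory saving
)

# Attach LoRA adapters; rank/alpha are common defaults, not the author's settings.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("leftyfeep/fullfictions-85kmax", split="train")  # hypothetical path

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",  # assumes stories live in a "text" column
    max_seq_length=32768,
    args=TrainingArguments(
        per_device_train_batch_size=1,
        gradient_accumulation_steps=4,
        num_train_epochs=2,  # matches the 2 epochs mentioned above
        learning_rate=2e-4,
        bf16=True,
        output_dir="outputs",
    ),
)
trainer.train()
```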

Model size: 12.2B params · Tensor type: BF16 · Format: Safetensors

Model tree for leftyfeep/ape-fiction: 2 quantizations available.