A checkpoint from a simple library, optimized enough to train an MLP with ~11,100 parameters in 16 s on a six-core 4 GHz AMD Ryzen.
The model was trained for two epochs on the entire Tiny-Shakespeare dataset. It operates at the character level, with a context length of 128 characters, and outputs a probability distribution over 255 characters. The text the model 'predicts' doesn't make sense; for more information, visit the GitHub repo for this library and view the main.cpp file.

PhantasiaAI is live now, check it out on our webpage.


Dataset used to train XeTute/Shakespeare-TextGen-MLP: Tiny-Shakespeare