Simple library checkpoint, now optimized enough to train an MLP with ~11,100 parameters in 16 s on a six-core 4 GHz AMD Ryzen.
Trained for two epochs on the entire Tiny Shakespeare dataset. The model operates at the character level, with a context length of 128 characters and a probability distribution over 255 characters as its output.
The text the model 'predicts' doesn't make sense yet; for more information, visit the GitHub repo for this library and view the main.cpp file.
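To make the setup above concrete, here is a minimal sketch of how such a character-level training example could be assembled: a 128-character context window sliced from the text, and the following character one-hot encoded over a 255-symbol vocabulary. The names (one_hot, make_example) and the encoding scheme are assumptions for illustration, not this library's actual API; see main.cpp in the repo for the real implementation.

```cpp
#include <cstddef>
#include <string>
#include <utility>
#include <vector>

// Context length and output vocabulary size, taken from the description above.
constexpr std::size_t kContext = 128;
constexpr std::size_t kVocab   = 255;

// One-hot encode a single character as a kVocab-dimensional vector
// (hypothetical helper; the library may encode characters differently).
std::vector<float> one_hot(unsigned char c) {
    std::vector<float> v(kVocab, 0.0f);
    if (c < kVocab) v[c] = 1.0f;
    return v;
}

// Build one (context, target) training pair starting at position pos:
// the 128 characters at pos form the input, the character after them
// is the prediction target.
std::pair<std::string, unsigned char>
make_example(const std::string& text, std::size_t pos) {
    return { text.substr(pos, kContext),
             static_cast<unsigned char>(text[pos + kContext]) };
}
```

During training, such pairs would be drawn from consecutive (or random) positions in the Tiny Shakespeare text, and the MLP's 255-way output distribution would be compared against the one-hot target.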
PhantasiaAI is live now; check it out on our webpage.