# GPT-2 (125M), 4k context

The smallest GPT-2 model, fine-tuned on The Pile with a context length of 4k tokens. Weights are included, and the model follows Karpathy's nanoGPT implementation.
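
Since the model follows nanoGPT, sampling from the included weights should work with nanoGPT's standard checkpoint-loading pattern. Below is a minimal sketch, assuming the weights are saved as a nanoGPT-style checkpoint (the filename `ckpt.pt` is an assumption, not confirmed by this README) and that nanoGPT's `model.py` is importable:

```python
# Minimal sampling sketch, assuming a nanoGPT-style checkpoint.
# "ckpt.pt" is an assumed filename; adjust to the actual weights file.
import torch
import tiktoken
from model import GPTConfig, GPT  # from Karpathy's nanoGPT repo

device = "cuda" if torch.cuda.is_available() else "cpu"

# nanoGPT checkpoints store the config under 'model_args'
# and the weights under 'model'.
checkpoint = torch.load("ckpt.pt", map_location=device)
gptconf = GPTConfig(**checkpoint["model_args"])
model = GPT(gptconf)

# Strip the torch.compile prefix, if present, before loading the weights.
state_dict = {k.removeprefix("_orig_mod."): v
              for k, v in checkpoint["model"].items()}
model.load_state_dict(state_dict)
model.eval().to(device)

# Encode a prompt with the GPT-2 BPE tokenizer and sample a continuation.
enc = tiktoken.get_encoding("gpt2")
idx = torch.tensor(enc.encode("The Pile is"),
                   dtype=torch.long, device=device)[None, ...]
with torch.no_grad():
    out = model.generate(idx, max_new_tokens=100, temperature=0.8, top_k=200)
print(enc.decode(out[0].tolist()))
```

With a 4k context, prompts up to roughly 4096 tokens can be fed in before generation, versus 1024 for the stock GPT-2 configuration.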