# GPT-2 (125M) 4k tokens
GPT-2 small (125M), fine-tuned on The Pile with a context length of 4k tokens.
Weights are included, and the model follows Karpathy's nanoGPT implementation.
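
Since the model follows nanoGPT, the weights could presumably be loaded with nanoGPT's `GPT`/`GPTConfig` classes from its `model.py`. The sketch below is a minimal, hedged example: the checkpoint filename (`ckpt.pt`), the vocab size, and the exact config values are assumptions based on GPT-2 small dimensions and the 4k context described above, not confirmed by this repo.

```python
# Minimal loading sketch, assuming nanoGPT's model.py is importable and the
# checkpoint uses nanoGPT's usual training-checkpoint format.
import torch
from model import GPT, GPTConfig  # nanoGPT classes

# Assumed config: GPT-2 small dimensions with the 4k context length.
config = GPTConfig(
    block_size=4096,   # 4k-token context (assumed from the description)
    vocab_size=50257,  # standard GPT-2 vocabulary (assumption)
    n_layer=12,
    n_head=12,
    n_embd=768,
)
model = GPT(config)

# nanoGPT checkpoints store weights under the 'model' key; keys may carry a
# '_orig_mod.' prefix if the model was trained with torch.compile.
checkpoint = torch.load("ckpt.pt", map_location="cpu")  # filename is an assumption
state_dict = {k.removeprefix("_orig_mod."): v for k, v in checkpoint["model"].items()}
model.load_state_dict(state_dict)
model.eval()
```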