Self-trained GPT-2 Large, with roughly 770M parameters.

The tokenizer is the one from https://huggingface.co/openai-community/gpt2.

The model is being trained on roughly 400B tokens; this checkpoint is from step 115k.

Evaluation is currently in progress.
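As a rough usage sketch, the checkpoint can be loaded with the Hugging Face transformers library. The repo id below is taken from the collection name on this page and is an assumption about the actual hosting path; the tokenizer id comes from the link above.

```python
# Hypothetical usage sketch: load this checkpoint with transformers.
# The repo id is assumed from the collection name; adjust if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "DrNicefellow/GPT-2-Large-115k-steps"

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Greedy generation from the checkpoint (downloads ~3 GB of weights)."""
    # The tokenizer is the standard GPT-2 BPE tokenizer.
    tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("The quick brown fox"))
```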

License

This model is released under both the Apache 2.0 License and the MIT License; the terms of both should be followed.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note saying which one you'd like me to drink.

