---
language:
- ru
tags:
- PyTorch
- Transformers
thumbnail: "https://github.com/sberbank-ai/ru-gpts"
---
# rugpt2large | |
The model was trained by the [SberDevices](https://sberdevices.ru/) team using the Transformers library, with a sequence length of 1024, on 170 GB of data, for 3 weeks on 64 GPUs.