---
language:
- ru
tags:
- PyTorch
- Transformers
thumbnail: https://github.com/sberbank-ai/ru-gpts
---

# rugpt2large
The model was trained with a sequence length of 1024 using the Transformers library, by the SberDevices team, on 170 GB of data across 64 GPUs for 3 weeks.
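A minimal sketch of loading the model for text generation with the Hugging Face Transformers library. The hub id `sberbank-ai/rugpt2large` is an assumption inferred from the repository URL above; adjust it to the actual model id.

```python
# Hedged sketch: generate Russian text with rugpt2large via Transformers.
# NOTE: the model id below is an assumption based on the repo URL, not confirmed by this card.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

MODEL_ID = "sberbank-ai/rugpt2large"  # assumed Hugging Face hub id


def generate(prompt: str, max_length: int = 64) -> str:
    """Load the model and sample a continuation of `prompt`."""
    tokenizer = GPT2Tokenizer.from_pretrained(MODEL_ID)
    model = GPT2LMHeadModel.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Sampling keeps generations varied; max_length stays well under the
    # model's trained sequence length of 1024.
    outputs = model.generate(
        **inputs, max_length=max_length, do_sample=True, top_k=50
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Александр Сергеевич Пушкин родился в "))
```

Because the model was trained with sequence length 1024, prompts plus generated tokens should stay within that budget.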