---
license: apache-2.0
language:
- en
datasets:
- c4
---

# mpt-125m-c4
## Model Description
An MPT-125M model pretrained on the C4 dataset.
## Training data
Trained on the English split of the Hugging Face C4 dataset.
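For reference, here is a minimal sketch of streaming the same data with the `datasets` library; the `allenai/c4` repo id and `en` config name are assumptions about the exact Hub names, not something this card specifies.

```python
# Minimal sketch: stream the English split of C4 from the Hugging Face Hub.
# The "allenai/c4" repo id and "en" config name are assumptions here.
from datasets import load_dataset

c4 = load_dataset("allenai/c4", "en", split="train", streaming=True)
sample = next(iter(c4))
print(sample["text"][:200])  # each record carries "text", "timestamp", and "url" fields
```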
## Training procedure
This model was trained on C4 for ~2.5B tokens. Training took ~6 hours on a single A100-80GB GPU.
## Intended Use and Limitations
This model is primarily intended for generating text from a prompt. Its purpose is to explore pretraining small models for research.
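As a usage sketch, the snippet below loads the checkpoint with `transformers` and generates from a prompt. The repo id is a placeholder (substitute the actual Hub repo id), and MPT architectures on the Hub typically require `trust_remote_code=True`.

```python
# Minimal generation sketch with transformers; the repo id is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mosaicml/mpt-125m-c4"  # placeholder: replace with the actual Hub repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("The C4 dataset is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```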