---
license: apache-2.0
language:
  - en
datasets:
  - c4
---

# mpt-125m-c4

## Model Description

An MPT-125M model pretrained on the C4 dataset.

## Training data

Trained on the [C4 dataset](https://huggingface.co/datasets/c4) hosted on Hugging Face.

## Training procedure

This model was trained on C4 for ~2.5B tokens. Training took ~1 hour on 104 A100-40GB GPUs.
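For rough context, these figures imply an aggregate throughput on the order of 2.5B tokens / 3,600 s ≈ 700k tokens/s, or about 6.7k tokens/s per GPU. This is a back-of-the-envelope estimate derived from the numbers above, not a reported measurement.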

## Intended Use and Limitations

This model is intended primarily for generating text from a prompt. Its purpose is to explore pretraining of models for research.
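
Below is a minimal usage sketch with the Hugging Face `transformers` library. The repository id `mosaicml/mpt-125m-c4` and the use of `trust_remote_code=True` are assumptions based on how other MPT checkpoints are typically loaded; adjust them to match this repository.

```python
# Minimal text-generation sketch for an MPT-style checkpoint.
# NOTE: the repo id below is an assumption, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-125m-c4"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_name)
# MPT models usually ship custom modeling code, hence trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

# Generate a short continuation from a prompt.
inputs = tokenizer("The C4 dataset is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```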