---
license: apache-2.0
language:
- en
datasets:
- c4
---

# mpt-125m-c4

## Model Description

An MPT-125M model pretrained on the C4 dataset.

## Training data

Trained on the Hugging Face C4 dataset.
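
As a minimal sketch, the corpus can be streamed with the `datasets` library; the `allenai/c4` repo id and `en` configuration are assumptions, since the card does not state which C4 configuration was used:

```python
# A minimal sketch of streaming C4 with the `datasets` library.
# The `allenai/c4` repo id and `en` config are assumptions; the card
# does not specify which configuration was used for training.
from datasets import load_dataset

c4 = load_dataset("allenai/c4", "en", split="train", streaming=True)
for example in c4.take(3):
    print(example["text"][:80])  # preview the first few documents
```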

## Training procedure

This model was trained on C4 for ~2.5B tokens. Training took ~1 hour on 104 A100-40GB GPUs.
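
For a rough sense of scale, these figures imply a per-GPU throughput on the order of a few thousand tokens per second; the calculation below is a back-of-the-envelope estimate from the stated numbers, not a measured value:

```python
# Back-of-the-envelope throughput implied by the stated figures
# (~2.5B tokens, ~1 hour, 104 GPUs); an estimate, not a measurement.
tokens = 2.5e9
seconds = 3600
gpus = 104
print(f"~{tokens / seconds / gpus:,.0f} tokens/s per GPU")  # ~6,677
```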

## Intended Use and Limitations

This model is primarily intended for generating text from a prompt. Its purpose is to explore model pretraining for research.
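
As a minimal usage sketch with the `transformers` library: the repo id is inferred from the leaderboard link below, and `trust_remote_code=True` is assumed since MPT architectures typically ship custom modeling code.

```python
# A minimal sketch of prompting the model. The repo id
# "wtang06/mpt-125m-c4" is inferred from the leaderboard link below;
# trust_remote_code=True is assumed for the MPT architecture.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "wtang06/mpt-125m-c4"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

inputs = tokenizer("The C4 dataset is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```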
## [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_wtang06__mpt-125m-c4).

| Metric              | Value |
|---------------------|------:|
| Avg.                | 17.2  |
| ARC (25-shot)       | 22.7  |
| HellaSwag (10-shot) | 25.04 |
| MMLU (5-shot)       | 23.12 |
| TruthfulQA (0-shot) | 0.0   |
| Winogrande (5-shot) | 49.57 |
| GSM8K (5-shot)      | 0.0   |
| DROP (3-shot)       | 0.0   |