Upload README.md with huggingface_hub
README.md
Gemstones Training Dataset - Sequential version
This data is a reprocessed version of the first 1B rows of the Dolma v1.7 dataset (https://huggingface.co/datasets/allenai/dolma).

The data is encoded using the Pythia tokenizer: https://huggingface.co/EleutherAI/pythia-160m
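Since the rows store Pythia token IDs rather than raw text, you may want to decode a sample for inspection. The sketch below assumes the `datasets` and `transformers` libraries, a streamed `train` split, and an `input_ids` column; the repo ID is a placeholder and the column name is an assumption, so check the actual schema first.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Pythia tokenizer referenced above
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-160m")

repo_id = "<this-dataset-repo>"  # placeholder: replace with this dataset's Hugging Face repo ID
ds = load_dataset(repo_id, split="train", streaming=True)  # stream to avoid downloading everything

# Decode one example's token IDs back to text
example = next(iter(ds))
token_ids = example["input_ids"]  # assumed column name; inspect the schema to confirm
print(tokenizer.decode(token_ids)[:500])
```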
**Disclaimer:** this is an approximation of the dataset used to train the Gemstones model suite.
Due to the randomized and sharded nature of the distributed training code, the only way to perfectly
reproduce the training batches across the GPUs is/was to run the training code.