sample
Can you also upload a proper random sample (drawn from across all files)? Something small, just for analysis.
Hey @KnutJaegersberg ! Once this dataset is done uploading, we plan to upload a 1T-token random sample partitioned by Free Decimal Correspondence level 2. We'd be happy to upload a 100B-token (or smaller) random sample if that would be useful.
Hi @Research-EAI , it would be super beneficial to have samples at 1B, 10B, 100B, and 1T tokens, perhaps as separate datasets alongside this one.
It seems the size of this dataset causes issues with the dataset viewer, etc. :)
I'd like smaller samples as well. 10B is a handy size to peek into.
@KnutJaegersberg you may be interested in this: https://huggingface.co/collections/sumuks/essentialweb-v10-samples-6865f5be22dc5833f55d762d
@sumuks thanks for replying. I was asking because I'm looking to sample various pretraining datasets at various sizes to compare them (e.g., FineWeb, DCLM), and I was wondering about the best approach that doesn't require processing the full dataset but still preserves the distribution of instance lengths.
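For reference, a minimal sketch of one way to do this: a uniform reservoir sample over a streamed dataset preserves the instance-length distribution in expectation, without downloading the full dataset (though it still scans each record once). The dataset name and sample size below are placeholders.

```python
# Minimal sketch: uniform reservoir sample over a streamed dataset.
# A uniform sample preserves the instance-length distribution in
# expectation; streaming avoids downloading the full dataset, though
# every record is still read once. Dataset name and k are placeholders.
import random

from datasets import load_dataset


def reservoir_sample(dataset_name: str, k: int = 10_000, seed: int = 0):
    rng = random.Random(seed)
    stream = load_dataset(dataset_name, split="train", streaming=True)
    reservoir = []
    for i, example in enumerate(stream):
        if i < k:
            reservoir.append(example)
        else:
            # Keep each new record with probability k / (i + 1).
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = example
    return reservoir


sample = reservoir_sample("HuggingFaceFW/fineweb", k=1_000)
```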
Unfortunately, I didn't take instance length into account, only temporality: I sampled parquet files at random within each snapshot.
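A minimal sketch of that approach, assuming the snapshots are top-level directories of parquet shards in the dataset repo (the repo id and files-per-snapshot count are placeholders):

```python
# Minimal sketch: pick a few parquet shards at random within each snapshot.
# Assumes snapshots are top-level directories in the dataset repo; the
# repo id and files_per_snapshot are placeholders.
import random
from collections import defaultdict

from huggingface_hub import list_repo_files


def sample_parquet_shards(repo_id: str, files_per_snapshot: int = 2, seed: int = 0):
    rng = random.Random(seed)
    shards = [
        path
        for path in list_repo_files(repo_id, repo_type="dataset")
        if path.endswith(".parquet")
    ]
    by_snapshot = defaultdict(list)
    for path in shards:
        by_snapshot[path.split("/")[0]].append(path)
    picked = []
    for snapshot_shards in by_snapshot.values():
        picked += rng.sample(snapshot_shards, min(files_per_snapshot, len(snapshot_shards)))
    return picked


files = sample_parquet_shards("org/dataset-name")
```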
@sumuks I have released the datasets I sampled, and added yours as well, to a collection here: https://huggingface.co/collections/codelion/pre-training-dataset-samples-686bd760abf1a43b0ce32829