Dataset: ytzi/the-stack-dedup-gpt2
Modalities: Tabular, Text
Formats: parquet
Size: 100K - 1M rows
Libraries: Datasets, Dask, Croissant (+ 1)
Branch: main
Path: the-stack-dedup-gpt2 / lua
Contributors: 1 · History: 1 commit
Latest commit: 253dd57 (verified) by ytzi, "pre-tokenized with gpt2-medium", about 1 year ago
All 19 parquet shards are tracked with Git LFS and were added in the same commit ("pre-tokenized with gpt2-medium", about 1 year ago); together they total roughly 2.8 GB.

| File | Size |
|------|------|
| train-00000-of-00019.parquet | 142 MB |
| train-00001-of-00019.parquet | 142 MB |
| train-00002-of-00019.parquet | 151 MB |
| train-00003-of-00019.parquet | 146 MB |
| train-00004-of-00019.parquet | 141 MB |
| train-00005-of-00019.parquet | 145 MB |
| train-00006-of-00019.parquet | 141 MB |
| train-00007-of-00019.parquet | 147 MB |
| train-00008-of-00019.parquet | 153 MB |
| train-00009-of-00019.parquet | 148 MB |
| train-00010-of-00019.parquet | 144 MB |
| train-00011-of-00019.parquet | 150 MB |
| train-00012-of-00019.parquet | 143 MB |
| train-00013-of-00019.parquet | 147 MB |
| train-00014-of-00019.parquet | 141 MB |
| train-00015-of-00019.parquet | 141 MB |
| train-00016-of-00019.parquet | 147 MB |
| train-00017-of-00019.parquet | 148 MB |
| train-00018-of-00019.parquet | 142 MB |
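Since the shards are plain parquet files hosted under this repo, they can be loaded with the Datasets library listed above. A minimal sketch, assuming the Lua shards live under the `lua` directory shown in this listing (the column names inside the parquet files are not shown here, so the schema check below is an assumption, not documented behavior):

```python
from datasets import load_dataset

# Load only the Lua shards of ytzi/the-stack-dedup-gpt2.
# data_dir="lua" matches the directory in this file listing.
ds = load_dataset(
    "ytzi/the-stack-dedup-gpt2",
    data_dir="lua",
    split="train",
    streaming=True,  # avoids downloading all ~2.8 GB of parquet up front
)

# Inspect the first record to see the actual column layout.
first = next(iter(ds))
print(first.keys())
```

Alternatively, a single shard could be fetched with `huggingface_hub.hf_hub_download` (repo_type="dataset") and read with pandas or Dask, which matches the other libraries listed for this dataset.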