peteparker456/my-new-tokenizer
Tags: Transformers · Inference Endpoints · arxiv:1910.09700
Branch: main
1 contributor · History: 2 commits
Latest commit: 1cfe633 (verified) by peteparker456, "Upload tokenizer", 20 days ago
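"Upload tokenizer" is the default commit message `transformers` uses when a tokenizer is pushed to the Hub, so the repository was most likely populated with `push_to_hub`. A minimal sketch of that workflow, assuming a tokenizer that was already trained and saved locally (the local path `./my-new-tokenizer` is hypothetical, used only for illustration):

```python
from transformers import AutoTokenizer

# Assumption: the tokenizer was trained and saved locally beforehand;
# "./my-new-tokenizer" is a hypothetical local path for illustration.
tokenizer = AutoTokenizer.from_pretrained("./my-new-tokenizer")

# Requires an authenticated session (`huggingface-cli login`) or a token=... argument.
# This creates or updates the Hub repo and commits the tokenizer files
# (tokenizer.json, vocab.json, merges.txt, ...) with the default
# "Upload tokenizer" commit message.
tokenizer.push_to_hub("my-new-tokenizer")
```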
File                      Size       Last commit       Last updated
.gitattributes            1.52 kB    initial commit    20 days ago
README.md                 5.17 kB    Upload tokenizer  20 days ago
merges.txt                754 Bytes  Upload tokenizer  20 days ago
special_tokens_map.json   99 Bytes   Upload tokenizer  20 days ago
tokenizer.json            13.9 kB    Upload tokenizer  20 days ago
tokenizer_config.json     471 Bytes  Upload tokenizer  20 days ago
vocab.json                3.58 kB    Upload tokenizer  20 days ago
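The combination of vocab.json and merges.txt is the classic byte-pair-encoding layout (GPT-2 style), with tokenizer.json holding the fast-tokenizer serialization of the same vocabulary and merge rules, so the repo can be loaded directly with `AutoTokenizer`. A minimal usage sketch, assuming the repository is public:

```python
from transformers import AutoTokenizer

# Load the tokenizer files listed above directly from the Hub repo.
tokenizer = AutoTokenizer.from_pretrained("peteparker456/my-new-tokenizer")

# Quick round trip: text -> token ids -> text.
ids = tokenizer("Hello, tokenizer!")["input_ids"]
print(ids)
print(tokenizer.decode(ids))
```

special_tokens_map.json and tokenizer_config.json are picked up automatically by `from_pretrained`, so no extra configuration is needed beyond the repo id.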