langbridge_encoder_tokenizer / added_tokens.json
DKYoon · Upload tokenizer · commit 827659f
138 Bytes
{
  "\t": 250104,
  "\n": 250105,
  "\u000b": 250107,
  "\f": 250100,
  "\r": 250103,
  " ": 250106,
  " ": 250102,
  " ": 250101
}
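The mapping above assigns IDs to whitespace strings beyond the base vocabulary; the lowest added ID, 250100, is consistent with an mT5-style SentencePiece vocabulary of 250,100 entries. Such tokens are typically added because SentencePiece normalization would otherwise alter or collapse raw whitespace, which matters for whitespace-sensitive input such as code. Note that the three space-keyed entries render identically here but carry distinct IDs, so in the raw file they are almost certainly space runs of different lengths that consecutive-space collapsing has flattened.

Below is a minimal sketch (not from this repository) of how a mapping like this is reproduced on a base tokenizer. It assumes an mT5 base ("google/mt5-xl" is a stand-in with a 250,100-entry vocabulary) and a local copy of the file at an illustrative path:

import json
from transformers import AutoTokenizer, AddedToken

# Assumption: the base tokenizer has 250100 entries, so newly added
# tokens receive IDs starting at 250100 (matching "\f" above).
tok = AutoTokenizer.from_pretrained("google/mt5-xl")

# Local copy of the file shown above (path is illustrative).
with open("langbridge_encoder_tokenizer/added_tokens.json") as f:
    added_tokens = json.load(f)

# Register the strings in ascending-ID order; added IDs are handed out
# sequentially after the base vocabulary, so order fixes the mapping.
for token, expected_id in sorted(added_tokens.items(), key=lambda kv: kv[1]):
    # normalized=False keeps the raw whitespace from being altered by
    # the tokenizer's normalizer before lookup.
    tok.add_tokens(AddedToken(token, normalized=False))
    assert tok.convert_tokens_to_ids(token) == expected_id

print(tok.convert_tokens_to_ids("\n"))  # 250105

The asserts double as a check on the vocabulary-size assumption: if the base tokenizer's length is not 250100, the first added token lands on a different ID and the loop fails immediately.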