t5_l12_large_dataset / special_tokens_map.json
{"eos_token": "[end]", "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "mask_token": "[MASK]"}