ffxiv-ja-ko-translator / trg_tokenizer / tokenizer_config.json
Commit 0066a39 by tikim: "Add data of tokenizers"
{
  "bos_token": "</s>",
  "eos_token": "</s>",
  "mask_token": "<mask>",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "<pad>",
  "special_tokens_map_file": null,
  "tokenizer_class": "GPT2Tokenizer",
  "unk_token": "<unk>"
}
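
A minimal sketch of how this config is consumed, assuming the repository ID is "tikim/ffxiv-ja-ko-translator" (inferred from the username and repo name above, not confirmed by this page) and that the file sits in the "trg_tokenizer" subfolder as the path shows. AutoTokenizer reads "tokenizer_class" ("GPT2Tokenizer") and the special-token entries from this file when building the target-side tokenizer.

from transformers import AutoTokenizer

# Load the target-side tokenizer described by this tokenizer_config.json.
# The repo ID below is an assumption; replace it with the actual Hub ID if it differs.
tokenizer = AutoTokenizer.from_pretrained(
    "tikim/ffxiv-ja-ko-translator",  # assumed repo ID
    subfolder="trg_tokenizer",       # subfolder shown in the path above
)

# The special tokens defined in the config are exposed as attributes.
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)

Note that "model_max_length" is the sentinel value transformers writes when no explicit maximum length was set, so callers should pass their own max_length when encoding.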