t5_l12_large_dataset / tokenizer_config.json
{
  "max_len": 256,
  "name_or_path": "SharpAI/mal-tls-bert-base",
  "special_tokens_map_file": "/root/.cache/huggingface/transformers/85a76eea59fe40ae80bc50b05c4fe93e7547727086c4a19787726e35a451f9fd.45ed21ffc69cb3eceab51050529cfc4e1b82b5f17027779bf75c6eacc17a5079",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
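For context, this config tells the `transformers` library which tokenizer class to instantiate (`tokenizer_class`) and where the weights originally came from (`name_or_path`); in practice one would call `AutoTokenizer.from_pretrained(...)`, which fetches and parses this file. A minimal, self-contained sketch of reading the same fields with Python's standard `json` module (the JSON string is inlined from this file; the `from_pretrained` call is shown only as a comment since it needs network access):

```python
import json

# tokenizer_config.json contents from this repo, inlined for a
# self-contained example (cache path omitted for brevity).
CONFIG_TEXT = (
    '{"max_len": 256, '
    '"name_or_path": "SharpAI/mal-tls-bert-base", '
    '"tokenizer_class": "PreTrainedTokenizerFast"}'
)

config = json.loads(CONFIG_TEXT)

# transformers reads "tokenizer_class" from this file to decide which
# class to instantiate, roughly equivalent to:
#   AutoTokenizer.from_pretrained(config["name_or_path"])
print(config["tokenizer_class"])  # PreTrainedTokenizerFast
print(config["max_len"])          # 256
```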