gpt3-kor-small_based_on_gpt2 / tokenizer_config.json
{
  "do_lower_case": true,
  "strip_accents": false,
  "model_max_length": 2048,
  "tokenizer_class": "BertTokenizer"
}
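
For reference, a minimal sketch of how this config is picked up when the tokenizer is loaded with the transformers library. The repo id "kykim/gpt3-kor-small_based_on_gpt2" and the sample sentence are assumptions for illustration; adjust them to your setup.

# Minimal sketch: loading the tokenizer so this tokenizer_config.json is applied.
# Assumes the Hugging Face repo id "kykim/gpt3-kor-small_based_on_gpt2".
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("kykim/gpt3-kor-small_based_on_gpt2")

# Values from the config are reflected on the loaded object.
print(type(tokenizer).__name__)    # a BERT-style tokenizer, per "tokenizer_class"
print(tokenizer.model_max_length)  # 2048, per "model_max_length"
print(tokenizer.do_lower_case)     # True, per "do_lower_case"

# Tokenize a short Korean sentence (illustrative input); any Latin characters are
# lowercased, while accents are preserved because "strip_accents" is false.
print(tokenizer.tokenize("안녕하세요 GPT-3"))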