galactica-6.7b-finetuned / tokenizer_config.json
{
  "model_max_length": 1000000000000000019884624838656,
  "name_or_path": "/local/home/sanagnos/tmp/checkpoint-2500",
  "special_tokens_map_file": "/content/tokenizer/special_tokens_map.json",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
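
A minimal sketch of how this config is consumed when the tokenizer is loaded from the Hub. The repo id below is an assumption inferred from the page title; substitute the actual path if it differs. Note that the huge "model_max_length" value is the transformers sentinel VERY_LARGE_INTEGER (int(1e30) == 1000000000000000019884624838656), written when no real maximum length was set at save time.

    from transformers import AutoTokenizer

    # Repo id is an assumption based on the page title "galactica-6.7b-finetuned"
    # and the uploader "sanagnos"; adjust to the real Hub path if needed.
    tokenizer = AutoTokenizer.from_pretrained("sanagnos/galactica-6.7b-finetuned")

    # Sentinel meaning "no maximum length recorded", not a usable limit;
    # truncation should be configured explicitly for this tokenizer.
    print(tokenizer.model_max_length)

    # Matches the "tokenizer_class" field above.
    print(type(tokenizer).__name__)  # PreTrainedTokenizerFast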