tau/bart-large-sled-govreport / tokenizer_config.json
{
  "tokenizer_class": "SledTokenizer",
  "base_tokenizer": "facebook/bart-large",
  "model_max_length": 16384
}
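
This config tells the SLED code to wrap the standard facebook/bart-large tokenizer (so the BART vocabulary is reused unchanged) while raising the usable input length to 16,384 tokens for long-document inputs. A minimal loading sketch follows, assuming the SLED package is installed (pip install py-sled) so that importing sled registers SledTokenizer with the transformers Auto classes; the repo id tau/bart-large-sled-govreport matches this file's location:

# Minimal sketch: load the tokenizer described by this tokenizer_config.json.
# Assumes `pip install py-sled`; the import below registers the SLED
# tokenizer/model classes with the transformers Auto classes.
import sled  # noqa: F401 -- side-effect import, registers SledTokenizer

from transformers import AutoTokenizer

# Resolves "tokenizer_class": "SledTokenizer", which wraps the
# "base_tokenizer" facebook/bart-large under the hood.
tokenizer = AutoTokenizer.from_pretrained("tau/bart-large-sled-govreport")

print(tokenizer.model_max_length)  # expected: 16384, per this config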