Model max length

#1
by Ejafa - opened

Is the model_max_length parameter in tokenizer_config.json set correctly for this model?
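For reference, one way to check what the tokenizer actually reports is to load it and print model_max_length. This is only a minimal sketch; the checkpoint name below is an illustrative gpt_neox repository, not necessarily the one this discussion belongs to.

```python
from transformers import AutoTokenizer

# Hypothetical checkpoint used for illustration; substitute the repository
# whose tokenizer_config.json you want to inspect.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# model_max_length is read from tokenizer_config.json; an extremely large
# sentinel value (on the order of 1e30) usually means the field was never
# set rather than deliberately chosen.
print(tokenizer.model_max_length)
```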

Ejafa changed discussion status to closed