Is the model capable of taking input text of about 8192 tokens and producing an output of 2048 tokens?
No, it is not.
What's the token limit for the input context and for the output summary?
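One way to check this yourself, assuming the model is loadable with the `transformers` library (the model ID below is a placeholder, not the actual model discussed here), is to inspect the tokenizer and config, which is a minimal sketch rather than an authoritative answer:

```python
from transformers import AutoConfig, AutoTokenizer

# Placeholder model ID; replace with the model you are asking about.
model_id = "your-org/your-summarization-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
config = AutoConfig.from_pretrained(model_id)

# Maximum input length the tokenizer reports (the context window it will truncate to).
print("tokenizer.model_max_length:", tokenizer.model_max_length)

# Position-embedding limit stored in the model config, if the architecture defines one.
print("max_position_embeddings:", getattr(config, "max_position_embeddings", None))

# Default maximum generation length used by .generate(), if set in the config.
print("max_length (generation default):", getattr(config, "max_length", None))
```

Inputs longer than `model_max_length` are typically truncated, and the output length can usually be raised per call via `max_new_tokens` in `generate()`, subject to the model's position limits.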