Spaces: Runtime error
Issue - completely and totally broken.
#1 by TheSystemGuy - opened
I never thought I'd post what amounts to a GitHub issue here, but here we are. The whole Space is broken: any prompt exceeds the model's context limit and crashes the app. The error looks like this:
Error in generating model output: litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - Error code: 400 - {'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 236348 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
I understand this was put together as fast as possible, but right now it's unusable.