Requested tokens (13223) exceed context window of 4096
#10 opened by raghusri
I'm receiving an error along these lines when trying to query a local DB with metadata that I have added.
File "/Users/raghiramontisrinivasan/Library/Python/3.9/lib/python/site-packages/llama_cpp/llama.py", line 1474, in create_completion
completion: Completion = next(completion_or_chunks) # type: ignore
File "/Users/raghiramontisrinivasan/Library/Python/3.9/lib/python/site-packages/llama_cpp/llama.py", line 953, in _create_completion
raise ValueError(
ValueError: Requested tokens (13223) exceed context window of 4096
Could you please help me understand this error?
This model can't fit your use case: your DB schema is too big. The context window of this model is 4096 tokens, and your prompt (the schema metadata plus your question) comes to 13223 tokens, so it simply doesn't fit. You need either a model with a larger context window or a smaller prompt, for example by including only the tables relevant to the question rather than the whole schema.
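If you do switch to a model that supports a longer context, note that llama-cpp-python only allocates the window you request at load time via `n_ctx`. Below is a minimal sketch of loading with a larger window and checking the prompt size up front before completing; the model path, `n_ctx` value, and `max_tokens` here are placeholders you'd adjust for your own setup.

```python
from llama_cpp import Llama

# Request a larger context window at load time; the model itself must
# actually support this size (the path and n_ctx are placeholders).
llm = Llama(model_path="./model.gguf", n_ctx=16384)

prompt = "..."  # your schema metadata plus the question

# Prompt tokens plus max_tokens must fit inside n_ctx, otherwise
# create_completion raises the ValueError you are seeing.
max_tokens = 256
n_prompt = len(llm.tokenize(prompt.encode("utf-8")))
if n_prompt + max_tokens > llm.n_ctx():
    raise ValueError(
        f"Prompt of {n_prompt} tokens leaves no room for {max_tokens} "
        f"completion tokens in a {llm.n_ctx()}-token context."
    )

out = llm.create_completion(prompt, max_tokens=max_tokens)
```

If the model you're using is natively limited to 4096 tokens, raising `n_ctx` beyond that won't help; in that case, trimming the schema down to the relevant tables is the practical fix.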