Plans to go beyond 4k context length?

#2
by lazyDataScientist - opened

I love the writing style of this model, but a larger context window is becoming increasingly important. I'm lucky to get 2k tokens in before I run into truncation issues.
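In the meantime, a common workaround is to trim the prompt yourself so the input plus generation budget stays under the model's window. A minimal sketch (the 4k limit and output reserve are assumptions; plug in your real token ids from the tokenizer):

```python
# Sketch: keep only the most recent tokens so input + generation
# fits the model's context window. Values are placeholders.
MAX_CONTEXT = 4096        # assumed model window
RESERVE_FOR_OUTPUT = 512  # room left for generated tokens

def truncate_left(token_ids, max_context=MAX_CONTEXT, reserve=RESERVE_FOR_OUTPUT):
    """Drop the oldest tokens so the prompt fits within the budget."""
    budget = max_context - reserve
    return token_ids[-budget:] if len(token_ids) > budget else token_ids

# Example with dummy token ids:
tokens = list(range(5000))
trimmed = truncate_left(tokens)
print(len(trimmed))  # 3584
```

This keeps the newest context (usually the most relevant for continuation) and silently drops the oldest, which is crude but avoids hard truncation errors.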
