Context Length?

#4
by lazyDataScientist - opened

I'm guessing the context length is 4k tokens, but llama.cpp is suggesting 2k. Just wanted to confirm which it is.

It should be native 4k, yes!
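One way to verify this yourself, rather than relying on llama.cpp's guess, is to read `max_position_embeddings` from the model's `config.json` on the Hub — that field is the standard transformers convention for the native context length. A minimal sketch, using a hypothetical inline copy of the file's contents:

```python
import json

# Hypothetical config.json contents; real models publish this file on the Hub.
config_text = '{"model_type": "llama", "max_position_embeddings": 4096}'

config = json.loads(config_text)

# max_position_embeddings is the native context window in most transformer configs.
context_length = config["max_position_embeddings"]
print(context_length)  # 4096
```

If the config confirms 4k, you can override llama.cpp's suggestion explicitly with its context-size flag (`-c 4096`).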