Context length question

#1
by mcDandy - opened

Does this model support the massive 32k context length (I'm a bit rusty on my 2^k for k > 13) advertised on the Qwen2 GitHub page? The config file of the non-GGUF version lists a 4096-token sliding window.
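For reference, a minimal sketch of how to inspect the relevant fields in the non-GGUF config with `transformers`; the repo id "Qwen/Qwen2-7B-Instruct" is only an assumption, substitute the actual model being discussed:

```python
# Sketch: print the context-related fields from the Hugging Face config.
# Repo id below is a placeholder assumption, not necessarily this model.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("Qwen/Qwen2-7B-Instruct")

# max_position_embeddings: full context length the weights support.
# sliding_window / use_sliding_window: whether attention is limited to a
# fixed-size window (the 4096 value mentioned above), if the config sets them.
print("max_position_embeddings:", getattr(config, "max_position_embeddings", None))
print("sliding_window:", getattr(config, "sliding_window", None))
print("use_sliding_window:", getattr(config, "use_sliding_window", None))
```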
