Short context? Why?

#1
by d00mus - opened

Both parent models have longer context, up to 128k, but this one is only 32k, which is really disappointing. Is it possible to fix that?


Just re-checked: Qwen says that both QwQ and Coder-Instruct have 128k context. But you mentioned just Coder (no Instruct?). Actually, the non-Instruct version also has 128k, so it really should not be a problem.

@d00mus The problem has been solved. Thank you for your feedback. The context has been changed to 128K!
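For anyone who wants to verify the advertised context length themselves, it is usually declared in the model's `config.json` under `max_position_embeddings` (a common convention for Qwen-family architectures; other architectures may use a different key, so treat this as a sketch rather than a universal rule):

```python
import json

# Hypothetical excerpt of a repo's config.json; the actual file may
# contain many more fields, and some architectures use other key names.
config_text = '{"model_type": "qwen2", "max_position_embeddings": 131072}'

config = json.loads(config_text)
ctx = config.get("max_position_embeddings")

print(ctx)          # 131072 tokens
print(ctx // 1024)  # 128 -> i.e. "128K" context
```

A value of 131072 (128 × 1024) corresponds to the 128K context mentioned above, while 32768 would indicate the original 32k limit.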

@mradermacher Now that the context-length problem has been solved, could you please provide the quantized version of the model again? Thank you so much for your help!
