future research

#2
by NafishZaldinanda - opened

Will a Q4_K_M quantization be available in the future? Can it be run via llama-cpp-python, and will an mmproj file be provided for use with a chat handler in llama-cpp-python?
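
For context, this is roughly the usage I have in mind — a minimal sketch assuming hypothetical file names and a LLaVA-style chat handler in llama-cpp-python:

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# Assumed file names for illustration only: a Q4_K_M GGUF plus its mmproj file.
chat_handler = Llava15ChatHandler(clip_model_path="mmproj-model-f16.gguf")
llm = Llama(
    model_path="model-Q4_K_M.gguf",
    chat_handler=chat_handler,
    n_ctx=2048,
)

# Multimodal chat completion: an image (local file URL) plus a text prompt.
response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "file:///path/to/image.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```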
