Serving this with llama.cpp

#1
by qiisziilbash - opened

Hi,

Is there an example of using this model, in particular with llama.cpp?

Thanks
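
For reference, a minimal sketch of the usual llama.cpp workflow, assuming this repo ships a GGUF quantization; the repo id and filename below are placeholders, not the actual files in this repository:

```shell
# Download a GGUF file from the Hub (requires the huggingface-cli tool);
# <repo-id> and the filename are placeholders for this repo's actual files.
huggingface-cli download <repo-id> model-Q4_K_M.gguf --local-dir .

# One-off generation with the llama.cpp CLI binary:
./llama-cli -m model-Q4_K_M.gguf -p "Hello, how are you?" -n 128

# Or serve an OpenAI-compatible HTTP API on port 8080 with a 4096-token context:
./llama-server -m model-Q4_K_M.gguf --port 8080 -c 4096
```

With `llama-server` running, clients can hit the `/v1/chat/completions` endpoint the same way they would an OpenAI-style API.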
