GGUF?

#2 opened by AlgorithmicKing

If that's possible.

Yeah, I would like to use it with Ollama. Thanks!

@AlgorithmicKing Working on this, but it's my first time converting anything to GGUF (the gguf-my-repo space doesn't support this model yet, so it has to be done manually).

Nope, never mind: llama.cpp doesn't support this architecture either, so there's no way to do it (yet).

GSAI-ML org

I'm extremely sorry. I'm not very familiar with all of this and I'm eagerly looking forward to more help from the community!

@nieshen There is a request over on the llama.cpp GitHub page to add support for LLaDA:

https://github.com/ggml-org/llama.cpp/discussions/12208

I would appreciate it if you implemented it in llama.cpp; that would also make it possible to run the model on mobile devices such as Android phones and iPhones.

@cmp-nct here's the link:

https://github.com/ggml-org/llama.cpp/discussions/12208

Go to the llama.cpp discussion, upvote my request, and leave a comment.
