Ollama Support
#3
by Olshansky · opened
Request: Support for pulling this model directly into my Ollama environment.
Note: I want to start by saying that I don't fully know all the details and complexities of this request.
Context:
- Hugging Face has native Ollama integration [1]
- Llama3.2-vision (the censored version) is already on Ollama [2]
- My understanding is that this model is llama.cpp-compatible, and Ollama is built on top of llama.cpp.
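If the repository publishes GGUF weights, the Hub integration in [1] suggests the pull could be as simple as the commands below. This is a sketch; the `<username>/<repository>` path is a placeholder, not verified against this repo:

```shell
# Hedged sketch, assuming GGUF files exist in the repo.
# Ollama can pull GGUF models directly from the Hugging Face Hub
# using the hf.co/ prefix (documented in [1]).
ollama run hf.co/<username>/<repository>

# A specific quantization can be pinned via a tag:
ollama run hf.co/<username>/<repository>:Q4_K_M
```

If the repo only ships safetensors, the weights would first need to be converted to GGUF (e.g. with llama.cpp's conversion scripts) before Ollama can use them.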
Question: How easy is it to do this?
@Guilherme34 Happy to help make it happen!
[1] https://huggingface.co/docs/hub/en/ollama
[2] The llama3.2-vision library on ollama.com