Load model from Hugging Face

#16
by HemanthSai7 - opened

I want to use this version of the model.
https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/blob/main/mistral-7b-instruct-v0.1.Q4_K_M.gguf

What path should I specify in LangChain, given that I can't download and run the model locally due to limited resources?

I don't think that's doable.


You've got to have somewhere the model is actually running; just having the file path isn't going to do it for you.
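To make the point above concrete: LangChain's llama.cpp wrapper takes a `model_path` pointing at a GGUF file on local disk, so the file has to be downloaded (and the machine has to have enough RAM to run it) before it can be used. A minimal sketch, assuming the `huggingface_hub` and `langchain-community` packages and using the repo/file from the link above:

```python
# Sketch: a GGUF model cannot be streamed straight from the Hub;
# it must be fetched to local disk before llama.cpp can load it.
REPO_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GGUF"
FILENAME = "mistral-7b-instruct-v0.1.Q4_K_M.gguf"


def local_model_path(repo_id: str = REPO_ID, filename: str = FILENAME) -> str:
    """Download the GGUF file once (cached on later calls) and return its local path."""
    # Imported here so the module loads even without huggingface_hub installed.
    from huggingface_hub import hf_hub_download  # pip install huggingface_hub

    return hf_hub_download(repo_id=repo_id, filename=filename)


if __name__ == "__main__":
    # pip install langchain-community llama-cpp-python
    from langchain_community.llms import LlamaCpp

    # This downloads ~4 GB and needs enough RAM to hold the quantized model.
    llm = LlamaCpp(model_path=local_model_path(), n_ctx=2048)
    print(llm.invoke("Hello"))
```

If the local machine is too small, the same path argument works on any remote box (e.g. a rented GPU/CPU instance) where the file has been downloaded; there is no way to point `model_path` at a URL.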

HemanthSai7 changed discussion status to closed
