OSError: It looks like the config file at 'models/nous-hermes-llama2-70b.Q5_K_M.gguf' is not a valid JSON file
#1
by almanshow - opened
Downloaded the model into text-generation-webui/models (oobabooga web UI).
Started the server with python server.py --n-gpu-layers 1000.
When loading the model, I get the following error:
OSError: It looks like the config file at 'models/nous-hermes-llama2-70b.Q5_K_M.gguf' is not a valid JSON file.
text-generation-webui added GGUF support yesterday, but via the new ctransformers backend, not yet via the llama-cpp-python backend. Switch to the ctransformers loader in text-gen and I believe it should work.
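For context, a sketch of why the message mentions JSON: GGUF model files are binary and begin with the magic bytes GGUF, so a loader that expects a JSON config file fails as soon as it tries to parse one. This minimal snippet reproduces that parse failure on a stand-in GGUF header (the header bytes here are illustrative, not a complete file):

```python
import json

# A GGUF file starts with the magic bytes b"GGUF" followed by a
# little-endian version number -- it is not JSON, so it cannot
# begin with '{' or '['.
gguf_header = b"GGUF\x03\x00\x00\x00"  # magic + version (illustrative)

try:
    json.loads(gguf_header.decode("latin-1"))
except json.JSONDecodeError as e:
    # A backend that tries to read this file as a JSON config
    # hits exactly this kind of decode error.
    print("not valid JSON:", e.msg)
```

So the error is about the backend's expectations, not about the file being corrupt: the fix is to use a loader that understands the GGUF format, as described above.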
That said, llama-cpp-python also added GGUF support yesterday, so I'm sure text-gen will update that backend soon as well.