ggml-vic13b-uncensored-q5_1.bin and ggml-vic13b-uncensored-q8_0.bin throw errors in newest oobabooga-webui
#14
by RandomLegend - opened
Make sure you're not using a q5_x model (oobabooga hasn't been updated to support those quantization formats yet). The q4_2 or q4_3 files should work once you update oobabooga's llama-cpp-python:
pip install llama-cpp-python==0.1.39
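If you're unsure which quantization a .bin file uses, you can read it from the file header instead of trusting the filename. Below is a minimal sketch that assumes the ggjt-format header layout (magic, version, then seven int32 hparams ending in the ftype code) and the ftype numbering from llama.cpp's llama_ftype enum as of that era; treat both as assumptions, since the format changed between llama.cpp releases.

```python
import struct

# ftype codes as defined in llama.cpp's llama_ftype enum at the time
# (assumption -- later releases added/renumbered entries).
FTYPES = {
    0: "F32", 1: "F16", 2: "Q4_0", 3: "Q4_1", 4: "Q4_1_SOME_F16",
    5: "Q4_2", 6: "Q4_3", 7: "Q8_0", 8: "Q5_0", 9: "Q5_1",
}

def ggml_ftype(path):
    """Return the quantization type stored in a ggjt-format .bin header."""
    with open(path, "rb") as f:
        magic, version = struct.unpack("<II", f.read(8))
        if magic != 0x67676A74:  # on-disk bytes b"tjgg", read little-endian
            raise ValueError("not a ggjt-format file")
        # hparams: n_vocab, n_embd, n_mult, n_head, n_layer, n_rot, ftype
        hparams = struct.unpack("<7i", f.read(28))
        return FTYPES.get(hparams[6], f"unknown ({hparams[6]})")
```

If this reports Q5_0 or Q5_1 for your file, the errors above are expected until oobabooga ships a llama-cpp-python build that understands the q5 formats.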