Does not run in Oobabooga?
#5 by AIGUYCONTENT - opened
I have 120GB of VRAM. Tried to load this model and got:
ValueError: Unknown quantization type, got exl2 - supported types are: ['awq', 'bitsandbytes_4bit', 'bitsandbytes_8bit', 'gptq', 'aqlm', 'quanto', 'eetq', 'hqq', 'fbgemm_fp8']
I know that Oobabooga supports exl2. Is this an Oobabooga error, or something else?
That error comes from the Transformers loader, which doesn't support exl2 quantization. You need to use the exllamav2 or exllamav2_HF loader instead.
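As a sketch, you can either pick the loader in the Model tab of the web UI, or pass it on the command line when starting text-generation-webui (the model directory name here is a placeholder):

```shell
# Select the ExLlamav2 loader at startup instead of the default
# Transformers loader, which rejects exl2-quantized models.
# Replace <model-dir> with the folder name of your downloaded model.
python server.py --loader exllamav2 --model <model-dir>
```

If you want the HF-style sampler integration, use `--loader exllamav2_HF` instead.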