WhisperForConditionalGeneration is not supported #177
by trysem - opened
Error converting to fp16:
INFO:hf-to-gguf:Loading model: audioX-south-v1
INFO:hf-to-gguf:Model architecture: WhisperForConditionalGeneration
ERROR:hf-to-gguf:Model WhisperForConditionalGeneration is not supported
Yes! The WhisperForConditionalGeneration architecture is not supported in llama.cpp at the moment.
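The converter decides support by looking at the model's declared architecture, which it reads from the repository's config.json. A minimal sketch of that check (the `model_architectures` helper here is hypothetical, not part of llama.cpp's converter) can save a failed conversion attempt by inspecting the architecture up front:

```python
import json
from pathlib import Path


def model_architectures(model_dir: str) -> list[str]:
    """Return the "architectures" list from a Hugging Face model's config.json.

    llama.cpp's HF-to-GGUF converter rejects architectures it has no
    mapping for, producing an error like the one in this thread.
    """
    config = json.loads((Path(model_dir) / "config.json").read_text())
    return config.get("architectures", [])


if __name__ == "__main__":
    # Example with a locally constructed config mirroring this thread's model.
    import tempfile

    with tempfile.TemporaryDirectory() as d:
        Path(d, "config.json").write_text(
            json.dumps({"architectures": ["WhisperForConditionalGeneration"]})
        )
        print(model_architectures(d))  # ['WhisperForConditionalGeneration']
```

If the reported architecture is Whisper, the GGUF route via llama.cpp is a dead end; the check above makes that clear before any weights are downloaded or converted.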
reach-vb changed discussion status to closed