#7: Please provide onnx model along with inference script. (opened 3 days ago by SantoshHF)
#5: VRAM requirement (3 replies; opened 8 months ago by jithinmukundan)
#4: Can VLLM be used for loading? (6 replies; opened 9 months ago by wawoshashi)
#3: How many bits and what is the groupsize? (1 reply; opened 9 months ago by vitvit)
#2: What library was used to quantize the model? (1 reply; opened 9 months ago by KirillR)
#1: How to load command r+ in text-generation-webui? (5 replies; opened 9 months ago by MLDataScientist)