Thank you for access.
#4 by jawad1347 - opened
Great to see these innovations. Thank you for access; however, the model is quite large. Can it be quantized or loaded in 4-bit?
The same quantization methods that work for Llama models should also work for these models, but I'm afraid that's beyond scope for me.
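For reference, a minimal sketch of what 4-bit loading with bitsandbytes through `transformers` typically looks like for Llama-family checkpoints. The model ID `org/model-name` is a placeholder, not the actual repository; loading also requires a GPU and access to the gated weights.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# NF4 4-bit quantization config (bitsandbytes), the usual setup for Llama-style models
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

# "org/model-name" is a placeholder -- substitute the actual checkpoint ID
model = AutoModelForCausalLM.from_pretrained(
    "org/model-name",
    quantization_config=bnb_config,
    device_map="auto",
)
```

Whether this works unchanged depends on the architecture being supported by `transformers`; if it is Llama-compatible, no extra steps should be needed.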
faaabian changed discussion status to closed
faaabian changed discussion status to open