Tags: Transformers · GGUF · English · mergekit · Merge · alpaca · mistral · Not-For-All-Audiences · nsfw · Inference Endpoints · imatrix
Would it be possible to make a 7.5 or 8.0 bpw exl2 quant of this model?
#1 opened by mjh657
Just wondering.
Almost certainly, but I lack the expertise to do that. You could ask @LoneStriker or @bartowski, for example; they regularly provide exl2 quants.
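For anyone who wants to try it themselves: exl2 quants are generally produced with the `convert.py` script from the exllamav2 repository. A rough sketch follows; all paths are placeholders and the flags are from memory, so check the exllamav2 README for the current interface before running anything.

```shell
# Sketch only: assumes the exllamav2 repo is cloned and the source
# fp16 model (HF format) has been downloaded locally.
#   -i   input model directory
#   -o   scratch/working directory for the measurement pass
#   -cf  output directory for the finished quant
#   -b   target bits per weight (e.g. 8.0)
python convert.py -i /path/to/fp16-model -o /tmp/exl2-work \
    -cf /path/to/output-8.0bpw -b 8.0
```

The measurement pass is the slow part; once it has run, the same working directory can usually be reused to emit additional bitrates (such as 7.5 bpw) without redoing it.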
mradermacher changed discussion status to closed