
EXL2 quant of an old model. I found it still really good and I'm still using it sometimes, so here is a 4.6 bpw EXL2.

Only supports 8k (8192) context length.
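
A minimal loading sketch with the exllamav2 Python library is below; the model path and sampler settings are placeholders, and the exact API may differ across exllamav2 versions:

```python
# Minimal sketch: loading this EXL2 quant with exllamav2.
# The model path and sampler settings are placeholders.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/this-exl2-quant"  # placeholder path
config.prepare()
config.max_seq_len = 8192  # the model only supports 8k context

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split the weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # placeholder sampler settings

prompt = "..."  # an Alpaca-formatted prompt, see below
output = generator.generate_simple(prompt, settings, 256)
print(output)
```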

Uses the Alpaca prompting format.
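
For reference, the standard Alpaca template (the no-input variant) looks like this; substitute your request for `{instruction}`:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```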
