blackmount8/vicuna-13b-v1.5-int8

An int8-quantized version of lmsys/vicuna-13b-v1.5, converted with CTranslate2.
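The weights can be loaded with CTranslate2's Python API. The snippet below is a minimal sketch, not an official example: it assumes the repository files are fetched to a local directory with `huggingface_hub.snapshot_download`, that the tokenizer from the original lmsys/vicuna-13b-v1.5 repo is used, and that a standard Vicuna v1.5 chat prompt format applies.

```python
import ctranslate2
import transformers
from huggingface_hub import snapshot_download

# CTranslate2 loads models from a local directory, so download the repo first.
model_dir = snapshot_download("blackmount8/vicuna-13b-v1.5-int8")

# Load the int8 generator and the tokenizer of the original model.
generator = ctranslate2.Generator(model_dir, device="cpu", compute_type="int8")
tokenizer = transformers.AutoTokenizer.from_pretrained("lmsys/vicuna-13b-v1.5")

# Example prompt in the usual Vicuna v1.5 chat style (assumed, not specified by this card).
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "USER: What is CTranslate2? ASSISTANT:"
)
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

# Generate a completion; the result includes the token ids of the full sequence.
results = generator.generate_batch(
    [tokens],
    max_length=256,
    sampling_temperature=0.7,
    sampling_topk=40,
)
print(tokenizer.decode(results[0].sequences_ids[0]))
```

For GPU inference, `device="cuda"` can be passed to `ctranslate2.Generator` instead; the int8 weights keep memory usage well below that of the fp16 original.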
