EXL2 Quantization of l2-13b-thespurral-v1.

A GGUF version is available here.

Model details

Quantized at 5.33bpw and 6.13bpw

Visit the model repo for more details.
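For a rough sense of what these bit rates mean in practice, the sketch below estimates the on-disk weight size each quantization implies. It assumes roughly 13B weight parameters (a hypothetical round figure for a Llama-2 13B model) and ignores the small overhead EXL2 adds for scales and metadata; `approx_size_gb` is an illustrative helper, not part of any library.

```python
def approx_size_gb(n_params: float, bpw: float) -> float:
    """Approximate weight storage in GB for n_params weights at bpw bits each."""
    # bits -> bytes (/8), bytes -> GB (/1e9); quantization metadata not counted
    return n_params * bpw / 8 / 1e9

# Assumed parameter count: ~13e9 for a 13B model
for bpw in (5.33, 6.13):
    print(f"{bpw} bpw -> ~{approx_size_gb(13e9, bpw):.1f} GB")
```

So the 5.33bpw variant lands around 8.7 GB of weights and the 6.13bpw variant around 10 GB, before cache and activation memory.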


Collection including R136a1/l2-13b-thespurral-v1-exl2