6.5 bpw EXL2 quant of Acolyte-22B



A LoRA trained on a mix of datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at a weight of 0.5. Decent enough for its size. See the LoRA repo for dataset info.

Use the Mistral V2 & V3 prompt template.
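As a rough illustration of what the Mistral instruct format looks like, here is a minimal prompt-building sketch. The exact whitespace and BOS/EOS handling differs between template versions and tokenizer releases, so in practice prefer the tokenizer's own `apply_chat_template`; this function and its turn layout are assumptions for illustration only.

```python
def build_mistral_prompt(history, user_message):
    """Assemble a Mistral-instruct-style prompt string.

    history: list of (user, assistant) pairs from completed turns.
    user_message: the new user turn awaiting a model reply.
    NOTE: spacing/BOS handling is an approximation of the V2/V3
    template, not an authoritative implementation.
    """
    prompt = "<s>"
    for user, assistant in history:
        # Each completed exchange: user turn in [INST] tags,
        # assistant reply closed with </s>.
        prompt += f"[INST] {user}[/INST] {assistant}</s>"
    # Open the new turn; the model generates after [/INST].
    prompt += f"[INST] {user_message}[/INST]"
    return prompt


print(build_mistral_prompt([("Hi", "Hello!")], "How are you?"))
```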


Model tree: Brioch/Acolyte-22B-6.5bpw-exl2, quantized from Acolyte-22B (one of 7 quantizations).