---
license: llama2
language:
  - en
---

# EXL2 Quantization of l2-13b-thespurral-v1

GGUF here

## Model details

Quantized at 5.33bpw and 6.13bpw
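As a rough sense of what those bits-per-weight figures mean in practice, the sketch below estimates the weight footprint of a 13B-parameter model at a given bpw. The parameter count and the GB conversion are assumptions for illustration, not figures from this repo; real EXL2 files differ slightly because of quantization metadata and non-quantized layers.

```python
def estimated_weights_gb(num_params: float, bpw: float) -> float:
    """Rough weight-only size estimate: params * bits-per-weight / 8 bits-per-byte."""
    return num_params * bpw / 8 / 1e9  # bytes -> GB (decimal)

# Assumed ~13e9 parameters for a Llama-2-13B-class model.
print(round(estimated_weights_gb(13e9, 5.33), 2))  # ~8.66 GB
print(round(estimated_weights_gb(13e9, 6.13), 2))  # ~9.96 GB
```

This is only the raw weight storage; VRAM use at inference time is higher once the KV cache and activations are included.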

This model is very good (at least for me) for role-playing; I liked it a lot. Visit the original model repo for more details.

cato