- from: [https://huggingface.co/miqudev/miqu-1-70b]
- to: [https://huggingface.co/152334H/miqu-1-70b-sf]
- quantized as: 4.65bpw exl2 with exllamav2 v0.0.13
- Chat template with system prompt support is included.
- Fits in 48 GB of VRAM with FP8 cache at 32k context in TabbyAPI.
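A minimal TabbyAPI configuration matching the setup above might look like the sketch below. This is not taken from this repo: the key names (`model_dir`, `model_name`, `max_seq_len`, `cache_mode`) follow TabbyAPI's `config.yml` conventions and should be checked against the sample config shipped with your TabbyAPI version.

```yaml
# config.yml sketch for serving this quant in TabbyAPI
# (key names assumed; verify against your TabbyAPI version's sample config)
model:
  model_dir: models            # directory containing model folders
  model_name: miqu-1-70b-sf    # folder holding the 4.65bpw exl2 quant
  max_seq_len: 32768           # 32k context, as noted above
  cache_mode: FP8              # FP8 KV cache to fit within 48 GB VRAM
```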