Alpaca-30B-Int4

Information

Alpaca 30B quantized to 4-bit. It works with the newest GPTQ and was quantized using the --true-sequential and --act-order optimizations.
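
For reference, below is a minimal sketch of how a quantization run like this is typically launched with GPTQ-for-LLaMa. The repository location, model path, calibration set, and output filename are illustrative assumptions; only the --true-sequential and --act-order flags come from the description above.

```python
# Sketch only: assumes GPTQ-for-LLaMa is checked out in ./GPTQ-for-LLaMa and the merged
# Alpaca-30B HF weights live in ./alpaca-30b-hf. Paths and output name are placeholders.
import subprocess

subprocess.run(
    [
        "python", "llama.py", "../alpaca-30b-hf",
        "c4",                         # calibration dataset (assumed here)
        "--wbits", "4",               # 4-bit quantization
        "--true-sequential",          # optimization named in the description above
        "--act-order",                # optimization named in the description above
        "--save_safetensors", "../alpaca-30b-4bit.safetensors",
    ],
    cwd="GPTQ-for-LLaMa",
    check=True,
)
```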

This was made using Chansung's 30B Alpaca LoRA: https://huggingface.co/chansung/alpaca-lora-30b

Benchmarks (perplexity, lower is better)

Wikitext2: 4.58

PTB: 7.71

C4: 6.32
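
The numbers above come from the GPTQ evaluation scripts. As a rough illustration only, here is a sketch of the standard sliding-window perplexity check on Wikitext2; it assumes a causal LM `model` and matching `tokenizer` are already loaded (e.g. via GPTQ-for-LLaMa's quantized loader), which is not shown here.

```python
# Illustrative perplexity check, not the exact evaluation script used for these numbers.
import torch
from datasets import load_dataset

def wikitext2_ppl(model, tokenizer, seqlen=2048, device="cuda"):
    test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
    enc = tokenizer("\n\n".join(test["text"]), return_tensors="pt").input_ids
    nlls = []
    for i in range(enc.size(1) // seqlen):
        batch = enc[:, i * seqlen : (i + 1) * seqlen].to(device)
        with torch.no_grad():
            loss = model(batch, labels=batch).loss  # mean NLL over the window
        nlls.append(loss * seqlen)                  # total NLL for the window
    return torch.exp(torch.stack(nlls).sum() / (len(nlls) * seqlen)).item()
```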

Note: This version does not use grouping (no groupsize), so the evaluation numbers are slightly higher. However, it allows fitting the whole model at full context in only 24 GB of VRAM.