# EXL3 Quants of TareksTesting/Scripturient-V2.2-LLaMa-70B

EXL3 quants of TareksTesting/Scripturient-V2.2-LLaMa-70B, quantized with the exllamav3 library.

## Quants

| Quant (Revision) | Bits per Weight | Head Bits |
| ---------------- | --------------- | --------- |
| 3.5_H6           | 3.5             | 6         |

## Downloading quants with huggingface-cli


Install huggingface-cli:

```shell
pip install -U "huggingface_hub[cli]"
```

Download a quant by targeting the specific quant revision (branch):

```shell
huggingface-cli download ArtusDev/TareksTesting_Scripturient-V2.2-LLaMa-70B-EXL3 --revision "3.5_H6" --local-dir ./
```
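If you prefer to script the download, the same revision-targeted fetch can be done from Python with `snapshot_download` from the `huggingface_hub` library. This is a minimal sketch; the `download_quant` helper and its defaults are illustrative, not part of this repo.

```python
from huggingface_hub import snapshot_download


def download_quant(revision: str, local_dir: str = "./") -> str:
    """Download all files from one quant revision (branch) of this repo.

    Returns the local path of the downloaded snapshot.
    """
    # snapshot_download mirrors `huggingface-cli download --revision ...`
    return snapshot_download(
        repo_id="ArtusDev/TareksTesting_Scripturient-V2.2-LLaMa-70B-EXL3",
        revision=revision,
        local_dir=local_dir,
    )


if __name__ == "__main__":
    # Example: fetch the 3.5 bpw quant listed in the table above.
    download_quant("3.5_H6")
```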
