---
base_model: TareksTesting/Scripturient-V2.3-LLaMa-70B
base_model_relation: quantized
quantized_by: ArtusDev
library_name: transformers
license: llama3.3
tags:
  - mergekit
  - merge
  - exl2
---

# EXL2 Quants of TareksTesting/Scripturient-V2.3-LLaMa-70B

EXL2 quants of [TareksTesting/Scripturient-V2.3-LLaMa-70B](https://huggingface.co/TareksTesting/Scripturient-V2.3-LLaMa-70B), quantized with the exllamav2 library.
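
Once a quant has been downloaded (see the instructions below), it can be loaded and run with exllamav2's Python API. A minimal sketch, assuming the files were downloaded to a local `./Scripturient-V2.3-LLaMa-70B-EXL2` directory and that enough GPU memory is available for autosplit loading of a 70B model:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "./Scripturient-V2.3-LLaMa-70B-EXL2"  # assumed local download path

# Build the config and model from the quantized weights, splitting across available GPUs.
config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # cache is allocated while the model loads
model.load_autosplit(cache, progress=True)
tokenizer = ExLlamaV2Tokenizer(config)

# Simple one-off generation with the dynamic generator.
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Once upon a time,", max_new_tokens=64, add_bos=True))
```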

## Quants

| Quant (Revision) | Bits per Weight | Head Bits |
|:-----------------|:----------------|:----------|
| 4.0_H6           | 4.0             | 6         |

## Downloading quants with huggingface-cli

<details>
<summary>Click to view download instructions</summary>

Install huggingface-cli:

```bash
pip install -U "huggingface_hub[cli]"
```

Download a quant by targeting its specific revision (branch):

```bash
huggingface-cli download ArtusDev/TareksTesting_Scripturient-V2.3-LLaMa-70B-EXL2 --revision "4.0_H6" --local-dir ./
```

</details>
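
The same download can also be scripted with the `huggingface_hub` Python library (which the CLI wraps). A minimal sketch, using the `4.0_H6` revision from the table above and an assumed local target directory:

```python
from huggingface_hub import snapshot_download

# Fetch a specific quant revision (branch) into a local directory.
snapshot_download(
    repo_id="ArtusDev/TareksTesting_Scripturient-V2.3-LLaMa-70B-EXL2",
    revision="4.0_H6",                               # quant branch from the table above
    local_dir="./Scripturient-V2.3-LLaMa-70B-EXL2",  # assumed local path
)
```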