---
license: other
language:
- en
base_model:
- Nexusflow/Athene-V2-Chat
tags:
- awq
- Athene
- Chat
pipeline_tag: text-generation
library_name: transformers
---

# Athene-V2-Chat AWQ 4-Bit Quantized Version
This repository provides an AWQ 4-bit quantized version of the Athene-V2-Chat model, originally developed by Nexusflow. Before quantization, the model's weights are zero-padded so that their dimensions satisfy the divisibility constraints of multi-GPU tensor parallelism. The padding adds negligible computation while allowing the model to shard efficiently across multiple GPUs.
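The padding step described above can be sketched as follows. This is a hypothetical illustration, not Nexusflow's actual quantization script: the helper name `pad_for_tensor_parallel`, the choice of AWQ group size 128, and the tensor-parallel degree of 8 are all assumptions. The idea is that each shard of a weight dimension must itself divide evenly into quantization groups, so the dimension is rounded up to the next multiple of `tp_size * group_size` with zero rows.

```python
import numpy as np

def pad_for_tensor_parallel(weight, tp_size, group_size=128):
    """Zero-pad the output dimension of a weight matrix so it is divisible
    by tp_size * group_size (hypothetical helper sketching the padding
    described above; not the actual script used for this checkpoint)."""
    multiple = tp_size * group_size
    out_dim = weight.shape[0]
    padded_dim = -(-out_dim // multiple) * multiple  # ceil to next multiple
    if padded_dim == out_dim:
        return weight  # already divisible, nothing to do
    pad_rows = padded_dim - out_dim
    zeros = np.zeros((pad_rows, weight.shape[1]), dtype=weight.dtype)
    return np.vstack([weight, zeros])

# Example: an intermediate dimension of 29568 (as in Qwen2.5-72B-style MLPs)
# is not divisible by 8 * 128 = 1024, so it is padded up to 29696.
w = np.ones((29568, 64), dtype=np.float32)  # small column count for illustration
w_padded = pad_for_tensor_parallel(w, tp_size=8)
print(w_padded.shape)  # (29696, 64)
```

Zero rows contribute nothing to matrix products, which is why the padding leaves the model's outputs unchanged apart from a small amount of extra computation.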