This model was converted to the OpenVINO IR format using the following command:

```shell
optimum-cli export openvino -m "input/path" --task text-generation-with-past --weight-format int4 --ratio 1 --group-size 128 --dataset wikitext2 --awq --scale-estimation --sensitivity-metric weight_quantization_error "output/path"
```
Model: Echo9Zulu/Meta-Llama-3.1-8B-SurviveV3-int4_asym-awq-se-wqe-ov