# Meta-Llama-3-8B-Instruct-4bit

## Model Details

### Model Description
4-bit GPTQ quantization of Meta-Llama-3-8B-Instruct, calibrated on the C4 dataset.
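
A minimal loading sketch with `transformers` is shown below. The repository id `Meta-Llama-3-8B-Instruct-4bit` is a placeholder for wherever this checkpoint is hosted, and loading a GPTQ checkpoint this way assumes `optimum` plus a GPTQ backend (e.g. `auto-gptq` or `gptqmodel`) are installed.

```python
# Sketch: load the 4-bit GPTQ checkpoint and run chat-style generation.
# "Meta-Llama-3-8B-Instruct-4bit" is a placeholder repo id; replace it with
# the actual hub id or local path of this quantized model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Meta-Llama-3-8B-Instruct-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the quantized weights on the available GPU(s)
)

# Llama-3-Instruct ships a chat template in its tokenizer config.
messages = [{"role": "user", "content": "Explain GPTQ quantization in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```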
## Model Sources

- Repository: [Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)
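
For reference, a hedged sketch of how a checkpoint like this can be produced: `transformers` exposes `GPTQConfig`, which accepts `bits=4` and the built-in `"c4"` calibration dataset named in the description. The exact settings used for this upload (group size, act-order, number of calibration samples) are not stated, so the values below are illustrative assumptions rather than the original recipe.

```python
# Sketch: 4-bit GPTQ quantization of Meta-Llama-3-8B-Instruct calibrated on C4.
# group_size and other settings are assumed defaults, not the exact recipe used
# for this checkpoint. Requires optimum plus a GPTQ backend.
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(base_id)

quant_config = GPTQConfig(
    bits=4,              # 4-bit weights
    dataset="c4",        # built-in C4 calibration dataset
    tokenizer=tokenizer,
    group_size=128,      # assumed; a common default
)

# Quantization runs during from_pretrained when a GPTQConfig is passed.
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=quant_config,
    device_map="auto",
)

model.save_pretrained("Meta-Llama-3-8B-Instruct-4bit")
tokenizer.save_pretrained("Meta-Llama-3-8B-Instruct-4bit")
```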