---
library_name: transformers
pipeline_tag: text-generation
tags:
- code
- llama
- llama-2
- text-generation-inference
base_model: codellama/CodeLlama-7b-Instruct-hf
inference: false
---
# CodeLlama-7b-Instruct-hf-GGUF
- Quantized GGUF version of [CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf)
- Created using llama.cpp
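Because these files were produced with llama.cpp, they can be loaded with any GGUF-compatible runtime. Below is a minimal sketch using the `llama-cpp-python` bindings; the local filename is an assumption and should be replaced with whichever quant you download from this repository.

```python
# Minimal sketch: load one of these GGUF quants with llama-cpp-python.
# The filename below is an assumption -- point it at the quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="codellama-7b-instruct.Q4_K_M.gguf",  # assumed local filename
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers if a GPU build is installed; use 0 for CPU only
)

# CodeLlama Instruct follows the Llama-2 [INST] ... [/INST] prompt format.
prompt = "[INST] Write a Python function that checks whether a number is prime. [/INST]"

output = llm(prompt, max_tokens=256, temperature=0.2)
print(output["choices"][0]["text"])
```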
## Available Quants
- Q2_K
- Q3_K_L
- Q3_K_M
- Q3_K_S
- Q4_0
- Q4_K_M
- Q4_K_S
- Q5_0
- Q5_K_M
- Q5_K_S
- Q6_K
- Q8_0
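Higher quants (Q6_K, Q8_0) preserve more of the original model's quality but use more memory, while lower quants (Q2_K, Q3_K_*) trade quality for smaller files. The sketch below shows one way to fetch a single quant with the `huggingface_hub` library; the repo id and filename are placeholders for this repository and the exact `.gguf` file listed under "Files and versions".

```python
# Hedged sketch: download a single quant file instead of the whole repository.
# Both repo_id and filename are placeholders -- substitute the real values.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="your-username/CodeLlama-7b-Instruct-hf-GGUF",  # placeholder repo id
    filename="codellama-7b-instruct.Q5_K_M.gguf",           # placeholder quant filename
)
print(f"Downloaded to: {local_path}")
```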