mradermacher committed
Commit
e940497
1 Parent(s): f44dca4

auto-patch README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -35,10 +35,10 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-IQ3_M.gguf) | i1-IQ3_M | 6.3 | |
 | [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.6 | IQ3_S probably better |
 | [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q3_K_L.gguf) | i1-Q3_K_L | 7.2 | IQ3_M probably better |
-| [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.7 | almost as good as Q4_K_M |
+| [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.7 | optimal size/speed/quality |
 | [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q4_K_M.gguf) | i1-Q4_K_M | 8.2 | fast, medium quality |
 | [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q5_K_S.gguf) | i1-Q5_K_S | 9.3 | |
-| [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q5_K_M.gguf) | i1-Q5_K_M | 9.5 | best weighted quant |
+| [GGUF](https://huggingface.co/mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF/resolve/main/LLaMA2-13B-Tiefighter.i1-Q5_K_M.gguf) | i1-Q5_K_M | 9.5 | |
 
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant
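For reference, any single quant listed in the table above can be fetched programmatically. A minimal sketch using huggingface_hub (the repo id and filename are copied from the i1-Q4_K_S row touched by this commit; having the package installed is assumed):

```python
# Minimal sketch: download one GGUF quant from the table above.
# Assumes `pip install huggingface_hub`; repo id and filename are taken
# from the i1-Q4_K_S row of the README's quant table.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="mradermacher/LLaMA2-13B-Tiefighter-i1-GGUF",
    filename="LLaMA2-13B-Tiefighter.i1-Q4_K_S.gguf",
)
print(local_path)  # path of the cached GGUF file on disk
```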