Update README.md
README.md (changed)
@@ -23,17 +23,17 @@ Original model: https://huggingface.co/google/gemma-2-9b
 
 | Filename | Quant type | File Size | Perplexity (wikitext-2-raw-v1.test) |
 | -------- | ---------- | --------- | ----------- |
-| [gemma-2-9b.FP32.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b.FP32.gguf) | FP32 | 37.00GB |
-| [gemma-2-9b-Q8_0.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q8_0.gguf) | Q8_0 | 9.83GB |
-| [gemma-2-9b-Q6_K.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q6_K.gguf) | Q6_K | 7.59GB |
-| [gemma-2-9b-Q5_K_M.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q5_K_M.gguf) | Q5_K_M | 6.65GB |
-| [gemma-2-9b-Q5_K_S.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q5_K_S.gguf) | Q5_K_S | 6.48GB |
-| [gemma-2-9b-Q4_K_M.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q4_K_M.gguf) | Q4_K_M | 5.76GB |
-| [gemma-2-9b-Q4_K_S.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q4_K_S.gguf) | Q4_K_S | 5.48GB |
-| [gemma-2-9b-Q3_K_L.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q3_K_L.gguf) | Q3_K_L | 5.13GB |
-| [gemma-2-9b-Q3_K_M.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q3_K_M.gguf) | Q3_K_M | 4.76GB |
-| [gemma-2-9b-Q3_K_S.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q3_K_S.gguf) | Q3_K_S | 4.34GB |
-| [gemma-2-9b-Q2_K.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q2_K.gguf) | Q2_K | 3.81GB |
+| [gemma-2-9b.FP32.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b.FP32.gguf) | FP32 | 37.00GB | 6.9209 +/- 0.04660 |
+| [gemma-2-9b-Q8_0.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q8_0.gguf) | Q8_0 | 9.83GB | 6.9222 +/- 0.04660 |
+| [gemma-2-9b-Q6_K.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q6_K.gguf) | Q6_K | 7.59GB | 6.9353 +/- 0.04675 |
+| [gemma-2-9b-Q5_K_M.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q5_K_M.gguf) | Q5_K_M | 6.65GB | 6.9571 +/- 0.04687 |
+| [gemma-2-9b-Q5_K_S.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q5_K_S.gguf) | Q5_K_S | 6.48GB | 6.9623 +/- 0.04690 |
+| [gemma-2-9b-Q4_K_M.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q4_K_M.gguf) | Q4_K_M | 5.76GB | 7.0220 +/- 0.04737 |
+| [gemma-2-9b-Q4_K_S.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q4_K_S.gguf) | Q4_K_S | 5.48GB | 7.0622 +/- 0.04777 |
+| [gemma-2-9b-Q3_K_L.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q3_K_L.gguf) | Q3_K_L | 5.13GB | 7.2144 +/- 0.04910 |
+| [gemma-2-9b-Q3_K_M.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q3_K_M.gguf) | Q3_K_M | 4.76GB | 7.2849 +/- 0.04970 |
+| [gemma-2-9b-Q3_K_S.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q3_K_S.gguf) | Q3_K_S | 4.34GB | 7.6869 +/- 0.05373 |
+| [gemma-2-9b-Q2_K.gguf](https://huggingface.co/fedric95/gemma-2-9b-GGUF/blob/main/gemma-2-9b-Q2_K.gguf) | Q2_K | 3.81GB | 8.7979 +/- 0.06191 |
 
 ## Downloading using huggingface-cli
 
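The added column reports perplexity on the wikitext-2-raw-v1 test split. Below is a minimal reproduction sketch in Python, assuming a local llama.cpp build with its `llama-perplexity` tool and a local copy of `wiki.test.raw` (both are assumptions about the local setup, not files in this repo); the Q4_K_M quant is used only as an example row from the table above.

```python
from huggingface_hub import hf_hub_download
import subprocess

# Download one of the quantized files listed in the table above
# (Q4_K_M chosen only as an example).
model_path = hf_hub_download(
    repo_id="fedric95/gemma-2-9b-GGUF",
    filename="gemma-2-9b-Q4_K_M.gguf",
)

# Score it with llama.cpp's perplexity tool on the wikitext-2-raw-v1
# test split. The binary path and the wiki.test.raw location are
# assumptions about the local environment.
subprocess.run(
    ["./llama-perplexity", "-m", model_path, "-f", "wiki.test.raw"],
    check=True,
)
```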