mradermacher
committed on
auto-patch README.md
README.md
CHANGED
@@ -59,7 +59,6 @@ more details, including on how to concatenate multi-part files.
 | [PART 1](https://huggingface.co/mradermacher/Llama3-70B-ShiningValiant2-GGUF/resolve/main/Llama3-70B-ShiningValiant2.Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Llama3-70B-ShiningValiant2-GGUF/resolve/main/Llama3-70B-ShiningValiant2.Q6_K.gguf.part2of2) | Q6_K | 58.0 | very good quality |
 | [PART 1](https://huggingface.co/mradermacher/Llama3-70B-ShiningValiant2-GGUF/resolve/main/Llama3-70B-ShiningValiant2.Q8_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Llama3-70B-ShiningValiant2-GGUF/resolve/main/Llama3-70B-ShiningValiant2.Q8_0.gguf.part2of2) | Q8_0 | 75.1 | fast, best quality |
 
-
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
 
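The Q6_K and Q8_0 rows above link to split downloads (`.part1of2` / `.part2of2`). As the README's hunk context notes, multi-part files must be concatenated before use; a minimal sketch of that step for the Q6_K parts, assuming both parts sit in the current directory:

```shell
# Join the split Q6_K download back into a single GGUF file.
# The parts must be concatenated in order (part1of2 before part2of2).
cat Llama3-70B-ShiningValiant2.Q6_K.gguf.part1of2 \
    Llama3-70B-ShiningValiant2.Q6_K.gguf.part2of2 \
    > Llama3-70B-ShiningValiant2.Q6_K.gguf
```

The same pattern applies to the Q8_0 parts; after joining, the `.partXof2` files can be deleted.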