mradermacher committed
Commit d8d3927
1 Parent(s): c45c466

auto-patch README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -14,7 +14,6 @@ quantized_by: mradermacher
 <!-- ### vocab_type: -->
 static quants of https://huggingface.co/MTSAIR/MultiVerse_70B
 
-
 <!-- provided-files -->
 weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
 ## Usage
@@ -31,6 +30,7 @@ more details, including on how to concatenate multi-part files.
 |:-----|:-----|--------:|:------|
 | [GGUF](https://huggingface.co/mradermacher/MultiVerse_70B-GGUF/resolve/main/MultiVerse_70B.Q3_K_M.gguf) | Q3_K_M | 36.8 | lower quality |
 | [GGUF](https://huggingface.co/mradermacher/MultiVerse_70B-GGUF/resolve/main/MultiVerse_70B.Q4_K_S.gguf) | Q4_K_S | 42.9 | fast, recommended |
+| [PART 1](https://huggingface.co/mradermacher/MultiVerse_70B-GGUF/resolve/main/MultiVerse_70B.Q8_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/MultiVerse_70B-GGUF/resolve/main/MultiVerse_70B.Q8_0.gguf.part2of2) | Q8_0 | 78.1 | fast, best quality |
 
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant
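The row added in this commit points at a two-part Q8_0 download, and the second hunk's context line refers to concatenating multi-part files. A minimal sketch of that reassembly, assuming the parts only need to be joined byte-for-byte in order (the filenames come from the table row above; the chunk size is an arbitrary choice):

```python
# Minimal sketch, not the repository's official tooling: reassemble the split
# Q8_0 quant by byte-concatenating its parts in order, i.e. the equivalent of
# `cat part1of2 part2of2 > MultiVerse_70B.Q8_0.gguf`.
PARTS = [
    "MultiVerse_70B.Q8_0.gguf.part1of2",
    "MultiVerse_70B.Q8_0.gguf.part2of2",
]
OUTPUT = "MultiVerse_70B.Q8_0.gguf"

with open(OUTPUT, "wb") as out:
    for name in PARTS:
        with open(name, "rb") as part:
            # Stream in chunks so the ~78 GB total never has to fit in memory.
            while chunk := part.read(64 * 1024 * 1024):
                out.write(chunk)
```

After reassembly, the single MultiVerse_70B.Q8_0.gguf file is what a GGUF-consuming runtime would load; the individual .partNofM files are not loadable on their own.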