apepkuss79 committed
Commit a73a50a
1 parent: 9a4dbdd

Upload README.md with huggingface_hub

Files changed (1): README.md (+2 -1)
README.md CHANGED
@@ -80,7 +80,8 @@ language:
  | [Mistral-Large-Instruct-2407-Q3_K_L-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_L-00003-of-00003.gguf) | Q3_K_L | 3 | 4.70 GB| small, substantial quality loss |
  | [Mistral-Large-Instruct-2407-Q3_K_M-00001-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_M-00001-of-00002.gguf) | Q3_K_M | 3 | 29.9 GB| very small, high quality loss |
  | [Mistral-Large-Instruct-2407-Q3_K_M-00002-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_M-00002-of-00002.gguf) | Q3_K_M | 3 | 29.2 GB| very small, high quality loss |
- | [Mistral-Large-Instruct-2407-Q3_K_S.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_S.gguf) | Q3_K_S | 3 | 3.17 GB| very small, high quality loss |
+ | [Mistral-Large-Instruct-2407-Q3_K_S-00001-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_S-00001-of-00002.gguf) | Q3_K_S | 3 | 29.9 GB| very small, high quality loss |
+ | [Mistral-Large-Instruct-2407-Q3_K_S-00002-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_S-00002-of-00002.gguf) | Q3_K_S | 3 | 29.2 GB| very small, high quality loss |
  | [Mistral-Large-Instruct-2407-Q4_0.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_0.gguf) | Q4_0 | 4 | 4.11 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
  | [Mistral-Large-Instruct-2407-Q4_K_M.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_M.gguf) | Q4_K_M | 4 | 4.37 GB| medium, balanced quality - recommended |
  | [Mistral-Large-Instruct-2407-Q4_K_S.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_S.gguf) | Q4_K_S | 4 | 4.14 GB| small, greater quality loss |
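The multi-part files in the table follow the shard-naming pattern visible in their filenames: a one-based, zero-padded index and total count appended to the quantization stem, e.g. `-00001-of-00002.gguf`. A minimal sketch of reconstructing the full shard list from that pattern (the helper name `shard_names` is illustrative, not part of any library):

```python
def shard_names(stem: str, n_shards: int) -> list[str]:
    # Split GGUF shards are named "<stem>-%05d-of-%05d.gguf",
    # with the shard index starting at 1, as seen in the table above.
    return [
        f"{stem}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]

# The two Q3_K_S parts added in this commit:
for name in shard_names("Mistral-Large-Instruct-2407-Q3_K_S", 2):
    print(name)
```

All parts of a split model must be downloaded into the same directory; tools that load sharded GGUF files are typically pointed at the first part (`-00001-of-...`).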