apepkuss79 committed
Commit 570d2f8
1 Parent(s): bdba70b

Upload README.md with huggingface_hub

Files changed (1):
  README.md (+22 -7)
README.md CHANGED
@@ -82,13 +82,28 @@ language:
  | [Mistral-Large-Instruct-2407-Q3_K_M-00002-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_M-00002-of-00002.gguf) | Q3_K_M | 3 | 29.2 GB| very small, high quality loss |
  | [Mistral-Large-Instruct-2407-Q3_K_S-00001-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_S-00001-of-00002.gguf) | Q3_K_S | 3 | 29.9 GB| very small, high quality loss |
  | [Mistral-Large-Instruct-2407-Q3_K_S-00002-of-00002.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_S-00002-of-00002.gguf) | Q3_K_S | 3 | 29.2 GB| very small, high quality loss |
- | [Mistral-Large-Instruct-2407-Q4_0.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_0.gguf) | Q4_0 | 4 | 4.11 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
- | [Mistral-Large-Instruct-2407-Q4_K_M.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_M.gguf) | Q4_K_M | 4 | 4.37 GB| medium, balanced quality - recommended |
- | [Mistral-Large-Instruct-2407-Q4_K_S.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_S.gguf) | Q4_K_S | 4 | 4.14 GB| small, greater quality loss |
- | [Mistral-Large-Instruct-2407-Q5_0.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_0.gguf) | Q5_0 | 5 | 5 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
- | [Mistral-Large-Instruct-2407-Q5_K_M.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_M.gguf) | Q5_K_M | 5 | 5.14 GB| large, very low quality loss - recommended |
- | [Mistral-Large-Instruct-2407-Q5_K_S.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_S.gguf) | Q5_K_S | 5 | 5 GB| large, low quality loss - recommended |
- | [Mistral-Large-Instruct-2407-Q6_K.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q6_K.gguf) | Q6_K | 6 | 5.95 GB| very large, extremely low quality loss |
+ | [Mistral-Large-Instruct-2407-Q4_0-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_0-00001-of-00003.gguf) | Q4_0 | 4 | 30.0 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
+ | [Mistral-Large-Instruct-2407-Q4_0-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_0-00002-of-00003.gguf) | Q4_0 | 4 | 30.0 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
+ | [Mistral-Large-Instruct-2407-Q4_0-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_0-00003-of-00003.gguf) | Q4_0 | 4 | 9.09 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
+ | [Mistral-Large-Instruct-2407-Q4_K_M-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_M-00001-of-00003.gguf) | Q4_K_M | 4 | 30.0 GB| medium, balanced quality - recommended |
+ | [Mistral-Large-Instruct-2407-Q4_K_M-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_M-00002-of-00003.gguf) | Q4_K_M | 4 | 29.9 GB| medium, balanced quality - recommended |
+ | [Mistral-Large-Instruct-2407-Q4_K_M-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_M-00003-of-00003.gguf) | Q4_K_M | 4 | 13.3 GB| medium, balanced quality - recommended |
+ | [Mistral-Large-Instruct-2407-Q4_K_S-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_S-00001-of-00003.gguf) | Q4_K_S | 4 | 29.9 GB| small, greater quality loss |
+ | [Mistral-Large-Instruct-2407-Q4_K_S-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_S-00002-of-00003.gguf) | Q4_K_S | 4 | 30.0 GB| small, greater quality loss |
+ | [Mistral-Large-Instruct-2407-Q4_K_S-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_K_S-00003-of-00003.gguf) | Q4_K_S | 4 | 9.67 GB| small, greater quality loss |
+ | [Mistral-Large-Instruct-2407-Q5_0-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_0-00001-of-00003.gguf) | Q5_0 | 5 | 30.0 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [Mistral-Large-Instruct-2407-Q5_0-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_0-00002-of-00003.gguf) | Q5_0 | 5 | 30.0 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [Mistral-Large-Instruct-2407-Q5_0-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_0-00003-of-00003.gguf) | Q5_0 | 5 | 24.4 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [Mistral-Large-Instruct-2407-Q5_K_M-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_M-00001-of-00003.gguf) | Q5_K_M | 5 | 29.9 GB| large, very low quality loss - recommended |
+ | [Mistral-Large-Instruct-2407-Q5_K_M-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_M-00002-of-00003.gguf) | Q5_K_M | 5 | 29.7 GB| large, very low quality loss - recommended |
+ | [Mistral-Large-Instruct-2407-Q5_K_M-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_M-00003-of-00003.gguf) | Q5_K_M | 5 | 26.8 GB| large, very low quality loss - recommended |
+ | [Mistral-Large-Instruct-2407-Q5_K_S-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_S-00001-of-00003.gguf) | Q5_K_S | 5 | 30.0 GB| large, low quality loss - recommended |
+ | [Mistral-Large-Instruct-2407-Q5_K_S-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_S-00002-of-00003.gguf) | Q5_K_S | 5 | 30.0 GB| large, low quality loss - recommended |
+ | [Mistral-Large-Instruct-2407-Q5_K_S-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q5_K_S-00003-of-00003.gguf) | Q5_K_S | 5 | 24.4 GB| large, low quality loss - recommended |
+ | [Mistral-Large-Instruct-2407-Q6_K-00001-of-00004.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q6_K-00001-of-00004.gguf) | Q6_K | 6 | 29.9 GB| very large, extremely low quality loss |
+ | [Mistral-Large-Instruct-2407-Q6_K-00002-of-00004.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q6_K-00002-of-00004.gguf) | Q6_K | 6 | 29.8 GB| very large, extremely low quality loss |
+ | [Mistral-Large-Instruct-2407-Q6_K-00003-of-00004.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q6_K-00003-of-00004.gguf) | Q6_K | 6 | 29.8 GB| very large, extremely low quality loss |
+ | [Mistral-Large-Instruct-2407-Q6_K-00004-of-00004.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q6_K-00004-of-00004.gguf) | Q6_K | 6 | 11.1 GB| very large, extremely low quality loss |
  | [Mistral-Large-Instruct-2407-Q8_0.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q8_0.gguf) | Q8_0 | 8 | 7.7 GB| very large, extremely low quality loss - not recommended |
  | [Mistral-Large-Instruct-2407-f16.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-f16.gguf) | f16 | 16 | 14.5 GB| |
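The added rows replace single `.gguf` files with multi-part files named with the `-NNNNN-of-NNNNN` shard suffix used for split GGUF models, so each quantization now spans several sub-30 GB pieces. As a rough illustration of that naming pattern, the sketch below generates the shard filenames for an entry in the table; the `shard_names` helper is purely illustrative and is not part of this repository or any library.

```python
def shard_names(base: str, n_shards: int) -> list[str]:
    """Generate split-GGUF shard filenames for a given base name.

    Shards are numbered starting at 1, zero-padded to five digits,
    matching the -NNNNN-of-NNNNN pattern in the file table above.
    """
    return [
        f"{base}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]

# The Q4_K_M entry above is split into three parts:
for name in shard_names("Mistral-Large-Instruct-2407-Q4_K_M", 3):
    print(name)
# -> Mistral-Large-Instruct-2407-Q4_K_M-00001-of-00003.gguf, ... -00003-of-00003.gguf
```

Tools that consume split GGUF files are typically pointed at the first shard (`...-00001-of-...`) and discover the remaining parts from it, so all shards of a quantization should be downloaded into the same directory.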