MaziyarPanahi/Mistral-Large-Instruct-2407-GGUF
Tags: Text Generation · GGUF · quantized · 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, and 8-bit precision
Mistral-Large-Instruct-2407-GGUF · 1 contributor · History: 17 commits
Latest commit c9d58f4 (verified) by MaziyarPanahi, 5 months ago: b7d220d91c38ab11d70c4835a59c0aa979a6c4eceb4f7a0ad8af0766850c729d
Name · Size · Last commit · Updated

.gitattributes · 3.16 kB · b7d220d91c38ab11d70c4835a59c0aa979a6c4eceb4f7a0ad8af0766850c729d · 5 months ago
Mistral-Large-Instruct-2407.IQ1_M.gguf · 28.4 GB · LFS · Upload folder using huggingface_hub (#2) · 5 months ago
Mistral-Large-Instruct-2407.IQ1_S.gguf · 26 GB · LFS · Upload folder using huggingface_hub (#2) · 5 months ago
Mistral-Large-Instruct-2407.IQ2_XS.gguf · 36.1 GB · LFS · Upload folder using huggingface_hub (#2) · 5 months ago
Mistral-Large-Instruct-2407.IQ3_XS.gguf-00001-of-00007.gguf · 8.32 GB · LFS · 2ffd1a5a864adb861afe786ca6ce43fd3d8443ec9f8499d5461734aead9de9c1 · 5 months ago
Mistral-Large-Instruct-2407.IQ3_XS.gguf-00002-of-00007.gguf · 8.07 GB · LFS · 3cb5a9fa46dc77422042a0927dff3bf2e4fc2d265c5dd2e21d7e2be41318f11c · 5 months ago
Mistral-Large-Instruct-2407.IQ3_XS.gguf-00003-of-00007.gguf · 7.92 GB · LFS · d30cd69fb2e3c8647c8f9b4a472a2f843a429053706a0a158247d3ee8dace5e8 · 5 months ago
Mistral-Large-Instruct-2407.IQ3_XS.gguf-00004-of-00007.gguf · 7.85 GB · LFS · 3127a3330bade9a647ef52a16d3e0cd5fd43aca1566e532a619d0d15b7386ec7 · 5 months ago
Mistral-Large-Instruct-2407.IQ3_XS.gguf-00005-of-00007.gguf · 7.85 GB · LFS · 3b8ee08c400d352156a20d98628ffb9ff05525ad5d59c5330302a5e2517cbeb8 · 5 months ago
Mistral-Large-Instruct-2407.IQ3_XS.gguf-00006-of-00007.gguf · 8.2 GB · LFS · c030531ee3381bb033d98547736cac1183e6e0db443ca2ea536d368795cb7aa7 · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00001-of-00007.gguf · 10.5 GB · LFS · 3fa4015f36b31560decd76ac641f55b47a12a3098a1a005ac9dba8f3d8d64cb1 · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00002-of-00007.gguf · 10.7 GB · LFS · 19ca5a5430ad352ffff6458b05de680965a8ebc92854b3d13c75a5002129b253 · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00003-of-00007.gguf · 10.5 GB · LFS · 295ad5f8c63f0244ea3a14e0f0865ef9bd70e1f0b8ce8d4eb12e6507dd693ded · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00004-of-00007.gguf · 10.4 GB · LFS · ef9541d7f6f75e11a6a6d620dc96a5670fe172fed4d3a04e22310a7c7a7c47f5 · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00005-of-00007.gguf · 10.4 GB · LFS · 8414508202e4446043ab6c3e01fd99d137aca4cd2f2e8bf9837af95cebcd05b2 · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00006-of-00007.gguf · 10.5 GB · LFS · f71c8b397b2f64834b801438feb6726dfd8bc279099453de1ed48f554257a8f9 · 5 months ago
Mistral-Large-Instruct-2407.IQ4_XS.gguf-00007-of-00007.gguf · 2.35 GB · LFS · 21f7276f8a7042fb006ee156cd19e8ddccc2bff6a235a59fb257b8d96a36dce0 · 5 months ago
Mistral-Large-Instruct-2407.Q2_K.gguf · 45.2 GB · LFS · Upload folder using huggingface_hub (#2) · 5 months ago
Mistral-Large-Instruct-2407.Q3_K_L.gguf-00001-of-00007.gguf · 10.4 GB · LFS · b7d220d91c38ab11d70c4835a59c0aa979a6c4eceb4f7a0ad8af0766850c729d · 5 months ago
README.md · 3.06 kB · Create README.md (#3) · 5 months ago
main.log · 22.7 kB · Upload folder using huggingface_hub (#2) · 5 months ago
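
For readers who want to pull one of the quantized files listed above, the snippet below is a minimal sketch using the `huggingface_hub` Python client and `llama-cpp-python`. The choice of the single-file Q2_K quant, the `./models` target directory, the context size, and the example prompt are illustrative assumptions, not instructions from this repository's model card.

```python
from huggingface_hub import hf_hub_download, snapshot_download
from llama_cpp import Llama

REPO_ID = "MaziyarPanahi/Mistral-Large-Instruct-2407-GGUF"

# Fetch one of the single-file quants listed above (Q2_K is not split into parts).
model_path = hf_hub_download(
    repo_id=REPO_ID,
    filename="Mistral-Large-Instruct-2407.Q2_K.gguf",
    local_dir="./models",  # assumed local target directory
)

# For the multi-part quants (e.g. IQ4_XS, split across *-0000N-of-00007.gguf files),
# every part must be present locally; a pattern filter pulls them all in one call.
snapshot_download(
    repo_id=REPO_ID,
    allow_patterns=["Mistral-Large-Instruct-2407.IQ4_XS.gguf-*"],
    local_dir="./models",
)

# Load the single-file GGUF with llama-cpp-python and run a short completion.
llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Summarize what GGUF quantization trades off.", max_tokens=64)
print(out["choices"][0]["text"])
```

How the split parts are recombined (for example by byte-level concatenation or with llama.cpp's gguf-split tool) depends on how they were produced, so the repository's README.md remains the authoritative reference for that step.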