MaziyarPanahi/Lumimaid-Magnum-v4-12B-GGUF
Tags: Text Generation, GGUF, mistral, quantized, conversational
Quantization precisions: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit
Files and versions
1 contributor, 7 commits
Latest commit: 270ade8 (verified), by MaziyarPanahi, 4 days ago
Commit message: eb9852fd0ad0dc2978ef11fa4dfcc0c783e0fe355804430a745211312a9e2ee6
File                                       Size       LFS    Updated       Last commit
.gitattributes                             1.94 kB    -      4 days ago    eb9852fd0ad0dc2978ef11fa4dfcc0c783e0fe355804430a745211312a9e2ee6
Lumimaid-Magnum-v4-12B-GGUF_imatrix.dat    7.05 MB    LFS    4 days ago    eb9852fd0ad0dc2978ef11fa4dfcc0c783e0fe355804430a745211312a9e2ee6
Lumimaid-Magnum-v4-12B.Q5_K_M.gguf         8.73 GB    LFS    4 days ago    1230ff63babeeeeba23dedff51300a148651faeb0c54c6ded6c2d2b5b31fd27f
Lumimaid-Magnum-v4-12B.Q5_K_S.gguf         8.52 GB    LFS    4 days ago    4094bf5aef1c8bc66192422b23698d656b7c88007f0c57e3efefaf550715f388
Lumimaid-Magnum-v4-12B.Q6_K.gguf           10.1 GB    LFS    4 days ago    a7bad517e94ad392fd81332900a9c2639b79b282bb73fe10f6a662d08833440c
Lumimaid-Magnum-v4-12B.Q8_0.gguf           13 GB      LFS    4 days ago    1626cde9193067634e60f554b354e29a7562f7123c4b7f7fcf6ca7564e681334
Lumimaid-Magnum-v4-12B.fp16.gguf           24.5 GB    LFS    4 days ago    0c544d45a1b3ab95fe275e20ade6fdcbe6b93082dc10bd56cca9eccfd9714e16
README.md                                  2.97 kB    -      4 days ago    eb9852fd0ad0dc2978ef11fa4dfcc0c783e0fe355804430a745211312a9e2ee6
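
The quantized GGUF files above can be loaded with llama.cpp or any of its bindings. Below is a minimal sketch, assuming the huggingface_hub and llama-cpp-python packages are installed; the choice of the Q5_K_M file, the prompt, and the context size are illustrative, not a recommendation from this repository.

# Minimal usage sketch (assumes: pip install huggingface_hub llama-cpp-python).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the Q5_K_M quantization (8.73 GB, stored via Git LFS) into the local
# Hugging Face cache and get back the path to the cached file.
model_path = hf_hub_download(
    repo_id="MaziyarPanahi/Lumimaid-Magnum-v4-12B-GGUF",
    filename="Lumimaid-Magnum-v4-12B.Q5_K_M.gguf",
)

# Load the GGUF file with llama.cpp's Python bindings and generate a short completion.
llm = Llama(model_path=model_path, n_ctx=4096)
output = llm("Q: Name a use for quantized language models. A:", max_tokens=64)
print(output["choices"][0]["text"])

The smaller quantizations (Q5_K_S, Q6_K) trade a little quality for less disk and memory use, while Q8_0 and fp16 are larger and closer to the original weights; any of the listed filenames can be substituted in the download call above.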