jeiku/Aura-MoE-2x4B-v2-Q4_0-GGUF
Format: GGUF
Datasets: jeiku/Writing, FourOhFour/RP_Phase, anthracite-core/full-opus-chosen-hermes-rejected-kto-v1
Language: English
Tags: llama-cpp, gguf-my-repo, Inference Endpoints, conversational
License: apache-2.0
Branch: main
1 contributor, 4 commits
Latest commit: d4373e0 (verified) by jeiku, "Upload ggml-model-Q4_0_4_8.gguf", 9 days ago
File                         Status  Size     Storage  Last commit                                             Updated
.gitattributes               Safe    1.64 kB           Upload ggml-model-Q4_0_4_8.gguf                         9 days ago
README.md                    Safe    1.86 kB           Upload README.md with huggingface_hub                   12 days ago
aura-moe-2x4b-v2-q4_0.gguf   Safe    4.18 GB  LFS      Upload aura-moe-2x4b-v2-q4_0.gguf with huggingface_hub  12 days ago
ggml-model-Q4_0_4_8.gguf     Safe    4.18 GB  LFS      Upload ggml-model-Q4_0_4_8.gguf                         9 days ago
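
The repo is tagged llama-cpp and gguf-my-repo, so the quantized file is meant to be loaded by a llama.cpp-compatible runtime. Below is a minimal sketch of downloading the Q4_0 file and running a chat completion, assuming the huggingface_hub and llama-cpp-python packages are installed; the context size, GPU offload setting, and prompt are illustrative assumptions, not part of this listing.

```python
# Minimal sketch: fetch the Q4_0 GGUF from this repo and run a short chat
# completion through the llama-cpp-python bindings. Repo and file names come
# from the file listing above; runtime settings below are assumptions.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="jeiku/Aura-MoE-2x4B-v2-Q4_0-GGUF",
    filename="aura-moe-2x4b-v2-q4_0.gguf",
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,        # assumed context window; adjust to the model's actual limit
    n_gpu_layers=-1,   # offload all layers if a GPU-enabled build is installed
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a two-sentence scene set in a lighthouse."}],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

The same file also works with the llama.cpp command-line tools directly, which avoids the Python dependency at the cost of handling the chat template yourself.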