FelixChao/Magician-MoE-4x7B
Tags: Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · Merge · text-generation-inference · Inference Endpoints
Merged models: deepseek-ai/deepseek-coder-6.7b-instruct · ise-uiuc/Magicoder-S-CL-7B · WizardLM/WizardMath-7B-V1.0 · WizardLM/WizardCoder-Python-7B-V1.0
License: apache-2.0
Magician-MoE-4x7B/config.json (branch: main)
Commit History
Upload folder using huggingface_hub · 817bcf4 (verified) · FelixChao committed on Jan 17