FelixChao / Magician-MoE-4x7B

Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · Merge
Merged models: deepseek-ai/deepseek-coder-6.7b-instruct, ise-uiuc/Magicoder-S-CL-7B, WizardLM/WizardMath-7B-V1.0, WizardLM/WizardCoder-Python-7B-V1.0
text-generation-inference · Inference Endpoints
License: apache-2.0
main · Magician-MoE-4x7B / added_tokens.json
FelixChao · Upload folder using huggingface_hub · commit 817bcf4 (verified) · 10 months ago
87 Bytes
{
  "▁<EOT>": 32003,
  "▁<MID>": 32001,
  "▁<PRE>": 32000,
  "▁<SUF>": 32002
}
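These four entries are the fill-in-the-middle (FIM) special tokens that CodeLlama-style tokenizers (such as the one used by the merged ise-uiuc/Magicoder-S-CL-7B) append directly after the 32,000-token base Llama vocabulary; the leading character is assumed here to be the SentencePiece word-boundary marker "▁" (U+2581). A minimal sketch of how this mapping can be parsed and sanity-checked, using only the file contents shown above:

```python
import json

# The added_tokens.json contents as shown above, reconstructed under the
# assumption that the leading character of each key is the SentencePiece
# marker "▁" (U+2581).
ADDED_TOKENS = """
{
  "▁<EOT>": 32003,
  "▁<MID>": 32001,
  "▁<PRE>": 32000,
  "▁<SUF>": 32002
}
"""

tokens = json.loads(ADDED_TOKENS)

# The four IDs are contiguous and start at the base Llama vocab size
# (32000), so the extended vocabulary ends at 32003 inclusive.
base_vocab_size = 32000
assert sorted(tokens.values()) == list(
    range(base_vocab_size, base_vocab_size + len(tokens))
)

# Invert the mapping to look a token string up by its ID.
id_to_token = {v: k for k, v in tokens.items()}
print(id_to_token[32000])  # the FIM prefix token
```

This is only an illustration of the file's structure; in practice the `transformers` tokenizer for the model loads this file automatically and exposes the same mapping.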