ibivibiv/giant-hydra-moe-240b
Tags: Text Generation · Transformers · Safetensors · English · mixtral · Mixture of Experts · moerge · text-generation-inference · Inference Endpoints
arxiv: 1910.09700
License: llama2
giant-hydra-moe-240b/generation_config.json (at e5c8832)
ibivibiv — Upload MixtralForCausalLM — commit 4f1200d (verified) — 11 months ago
File size: 154 Bytes
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "transformers_version": "4.37.2",
  "use_cache": false
}
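As a minimal sketch of how these fields are consumed, the snippet below parses the config shown above with the standard-library `json` module and reads out the special-token ids. (In practice the `transformers` library loads this file for you, e.g. via `GenerationConfig.from_pretrained`; parsing it by hand here is purely illustrative.)

```python
import json

# generation_config.json contents, reproduced verbatim from the file above.
raw = """
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "transformers_version": "4.37.2",
  "use_cache": false
}
"""

config = json.loads(raw)

# bos/eos mark the beginning and end of a generated sequence;
# pad fills batches of unequal length. use_cache=false disables the
# key/value cache during generation, trading speed for memory.
print(config["bos_token_id"])   # beginning-of-sequence token id
print(config["eos_token_id"])   # end-of-sequence token id
print(config["pad_token_id"])   # padding token id
print(config["use_cache"])      # KV-cache setting
```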