llama-2-medical-fine-tune / generation_config.json
{
  "_from_model_config": true,
  "do_sample": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 32000,
  "temperature": 0.9,
  "top_p": 0.6,
  "transformers_version": "4.31.0",
  "max_new_tokens": 1000
}
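
For context, a minimal sketch of how a generation_config.json like this is consumed at inference time with the Hugging Face transformers library. The repo id "AK-12/llama-2-medical-fine-tune" and the example prompt are assumptions inferred from this page, not confirmed details.

# Minimal sketch, assuming the repo id is "AK-12/llama-2-medical-fine-tune".
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

repo_id = "AK-12/llama-2-medical-fine-tune"  # assumed repo id

# Loads the sampling defaults defined above: do_sample=True, temperature=0.9,
# top_p=0.6, max_new_tokens=1000, pad_token_id=32000, bos/eos token ids 1/2.
gen_config = GenerationConfig.from_pretrained(repo_id)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("What are the symptoms of anemia?", return_tensors="pt")

# generate() picks these defaults up automatically when the file ships with the
# model; passing generation_config explicitly just makes the dependency visible.
outputs = model.generate(**inputs, generation_config=gen_config)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))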