smol_llama-220M-open_instruct / generation_config.json
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "transformers_version": "4.36.2",
  "use_cache": false
}
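
A minimal sketch of how this file is typically consumed with the Hugging Face transformers library; the repo id "pszemraj/smol_llama-220M-open_instruct" is an assumption inferred from the file path above.

from transformers import GenerationConfig

# Load generation_config.json from the Hub (repo id assumed, see note above)
gen_config = GenerationConfig.from_pretrained("pszemraj/smol_llama-220M-open_instruct")

# Fields mirror the JSON shown above
print(gen_config.bos_token_id)  # 1
print(gen_config.eos_token_id)  # 2
print(gen_config.use_cache)     # False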