File size: 94 Bytes
{
  "_from_model_config": true,
  "max_new_tokens": 128,
  "transformers_version": "4.45.2"
}
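
This appears to be a transformers generation config (typically stored as generation_config.json): it was derived from the model config, caps generation at 128 new tokens, and was written by transformers 4.45.2. As a minimal sketch of how such a file is consumed, the config can be loaded explicitly with GenerationConfig.from_pretrained and passed to generate. The repo id below (your-org/your-model) is a hypothetical placeholder, not taken from the source.

# Minimal sketch, assuming a hypothetical repo id that contains this config file.
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

model_id = "your-org/your-model"  # assumption: placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Reads the repo's generation_config.json; max_new_tokens defaults to 128 here.
gen_config = GenerationConfig.from_pretrained(model_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, generation_config=gen_config)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))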