proper chat template
#1
by
jachermann
- opened
Hi,
I really like the performance of this model! Someone also added it to the Ollama model list: https://ollama.com/cas/discolm-mfto-german. But there seems to be some misconfiguration there; see the format of the stop tokens in params and the missing <|im_end|> in the template.
Maybe the underlying problem is that there is no "chat_template" in tokenizer_config.json. Could you add a proper chat_template so the model works correctly with fine-tuning libraries? That would be great.
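Since the template uses <|im_start|>/<|im_end|> tokens, I assume it follows ChatML. A minimal sketch of what the missing entry in tokenizer_config.json could look like, assuming standard ChatML formatting (please verify against the prompt format the model was actually trained on):

```json
{
  "chat_template": "{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
}
```

With that in place, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` should produce correctly delimited prompts, and downstream tools like Ollama could pick up the right stop tokens.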
Thanks again for this great model!