Add support for chat template in llama.cpp

#1
opened by amgadhasan
OuteAI org

Hi,
Use the monarch chat template with llama.cpp:
https://github.com/ggerganov/llama.cpp/wiki/Templates-supported-by-llama_chat_apply_template

Usage: ./server -m ... --chat-template monarch
The monarch template (used by mlabonne/AlphaMonarch-7B) formats a conversation like this:
<s>system
test</s>
<s>user
hello</s>
<s>assistant
response</s>
<s>user
again</s>
<s>assistant
response</s>
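
With the server started as in the usage line above, here is a minimal sketch of a chat request against it. The host, port, and generation parameters are assumptions (8080 is llama.cpp's default port), and it relies on the server build exposing the OpenAI-compatible /v1/chat/completions endpoint, which applies the monarch template shown above before inference:

```python
import requests

# Assumptions: the server was launched as in the usage line above and listens
# on the default 127.0.0.1:8080; the OpenAI-compatible chat endpoint applies
# the selected chat template (here: monarch) server-side.
resp = requests.post(
    "http://127.0.0.1:8080/v1/chat/completions",
    json={
        "messages": [
            {"role": "system", "content": "test"},
            {"role": "user", "content": "hello"},
        ],
        "temperature": 0.7,  # illustrative values, not from this thread
        "max_tokens": 128,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```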
OuteAI org

I've updated the quants. It should now detect the template properly.
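
llama.cpp picks up the template automatically from the tokenizer.chat_template key in the GGUF metadata, so one way to confirm the updated quants carry it is to read that key back with the gguf-py reader. This is a rough sketch only: the filename is a placeholder and the field-decoding detail assumes gguf-py's current field layout.

```python
from gguf import GGUFReader  # gguf-py, ships with the llama.cpp repo

# Placeholder path; point this at one of the updated quant files.
reader = GGUFReader("model.Q4_K_M.gguf")

field = reader.fields.get("tokenizer.chat_template")
if field is None:
    print("No embedded template; pass --chat-template monarch explicitly.")
else:
    # Assumption: for a GGUF string field, the last data index points at
    # the raw UTF-8 bytes of the value.
    print(bytes(field.parts[field.data[-1]]).decode("utf-8"))
```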

edwko locked this discussion
