chat template discussion (#14), opened by nassimab
Tried running the model with `apply_chat_template`, using an open-ended, multi-turn conversation style.
On the first turn, the processed prompt from `apply_chat_template` looks like a valid format: the system message comes first, followed by the user input / image tag.
After appending further user/assistant turns, however, the processed prompt seems to lose the correct format: the system message is no longer rendered first; instead the assistant message comes first, and then the system message. I haven't dug into it much, but I'm posting here to see whether anyone else has hit this, and whether it's intended behavior or a bug.
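For reference, here is a minimal sketch of the kind of call I mean. The model id and message contents are placeholders, and the exact content format (string vs. list of `{"type": ...}` blocks) depends on this model's chat template, so treat it as illustrative rather than an exact repro:

```python
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("org/model-name")  # placeholder model id

# First turn: system message, then a user message containing an image tag.
messages = [
    {"role": "system", "content": [{"type": "text", "text": "You are a helpful assistant."}]},
    {"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": "What is in this image?"},
    ]},
]
print(processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
# Rendered prompt starts with the system message, then the user/image turn: looks correct.

# Follow-up turns: append the assistant reply and another user message.
messages += [
    {"role": "assistant", "content": [{"type": "text", "text": "It shows a cat."}]},
    {"role": "user", "content": [{"type": "text", "text": "What color is it?"}]},
]
print(processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
# This is where the rendered prompt no longer begins with the system message;
# the assistant turn appears before it.
```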