Query for Magic Explanation
#8
by markyfsun · opened
Thank you for sharing this amazing work! I found something really interesting and wonder if you would be willing to share the trick behind it:
- In zero-shot use, this model behaves like an instruct model. However, when I put it into a role-play preset, it continues the conversation well. Is this an instruct-finetuned model?
- I tested Chinese role-play as well, and the model generates gorgeous Chinese text. I have never seen such wonderful output, not even from 34B+ models. Did you build your own high-quality datasets?
In any case, I would greatly appreciate it if you would be kind enough to share your finetuning process, especially the idea behind "FusionNet".
markyfsun changed discussion status to closed