Fine Tuning for Classification

#129
by MUHAMMAD-SOHAIL-ZZU - opened

Can anyone explain how I fine-tune Mistral 7b for classification purposes?

Mistral AI org

It might be extremely overkill to fine tune Mistral for such tasks, there are models designed specifically for classification after all... is there a reason why you want to use an LLM such as Mistral for that?

Maybe Mistral 7B yields valuable insights or performance improvements compared to more traditional classification models.


https://colab.research.google.com/drive/1Dyauq4kTZoLewQ1cApceUQVNcnnNTzg_?usp=sharing#scrollTo=2eSvM9zX_2d3

Here, just optimize the probability of the yes/no tokens.
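A minimal sketch of that idea in PyTorch (the token ids below are hypothetical placeholders; look up the real ids for "yes"/"no" with your tokenizer): instead of the full language-modeling loss, compute cross-entropy over only the yes/no logits at the final position and backpropagate that.

```python
import torch
import torch.nn.functional as F

# Hypothetical token ids for "yes"/"no"; in practice, look them up
# with the tokenizer of the checkpoint you are fine-tuning.
YES_ID, NO_ID = 5081, 708

def yes_no_loss(logits, labels):
    """Cross-entropy restricted to the yes/no tokens at the last position.

    logits: (batch, seq_len, vocab) next-token logits from the causal LM
    labels: (batch,) with 1 = yes, 0 = no
    """
    last = logits[:, -1, :]            # logits for the token to be generated next
    pair = last[:, [NO_ID, YES_ID]]    # column 0 = "no", column 1 = "yes"
    return F.cross_entropy(pair, labels)

# Toy check with random logits standing in for a real model's output.
torch.manual_seed(0)
logits = torch.randn(4, 10, 32000)
labels = torch.tensor([1, 0, 1, 0])
loss = yes_no_loss(logits, labels)     # optimize this during fine-tuning
```

In a real run, `logits` would come from the Mistral model's forward pass and `loss.backward()` would drive the (LoRA or full) fine-tuning step.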

"It might be extremely overkill to fine tune Mistral for such tasks, there are models designed specifically for classification after all... is there a reason why you want to use an LLM such as Mistral for that?"

Can you leave some names/links? That would be a great help!

"It might be extremely overkill to fine tune Mistral for such tasks, there are models designed specifically for classification after all... is there a reason why you want to use an LLM such as Mistral for that?"

In our organization, we have found that generative LLMs perform best on text classification tasks, outperforming encoder models such as BERT and S-BERT-style models. This article from octo.ai's blog suggests the same:
https://octo.ai/blog/fine-tuned-mistral-7b-delivers-over-60x-lower-costs-and-comparable-quality-to-gpt-4/

"It might be extremely overkill to fine tune Mistral for such tasks, there are models designed specifically for classification after all... is there a reason why you want to use an LLM such as Mistral for that?"

If you only have a small dataset, with something like 200 examples, then decoder models actually outperform BERT: they can be prompted to output "yes" or "no" and achieve decent accuracy right away.
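A minimal sketch of that prompting approach (token ids are hypothetical; use your tokenizer's real ids for "yes"/"no"): end the prompt with something like "Answer yes or no:" and compare the model's next-token logits for the two answer tokens, with no training at all.

```python
import torch

YES_ID, NO_ID = 3, 7  # hypothetical ids; look them up with the real tokenizer

def classify_yes_no(next_token_logits):
    """Label each example 1 ('yes') or 0 ('no') by comparing the two logits."""
    return (next_token_logits[:, YES_ID] > next_token_logits[:, NO_ID]).long()

# Crafted logits standing in for model(prompt_ids).logits[:, -1, :]:
# example 0 favors "yes", example 1 favors "no".
logits = torch.zeros(2, 10)
logits[0, YES_ID], logits[0, NO_ID] = 2.0, -1.0
logits[1, YES_ID], logits[1, NO_ID] = -1.0, 2.0
preds = classify_yes_no(logits)  # tensor([1, 0])
```

Comparing just these two logits sidesteps free-form generation entirely, which is why a handful of few-shot examples in the prompt can already give usable accuracy.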
