ValueError: The following `model_kwargs` are not used by the model: ['num_logits_to_keep'] (note: typos in the generate arguments will also show up in this list)

#1
by NeelM0906 - opened

Hello!

I fine-tuned a Mistral 7B model and am trying to run inference on it. I keep running into:

"""
ValueError: The following model_kwargs are not used by the model: ['num_logits_to_keep'] (note: typos in the generate arguments will also show up in this list)
""""
I looked around, but none of the suggested fixes are viable, as some of them require downgrading transformers.

I tried:

```python
if 'num_logits_to_keep' in inputs:
    del inputs['num_logits_to_keep']
```

but that didn't work either.
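For anyone hitting the same error before a library update lands, one generic workaround is to strip any keyword arguments the target callable does not accept before invoking it. The helper below is a hypothetical sketch (the name `drop_unsupported_kwargs` and the stand-in `fake_generate` are not from the thread); it uses `inspect.signature` to filter a kwargs dict, which is how the offending `num_logits_to_keep` key could be removed before calling `generate`:

```python
import inspect

def drop_unsupported_kwargs(fn, kwargs):
    """Return only the kwargs that fn's signature actually accepts.

    Hypothetical helper: strips arguments (like 'num_logits_to_keep')
    that the target callable would reject.
    """
    params = inspect.signature(fn).parameters
    # If fn accepts **kwargs, everything passes through unchanged.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs)
    return {k: v for k, v in kwargs.items() if k in params}

# Stand-in with a fixed signature, playing the role of model.generate:
def fake_generate(input_ids=None, max_new_tokens=20):
    return (input_ids, max_new_tokens)

cleaned = drop_unsupported_kwargs(
    fake_generate,
    {"input_ids": [1, 2], "max_new_tokens": 8, "num_logits_to_keep": 1},
)
print(cleaned)  # {'input_ids': [1, 2], 'max_new_tokens': 8}
```

Note this only hides the symptom: in the actual error, `generate` validates kwargs against the model's `forward`, so the real fix is updating the library, as the reply below confirms.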

I've attached screenshots of my prompt, my inputs, and the error.

Help would be appreciated!
1.png

2.png

Unsloth AI org

@NeelM0906 Just fixed - apologies for the issue! You may need to either download the LoRA adapters and reload them in Colab (you'll have to Disconnect and Delete Runtime in Colab and then restart it), or update Unsloth in the Colab notebook.
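For the "update Unsloth in the Colab" route, a typical approach is to upgrade the pip package in a notebook cell and then restart the runtime so the new version is picked up (exact flags may vary; check Unsloth's own install instructions for your environment):

```shell
# Upgrade the unsloth package in the Colab environment.
# Restart the runtime afterwards so the updated code is loaded.
pip install --upgrade unsloth
```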

Thanks for the quick fix!

NeelM0906 changed discussion status to closed
