Error when trying to run the model using the example code

#2
by Nidhin117 - opened

I am getting this error when I follow the example to test the model:

```
ValueError: The checkpoint you are trying to load has model type tinyllava but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```

Any idea how this can be fixed?
I have also tried installing the latest version of transformers from git; that did not help either.

After comparing against the other TinyLlava models on HF, it turns out that the config.json of TinyLlava-Qwen2 is missing the following key:

```json
"auto_map": {
  "AutoConfig": "configuration.TinyLlavaConfig",
  "AutoModelForCausalLM": ""
},
```

I am able to load the other TinyLlava models and run inference without any issues in my environment.
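As a workaround, the missing `auto_map` entry can be patched into a locally downloaded copy of the checkpoint. This is a minimal sketch, assuming the checkpoint has already been downloaded to a local directory; the `add_auto_map` helper name is hypothetical, and the `AutoModelForCausalLM` value is left empty as in the config snippet above (loading still requires `trust_remote_code=True`).

```python
import json
import os

def add_auto_map(checkpoint_dir):
    """Add the auto_map key (mirroring the other TinyLlava checkpoints on HF)
    to config.json so AutoConfig can resolve the custom tinyllava classes.

    checkpoint_dir: path to a locally downloaded checkpoint (hypothetical).
    """
    cfg_path = os.path.join(checkpoint_dir, "config.json")
    with open(cfg_path) as f:
        cfg = json.load(f)
    # Only add the key if it is missing; do not overwrite an existing auto_map.
    cfg.setdefault("auto_map", {
        "AutoConfig": "configuration.TinyLlavaConfig",
        # Left empty here because the value is also missing in the post above.
        "AutoModelForCausalLM": "",
    })
    with open(cfg_path, "w") as f:
        json.dump(cfg, f, indent=2)
    return cfg
```

After patching, pointing `AutoConfig.from_pretrained(checkpoint_dir, trust_remote_code=True)` at the local directory should at least get past the "does not recognize this architecture" error, assuming the `configuration.py` file is present in the checkpoint.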

Some models on HF are adapted to the old version of the code; the only models adapted to the new version are those in the model zoo on GitHub.
The old-version code lives in the tinyllava_bench branch on GitHub.
