KeyError: Mixtral

#96
by jdjayakaran - opened

When I try to use the model, I get KeyError: 'mixtral'.

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:748, in _LazyConfigMapping.__getitem__(self, key)
746 return self._extra_content[key]
747 if key not in self._mapping:
--> 748 raise KeyError(key)
749 value = self._mapping[key]
750 module_name = model_type_to_module_name(key)

KeyError: 'mixtral'

Successfully installed huggingface-hub-0.20.2 tokenizers-0.15.0 transformers-4.36.2
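
As a quick sanity check (just a sketch, using the same CONFIG_MAPPING that raises the KeyError in the traceback above), you can inspect what the running interpreter actually sees. Mixtral support landed in transformers 4.36.0, so with 4.36.2 installed a False here most likely means the interpreter is still importing an older install:

import transformers
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

# Version loaded by this interpreter (may differ from what pip just installed
# if a notebook kernel has not been restarted)
print(transformers.__version__)

# If the loaded version predates Mixtral support, 'mixtral' is missing from the
# auto-config mapping and AutoConfig raises KeyError('mixtral')
print("mixtral" in CONFIG_MAPPING)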

Mistral AI_ org

Could you give us the code?

Just running the instructions as given, I get the error. I also tried the 4-bit quantization (sketched after the snippet below).

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Load the weights in float16 and move the model to GPU 0
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to(0)

text = "Hello my name is"
inputs = tokenizer(text, return_tensors="pt").to(0)

outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
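
For reference, the 4-bit attempt looked roughly like this (a sketch of the standard bitsandbytes path, assuming the bitsandbytes package is installed). It fails with the same KeyError, since the error is raised by AutoConfig before any quantization settings are applied:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# 4-bit NF4 quantization via bitsandbytes; compute is done in float16
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)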

Mistral AI_ org

Usually this is caused by transformers not being up to date. If you have already installed the recent version, reload the kernel (Restart) if you are using Jupyter or some other kind of notebook.
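
A quick way to check which version the running kernel has actually loaded (just a sketch; if it prints an older version than the one pip just installed, the kernel is still holding the previously imported package and needs a restart):

import transformers

# Version and install location seen by this interpreter; an old version here
# despite a fresh pip install means the kernel has not been restarted yet
print(transformers.__version__)
print(transformers.__file__)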

Perfect, will try.

Thanks for the immediate response

Mistral AI_ org

Just trying to help people out! ;)

Hi @jdjayakaran ,
Could you please let me know if the fix suggested by @GreyForever worked for you? I am facing a similar issue as well.

Feb 1, 2024 12:57:32 PM INFO: Traceback (most recent call last):
Feb 1, 2024 12:57:32 PM INFO: File "script", line 103, in rm_main
Feb 1, 2024 12:57:32 PM INFO: model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=data_type, revision=revision)
Feb 1, 2024 12:57:32 PM INFO: File "/opt/anaconda3/envs/rm_genai2/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
Feb 1, 2024 12:57:32 PM INFO: config, kwargs = AutoConfig.from_pretrained(
Feb 1, 2024 12:57:32 PM INFO: File "/opt/anaconda3/envs/rm_genai2/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1064, in from_pretrained
Feb 1, 2024 12:57:32 PM INFO: config_class = CONFIG_MAPPING[config_dict["model_type"]]
Feb 1, 2024 12:57:32 PM INFO: File "/opt/anaconda3/envs/rm_genai2/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 761, in __getitem__
Feb 1, 2024 12:57:32 PM INFO: raise KeyError(key)
Feb 1, 2024 12:57:32 PM INFO:
Feb 1, 2024 12:57:32 PM INFO: KeyError: 'mixtral' (script, line 103)

pip install -U transformers should fix the issue
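
Once the upgrade is in place (and the kernel restarted), a cheap way to confirm the fix is to resolve just the model config, which downloads only config.json rather than the weights. A minimal sketch:

from transformers import AutoConfig

# Resolves the model type without downloading the weights; with a recent
# transformers this returns a MixtralConfig instead of raising KeyError
config = AutoConfig.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")
print(type(config).__name__)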
