Can't load tokenizer
#1 by vtoth - opened
Hey, I'm trying to run your example code but got the following error:
```
Traceback (most recent call last):
  File "/home/vik/fmri_vis/psych.py", line 6, in <module>
    tokenizer = AutoTokenizer.from_pretrained("af1tang/personaGPT")
  File "/home/vik/fmri_vis/venv/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 608, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/vik/fmri_vis/venv/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1775, in from_pretrained
    return cls._from_pretrained(
  File "/home/vik/fmri_vis/venv/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1930, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/vik/fmri_vis/venv/lib/python3.8/site-packages/transformers/models/gpt2/tokenization_gpt2_fast.py", line 138, in __init__
    super().__init__(
  File "/home/vik/fmri_vis/venv/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 111, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: No such file or directory (os error 2)
```
The tokenizer cannot be loaded locally; the same error also appears on the Model card page.
Solved it by adding `use_fast=False` to `AutoTokenizer.from_pretrained`.
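For reference, a minimal sketch of the workaround (assuming `transformers` is installed and the model repo is reachable); the explanatory comment reflects my reading of the traceback, not a confirmed statement about the repo's contents:

```python
from transformers import AutoTokenizer

# The traceback fails inside the Rust-based "fast" tokenizer loader
# (TokenizerFast.from_file), which suggests the repo is missing the file
# the fast tokenizer needs. Passing use_fast=False falls back to the
# pure-Python GPT-2 tokenizer, which is built from vocab.json/merges.txt.
tokenizer = AutoTokenizer.from_pretrained("af1tang/personaGPT", use_fast=False)

# Sanity check: the slow tokenizer loads and can encode text.
print(tokenizer.encode("hello there"))
```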
vtoth changed discussion status to closed