Runtime error

Exit code: 1. Reason:

.safetensors: 100%|█████████▉| 651M/651M [00:01<00:00, 385MB/s]
tokenizer_config.json: 0%| | 0.00/396 [00:00<?, ?B/s]
tokenizer_config.json: 100%|██████████| 396/396 [00:00<00:00, 2.75MB/s]
vocab.txt: 0%| | 0.00/1.22M [00:00<?, ?B/s]
vocab.txt: 100%|██████████| 1.22M/1.22M [00:00<00:00, 29.5MB/s]
tokenizer.json: 0%| | 0.00/2.81M [00:00<?, ?B/s]
tokenizer.json: 100%|██████████| 2.81M/2.81M [00:00<00:00, 75.2MB/s]
special_tokens_map.json: 0%| | 0.00/125 [00:00<?, ?B/s]
special_tokens_map.json: 100%|██████████| 125/125 [00:00<00:00, 1.09MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 6, in <module>
    tokenizer = AutoTokenizer.from_pretrained('SeyedAli/Persian-Text-Sentiment-Bert-V1',add_special_tokens=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 934, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2036, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2276, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/bert/tokenization_bert_fast.py", line 89, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 178, in __init__
    super().__init__(**kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1407, in __init__
    raise AttributeError(f"{key} conflicts with the method {key} in {self.__class__.__name__}")
AttributeError: add_special_tokens conflicts with the method add_special_tokens in BertTokenizerFast
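The traceback points at line 6 of app.py: `add_special_tokens=True` is being passed to `AutoTokenizer.from_pretrained`. Any keyword argument the tokenizer's `__init__` does not consume is stored as an attribute on the tokenizer, and `add_special_tokens` collides with the tokenizer method of the same name, hence the `AttributeError`. The likely fix is to drop the kwarg from `from_pretrained`; `add_special_tokens` is an argument of the tokenizer *call* (where it already defaults to True). A minimal sketch of the corrected line, using the model id from the log above (the Persian sample sentence is illustrative):

```python
from transformers import AutoTokenizer

# Load the tokenizer WITHOUT add_special_tokens here -- it is not an
# init kwarg, and passing it triggers the AttributeError seen above.
tokenizer = AutoTokenizer.from_pretrained(
    "SeyedAli/Persian-Text-Sentiment-Bert-V1"
)

# Special tokens ([CLS]/[SEP]) are added when you call the tokenizer;
# add_special_tokens=True is the default, shown explicitly here.
encoded = tokenizer("این یک متن نمونه است", add_special_tokens=True)
print(encoded["input_ids"])
```

If you do need to suppress special tokens for a particular encoding, pass `add_special_tokens=False` at call time instead of at load time.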
