runtime error

Exit code: 1. Reason:

Downloading shards: 100%|██████████| 3/3 [01:23<00:00, 27.95s/it]
Loading checkpoint shards: 100%|██████████| 3/3 [00:12<00:00, 4.30s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 19, in <module>
    pipe = pipeline("image-to-text", model=model_id, model_kwargs={"quantization_config": quantization_config})
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 994, in pipeline
    tokenizer = AutoTokenizer.from_pretrained(
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 896, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2271, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2505, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 157, in __init__
    super().__init__(
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 115, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: data did not match any variant of untagged enum ModelWrapper at line 277156 column 3
