runtime error
Exit code: 1. Reason:
/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py:896: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
config.json: 100%|██████████| 1.51k/1.51k [00:00<00:00, 12.2MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    generator = pipeline("text2text-generation", model="LahiruProjects/recipe-generator-flan-t5")
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 779, in pipeline
    framework, model = infer_framework_load_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 271, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model LahiruProjects/recipe-generator-flan-t5 with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForSeq2SeqLM'>, <class 'transformers.models.t5.modeling_t5.T5ForConditionalGeneration'>).
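For context, the traceback points at line 8 of app.py, which builds the pipeline directly from the Hub repo id. A common cause of this ValueError is that the repository does not contain weight files the installed framework can load (for example only TensorFlow/Flax weights on a PyTorch-only image, or an adapter checkpoint instead of full model weights), but the log above does not confirm which case applies here. Below is a minimal sketch of the failing call plus an explicit load that usually surfaces a more specific error; the example prompt is hypothetical, not taken from the Space.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

MODEL_ID = "LahiruProjects/recipe-generator-flan-t5"

# Loading tokenizer and model explicitly usually raises a more specific error
# (e.g. a missing pytorch_model.bin / model.safetensors) than the pipeline's
# generic "Could not load model ..." ValueError seen in the traceback.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

# Equivalent to the failing call at app.py line 8, but with pre-loaded objects.
generator = pipeline("text2text-generation", model=model, tokenizer=tokenizer)

# Hypothetical prompt; the model's expected input format is not shown in the logs.
print(generator("ingredients: chicken, rice, garlic")[0]["generated_text"])
```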