runtime error
Exit code: 1. Reason: you're using `trust_remote_code=True`, you can get rid of this warning by loading the model with an auto class. See https://huggingface.co/docs/transformers/en/model_doc/auto#auto-classes
- If you are the owner of the model architecture code, please modify your model class such that it inherits from `GenerationMixin` (after `PreTrainedModel`, otherwise you'll get an exception).
- If you are not the owner of the model architecture class, please contact the model code owner to update it.
Warning: Flash attention is not available, using eager attention instead.
generation_config.json: 100%|██████████| 115/115 [00:00<00:00, 708kB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 141, in <module>
    ).eval().cuda()
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3070, in cuda
    return super().cuda(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1050, in cuda
    return self._apply(lambda t: t.cuda(device))
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 900, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 900, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 900, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 927, in _apply
    param_applied = fn(param)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1050, in <lambda>
    return self._apply(lambda t: t.cuda(device))
  File "/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py", line 319, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
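
The traceback shows the crash comes from the `.eval().cuda()` call at app.py line 141: the Space is running on hardware without an NVIDIA GPU or driver, so CUDA initialization fails. Below is a minimal sketch of a device-aware loading path; the model id, dtype choice, and surrounding code are placeholders, since the actual contents of app.py are not shown in the log.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder, not taken from the log

# Only use CUDA when an NVIDIA driver/GPU is actually present,
# otherwise fall back to CPU so torch never tries to initialize CUDA.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # only needed while the repo ships custom modeling code
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).eval().to(device)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
```

Alternatively, assigning GPU hardware to the Space makes `torch.cuda.is_available()` return True and the original `.cuda()` call succeed. Separately, if you maintain the custom modeling code pulled in by `trust_remote_code=True`, the deprecation notice above asks that the model class also inherit from `GenerationMixin`, listed after `PreTrainedModel` (e.g. `class MyModel(PreTrainedModel, GenerationMixin):`).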