Runtime error

The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
0it [00:00, ?it/s]
Token will not been saved to git credential helper. Pass `add_to_git_credential=True` if you want to set the git credential as well.
Traceback (most recent call last):
  File "app.py", line 3, in <module>
    from src.v2_for_hf import generate_images
  File "/home/user/app/src/v2_for_hf.py", line 12, in <module>
    login(token=os.environ.get("HF_token"))
  File "/home/user/.conda/envs/torch_env/lib/python3.8/site-packages/huggingface_hub/_login.py", line 109, in login
    _login(token, add_to_git_credential=add_to_git_credential, write_permission=write_permission)
  File "/home/user/.conda/envs/torch_env/lib/python3.8/site-packages/huggingface_hub/_login.py", line 305, in _login
    raise ValueError("Invalid token passed!")
ValueError: Invalid token passed!
ERROR conda.cli.main_run:execute(49): `conda run python app.py` failed. (See above for error)
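The traceback shows `login(token=os.environ.get("HF_token"))` raising `ValueError: Invalid token passed!`. `os.environ.get()` returns `None` when the variable is unset, and `huggingface_hub.login()` rejects that as an invalid token, so a likely cause is a missing or misnamed Space secret (secret names are case-sensitive). A minimal sketch of a clearer guard, assuming the secret really is named `HF_token` as in the traceback (the helper name `get_hf_token` is hypothetical, not from the app's code):

```python
import os

def get_hf_token(var_name: str = "HF_token") -> str:
    """Return the Hugging Face token from the environment, failing loudly.

    If the Space secret is missing or its name is misspelled,
    os.environ.get() returns None and huggingface_hub's login()
    later fails with "Invalid token passed!". Checking here gives
    a more actionable error message instead.
    """
    token = os.environ.get(var_name)
    if not token:
        raise RuntimeError(
            f"{var_name} is not set; add it as a Space secret "
            "(Settings -> Variables and secrets) before calling login()."
        )
    return token

# Once the secret is present and valid, pass it through explicitly:
# from huggingface_hub import login
# login(token=get_hf_token())
```

If the variable is set but the error persists, the token value itself is likely expired or mistyped; regenerating it at huggingface.co/settings/tokens and updating the secret is the usual fix.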
