Help: error using model space

by Amitontheweb - opened

Hi, I started getting this error today when accessing my Space: Amitontheweb/InstaoffyzFreeParaphraser

What should I do? I'm new to coding, so please help. Thanks.


Runtime error

Space failed. Exit code: 1. Reason: alidators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "/home/user/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1349, in hf_hub_download
raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/app/app.py", line 12, in
tokenizer = AutoTokenizer.from_pretrained("humarin/chatgpt_paraphraser_on_T5_base")
File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 716, in from_pretrained
config = AutoConfig.from_pretrained(
File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1034, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/user/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 620, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/user/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 675, in _get_config_dict
resolved_config_file = cached_file(
File "/home/user/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 469, in cached_file
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like humarin/chatgpt_paraphraser_on_T5_base is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
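For what it's worth, the OSError is a download failure: the Space could not reach the Hub to fetch config.json for humarin/chatgpt_paraphraser_on_T5_base. A minimal sketch for reproducing the failing call outside the Space (assuming transformers is installed locally; the model ID and the failing line come from the traceback above):

```python
# Reproduce app.py line 12 locally: if this also raises an OSError,
# the problem is connectivity or the model ID, not the Space itself.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("humarin/chatgpt_paraphraser_on_T5_base")
print(type(tokenizer).__name__)  # e.g. T5TokenizerFast once the download succeeds
```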

Resolved!

I had set temperature values without enabling sampling; deleting them fixed the Space. The following log message helped sort this out:

Caching examples at: '/home/user/app/gradio_cached_examples/17'
Caching example 1/4
/home/user/.local/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:381: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0.7` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
warnings.warn(
Caching example 2/4
Caching example 3/4
Caching example 4/4
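For anyone hitting the same warning: `temperature` only takes effect in sampling-based decoding, so you either remove it or turn sampling on. A minimal sketch of both options, assuming a typical generate() call for this model (the actual app.py code isn't shown in this thread):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "humarin/chatgpt_paraphraser_on_T5_base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("paraphrase: The weather is nice today.", return_tensors="pt")

# Option 1: drop temperature and stick with deterministic (beam) decoding.
out = model.generate(**inputs, num_beams=4, max_new_tokens=64)

# Option 2: keep temperature but enable sampling, so the flag is actually used.
out = model.generate(**inputs, do_sample=True, temperature=0.7, max_new_tokens=64)

print(tokenizer.decode(out[0], skip_special_tokens=True))
```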

Amitontheweb changed discussion status to closed
