The downloader doesn't seem to follow redirects; instead it gets:

% curl https://nlp.stanford.edu/data/coqa/coqa-dev-v1.0.json 
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="https://downloads.cs.stanford.edu/nlp/data/coqa/coqa-dev-v1.0.json">here</a>.</p>
<hr>
<address>Apache/2.2.15 (CentOS) Server at nlp.stanford.edu Port 443</address>
</body></html>

with the result that:

>>> datasets.load_dataset(path="EleutherAI/coqa", trust_remote_code=True)
...
File ~/.cache/huggingface/modules/datasets_modules/datasets/EleutherAI--coqa/9ee0502938f8c0d66e6801438f1d1814a99e5a6cbe1e4298869bfb62c2c7596d/coqa.py:183, in Coqa._generate_examples(self, filepath, split)
    182 with open(filepath, encoding="utf-8") as f:
--> 183     data = json.load(f)
    184     for row in data["data"]:
...
JSONDecodeError: Expecting value: line 1 column 1 (char 0)
...
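
For what it's worth, the failure is reproducible outside of datasets with a few lines of standard library code. This is just a diagnostic sketch (the URLs come from the curl output above): a raw request to the old nlp.stanford.edu URL, with no redirect handling, returns the 301 HTML page, which is what ends up in the cache and trips json.load.

import http.client
import json
import urllib.request

# Raw GET to the old host with no redirect handling, mimicking the failing downloader.
conn = http.client.HTTPSConnection("nlp.stanford.edu")
conn.request("GET", "/data/coqa/coqa-dev-v1.0.json")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 plus the downloads.cs.stanford.edu URL
body = resp.read().decode("utf-8", errors="replace")
conn.close()

try:
    json.loads(body)  # reproduces the JSONDecodeError from the traceback above
except json.JSONDecodeError as err:
    print(err)

# Fetching the new URL directly (urllib follows redirects by default) returns valid JSON.
new_url = "https://downloads.cs.stanford.edu/nlp/data/coqa/coqa-dev-v1.0.json"
with urllib.request.urlopen(new_url) as f:
    data = json.loads(f.read().decode("utf-8"))
print(len(data["data"]))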
hails changed pull request status to merged

I have been getting 503 errors for this dataset lately:

File /opt/homebrew/lib/python3.11/site-packages/datasets/utils/file_utils.py:575, in get_from_cache(url, cache_dir, force_download, proxies, etag_timeout, resume_download, user_agent, local_files_only, use_etag, max_retries, token, use_auth_token, ignore_url_params, storage_options, download_desc)
    573     raise ConnectionError(f"Couldn't reach {url} ({repr(head_error)})")
    574 elif response is not None:
--> 575     raise ConnectionError(f"Couldn't reach {url} (error {response.status_code})")
    576 else:
    577     raise ConnectionError(f"Couldn't reach {url}")

ConnectionError: Couldn't reach https://downloads.cs.stanford.edu/nlp/data/coqa/coqa-dev-v1.0.json (error 503)

The link is reachable otherwise, but not via HF.
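
In case the 503 is intermittent throttling on the Stanford mirror rather than a hard block, one thing worth trying is raising the retry count on the download. This is a sketch only, not a confirmed fix; download_config and max_retries are standard datasets options, but whether retrying actually gets past the 503 is an assumption.

import datasets

# Retry the Stanford URL a few times before giving up, in case the 503 is transient.
cfg = datasets.DownloadConfig(max_retries=5)
ds = datasets.load_dataset(
    "EleutherAI/coqa",
    trust_remote_code=True,
    download_config=cfg,
)
print(ds)

If the 503 persists no matter what, downloading the JSON files manually and loading them locally may be the only way around it.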
