---
license: apache-2.0
language:
- en
pretty_name: HackerNews comments dataset
dataset_info:
  config_name: default
  features:
  - name: id
    dtype: int32
  - name: deleted
    dtype: bool
  - name: type
    dtype: string
  - name: by
    dtype: string
  - name: time
    dtype: int64
  - name: text
    dtype: string
  - name: dead
    dtype: bool
  - name: parent
    dtype: int32
  - name: poll
    dtype: int32
  - name: kids
    sequence: int32
  - name: url
    dtype: string
  - name: score
    dtype: int32
  - name: title
    dtype: string
  - name: parts
    sequence: int32
  - name: descendants
    dtype: int32
configs:
- config_name: default
  data_files:
  - split: train
    path: items/*.jsonl.zst
---
# HackerNews Comments Dataset
A dataset of all [HN API](https://github.com/HackerNews/API) items from `id=0` to `id=41723169`, i.e. from 2006 to 02 Oct 2024. The dataset was built by scraping the HN API according to its official [schema and docs](https://github.com/HackerNews/API). The scraper code is available on GitHub: [nixiesearch/hnscrape](https://github.com/nixiesearch/hnscrape).
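Each record in this dump corresponds to one response from the HN API's `item` endpoint. As a point of reference, here is a minimal sketch of fetching a single item directly from the API; it uses the `requests` library purely for illustration and is not part of the scraper:

```python
import requests

# Fetch one raw item from the HN API; id 46 matches the example payload shown below.
item = requests.get("https://hacker-news.firebaseio.com/v0/item/46.json").json()
print(item["type"], item.get("title"))
```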
## Dataset contents
No cleaning, validation, or filtering was performed: the data files are raw JSON API responses dumped into zstd-compressed JSONL files. An example payload:
```json
{
  "by": "goldfish",
  "descendants": 0,
  "id": 46,
  "score": 4,
  "time": 1160581168,
  "title": "Rentometer: Check How Your Rent Compares to Others in Your Area",
  "type": "story",
  "url": "http://www.rentometer.com/"
}
```
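If you prefer to work with the raw files, each shard can be read with any zstd-aware JSONL reader. Below is a minimal sketch using the `zstandard` package (`pip install zstandard`); the shard name is a placeholder, substitute any file from the `items/` directory:

```python
import io
import json

import zstandard

# Stream-decode one zstd-compressed JSONL shard line by line.
# "items/example.jsonl.zst" is a placeholder, not an actual file name in this dataset.
with open("items/example.jsonl.zst", "rb") as f:
    text = io.TextIOWrapper(zstandard.ZstdDecompressor().stream_reader(f), encoding="utf-8")
    for line in text:
        item = json.loads(line)
        print(item["id"], item["type"])
        break  # print only the first record
```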
## Usage
You can load this dataset directly with the [Hugging Face Datasets](https://github.com/huggingface/datasets/) library:
```python
from datasets import load_dataset

# Repository id assumed from this dataset's name; adjust if the Hub path differs.
hn = load_dataset("nixiesearch/hackernews-comments", split="train")
print(hn[0])
```
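The full dump covers more than 41 million items, so for exploratory work you may want to pass `streaming=True` to `load_dataset` and iterate over the split without downloading it completely first.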
## License
Apache License 2.0.