Dataset Card for wiki_bio

This is a preprocessed version of the wiki_bio dataset, prepared for benchmarks in LM-Polygraph.

Uses

Direct Use

This dataset should be used for running benchmarks with LM-Polygraph.

Out-of-Scope Use

This dataset is already preprocessed and should not be used as a source for further dataset preprocessing.

Dataset Structure

This dataset contains the "continuation" subset, which corresponds to the main dataset used in LM-Polygraph. It may also contain other subsets, which correspond to the instruct methods used in LM-Polygraph.

Each subset contains two splits: train and test. Each split contains two string columns: "input", which holds the processed input for LM-Polygraph, and "output", which holds the processed output for LM-Polygraph.
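
As a minimal sketch, the snippet below loads the "continuation" subset and prints the first row of each split. The repository ID "LM-Polygraph/wiki_bio" is an assumption used for illustration; replace it with the actual ID of this dataset on the Hub.

```python
from datasets import load_dataset

# Assumed repository ID -- replace with the actual ID of this dataset on the Hub.
REPO_ID = "LM-Polygraph/wiki_bio"

# Load the "continuation" subset described above.
dataset = load_dataset(REPO_ID, "continuation")

# Each split exposes the two string columns consumed by LM-Polygraph.
for split in ("train", "test"):
    example = dataset[split][0]
    print(f"[{split}] input:  {example['input'][:80]}")
    print(f"[{split}] output: {example['output'][:80]}")
```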

Dataset Creation

Curation Rationale

This dataset was created in order to separate dataset-creation code from benchmarking code.

Source Data

Data Collection and Processing

Data is collected from https://huggingface.co/datasets/wiki_bio and processed with the build_dataset.py script in the repository.
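
For illustration only, here is a rough sketch of the kind of transformation such a script might apply; the source column name "target_text" and the 20-word split point are assumptions, not the actual logic of build_dataset.py.

```python
from datasets import load_dataset, DatasetDict

# Sketch only: the real preprocessing lives in build_dataset.py in the repository.
# The source column name ("target_text") and the 20-word split are assumptions.
source = load_dataset("wiki_bio")

def to_polygraph_format(example):
    # Split the biography text into a prompt ("input") and a continuation ("output").
    words = example["target_text"].split()
    return {"input": " ".join(words[:20]), "output": " ".join(words[20:])}

processed = DatasetDict({
    split: source[split].map(to_polygraph_format,
                             remove_columns=source[split].column_names)
    for split in ("train", "test")
})
```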

Who are the source data producers?

The people who created https://huggingface.co/datasets/wiki_bio.

Bias, Risks, and Limitations

This dataset carries the same biases, risks, and limitations as its source dataset, https://huggingface.co/datasets/wiki_bio.

Recommendations

Users should be made aware of the risks, biases, and limitations of the dataset.