---
dataset_info:
  features:
  - name: TEXT
    dtype: string
  - name: SOURCE
    dtype: string
  - name: METADATA
    dtype: string
  splits:
  - name: train
    num_bytes: 5070809373
    num_examples: 2525369
  download_size: 1109211246
  dataset_size: 5070809373
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- fa
pretty_name: Farsi Wikipedia
size_categories:
- 1M<n<10M
---
# Dataset Card for "fa-wikipedia"
This is a converted version of [wikipedia-fa](https://www.kaggle.com/datasets/amirpourmand/fa-wikipedia), reformatted to comply with Open-Assistant standards. Each record contains three string columns: `TEXT`, `SOURCE`, and `METADATA`.
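A minimal loading sketch with the 🤗 `datasets` library; the repository id `amirpourmand/fa-wikipedia` below is an assumption and should be replaced with the actual Hub repo id for this card:

```python
from datasets import load_dataset

# Repo id is a placeholder -- substitute the actual Hub repository id.
ds = load_dataset("amirpourmand/fa-wikipedia", split="train")

# The three string columns declared in the YAML header above.
print(ds.features)            # TEXT, SOURCE, METADATA (all strings)
print(ds[0]["TEXT"][:200])    # first 200 characters of the first article
```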
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)