kera7 committed
Commit 3b1db94 · verified · 1 Parent(s): 4784bfa

Added brief description

Data released for the EMNLP 2023 paper "[Why Should This Article Be Deleted? Transparent Stance Detection in Multilingual Wikipedia Editor Discussions](https://aclanthology.org/2023.emnlp-main.361/)"

A pre-print version of the paper can be found here: [Arxiv](https://arxiv.org/abs/2310.05779)

## Column descriptions

- *title* - Title of the Wikipedia page under consideration for deletion
- *username* - Wikipedia username of the author of the comment
- *timestamp* - Timestamp of the comment
- *decision* - Stance label for the comment in the original language
- *comment* - Text of the deletion discussion comment by a Wikipedia editor
- *topic* - Topic for the stance task (usually "Deletion of [Title]")
- *en_label* - English translation of the stance label in *decision*
- *policy* - Wikipedia policy code relevant to the comment
- *policy_title* - Title of the Wikipedia policy relevant to the comment
- *policy_index* - Index of the Wikipedia policy (specific to our dataset)
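
As a minimal sketch of how these columns can be accessed with the `datasets` library (the repository id and the `"train"` split name below are placeholders, not part of this release):

```python
# Hedged sketch: load the dataset and print the columns described above.
# "user/wiki-deletion-discussions" is a placeholder repository id, and the
# "train" split name is an assumption; adjust both to the actual repository.
from datasets import load_dataset

dataset = load_dataset("user/wiki-deletion-discussions")
example = dataset["train"][0]  # each record is a dict keyed by column name

for column in ["title", "username", "timestamp", "decision", "comment",
               "topic", "en_label", "policy", "policy_title", "policy_index"]:
    print(f"{column}: {example.get(column)}")
```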

## Citation
If you find our dataset helpful, please cite our work as follows:
```
@inproceedings{kaffee-etal-2023-article,
title = "Why Should This Article Be Deleted? Transparent Stance Detection in Multilingual {W}ikipedia Editor Discussions",
author = "Kaffee, Lucie-Aim{\'e}e and
Arora, Arnav and
Augenstein, Isabelle",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.361",
doi = "10.18653/v1/2023.emnlp-main.361",
pages = "5891--5909",
abstract = "The moderation of content on online platforms is usually non-transparent. On Wikipedia, however, this discussion is carried out publicly and editors are encouraged to use the content moderation policies as explanations for making moderation decisions. Currently, only a few comments explicitly mention those policies {--} 20{\%} of the English ones, but as few as 2{\%} of the German and Turkish comments. To aid in this process of understanding how content is moderated, we construct a novel multilingual dataset of Wikipedia editor discussions along with their reasoning in three languages. The dataset contains the stances of the editors (keep, delete, merge, comment), along with the stated reason, and a content moderation policy, for each edit decision. We demonstrate that stance and corresponding reason (policy) can be predicted jointly with a high degree of accuracy, adding transparency to the decision-making process. We release both our joint prediction models and the multilingual content moderation dataset for further research on automated transparent content moderation.",
}
```

Files changed (1)
  1. README.md +10 -1
README.md CHANGED
@@ -1,3 +1,12 @@
  ---
  license: cc-by-sa-4.0
- ---
+ task_categories:
+ - text-classification
+ language:
+ - en
+ - de
+ - tr
+ pretty_name: Wikipedia Deletion Discussions with stance and policy labels
+ size_categories:
+ - 100K<n<1M
+ ---
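
The added YAML frontmatter is the dataset card metadata the Hub uses for discovery (task, languages, size). As a hedged sketch, assuming a placeholder repository id, the same metadata can also be read programmatically with `huggingface_hub`:

```python
# Hedged sketch: read the dataset card metadata introduced by this commit.
# "user/wiki-deletion-discussions" is a placeholder repository id.
from huggingface_hub import DatasetCard

card = DatasetCard.load("user/wiki-deletion-discussions")
print(card.data.pretty_name)      # Wikipedia Deletion Discussions with stance and policy labels
print(card.data.task_categories)  # ['text-classification']
print(card.data.language)         # ['en', 'de', 'tr']
print(card.data.size_categories)  # ['100K<n<1M']
```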