Text Classification · Safetensors · deberta-v2
catherinearnett committed
Commit ea24df2 · 1 parent: b8a417e

Update GitHub link

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -27,8 +27,8 @@ It classfies toxicity along five dimension:
  * **Violence and abuse**: overly graphic descriptions of violence, threats of violence, or calls or incitement of violence.
 
 
- Read more about the training details in the paper, [Toxicity of the Commons: Curating Open-Source Pre-Training Data](https://arxiv.org/pdf/2410.22587) by [Catherine Arnett](https://huggingface.co/catherinearnett), [Eliot Jones](https://huggingface.co/eliotj), Ivan P. Yamshchikov, [Pierre-Carl Langlais](https://huggingface.co/Pclanglais).
- For more detailed code regarding generating the annotations in Toxic Commons, training the model, and using the model, please refer to the official [GitHub](https://github.com/eliotjones1/celadon) repository.
+ Read more about the training details in the paper, [Toxicity of the Commons: Curating Open-Source Pre-Training Data](https://arxiv.org/pdf/2410.22587) by [Catherine Arnett](https://huggingface.co/catherinearnett), [Eliot Jones](https://huggingface.co/eliotj), [Ivan P. Yamshchikov](https://huggingface.co/ivan-the-bearable), [Pierre-Carl Langlais](https://huggingface.co/Pclanglais).
+ For more detailed code regarding generating the annotations in Toxic Commons, training the model, and using the model, please refer to the official [GitHub](https://github.com/Pleias/toxic-commons) repository.
 
 
  # How to Use
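
For orientation only, a minimal sketch of how a DeBERTa-v2 toxicity classifier like this one might be queried with the transformers library. The model ID below is a placeholder, and the standard single-head sequence-classification setup is an assumption; the README's own "How to Use" section and the linked GitHub repository hold the authoritative usage code.

```python
# Hedged sketch: querying a DeBERTa-v2 text classifier via transformers.
# The model ID is a placeholder and the head layout is assumed, not taken
# from this model's actual config.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "your-org/your-toxicity-classifier"  # placeholder, not the real model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example passage to screen for toxicity."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to class probabilities (assumes a softmax-style head).
probs = torch.softmax(logits, dim=-1)
print(probs)
```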