j-hartmann committed • Commit 87ce035 • 1 Parent(s): 46f4d34

Update README.md

README.md CHANGED
@@ -16,7 +16,7 @@ widget:
 
 # Emotion English DistilRoBERTa-base
 
-
+# Description
 
 With this model, you can classify emotions in English text data. The model was trained on 6 diverse datasets (see Appendix below) and predicts Ekman's 6 basic emotions, plus a neutral class:
 
@@ -30,7 +30,7 @@ With this model, you can classify emotions in English text data. The model was t
 
 The model is a fine-tuned checkpoint of [DistilRoBERTa-base](https://huggingface.co/distilroberta-base). For a 'non-distilled' emotion model, please refer to the model card of the [RoBERTa-large](https://huggingface.co/j-hartmann/emotion-english-roberta-large) version.
 
-
+# Application
 
 a) Run emotion model with 3 lines of code on single text example using Hugging Face's pipeline command on Google Colab:
 
@@ -40,13 +40,27 @@ b) Run emotion model on multiple examples and full datasets (e.g., .csv files) o
 
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/j-hartmann/emotion-english-distilroberta-base/blob/main/emotion_prediction_example.ipynb)
 
-
+# Contact
 
 Please reach out to [[email protected]](mailto:[email protected]) if you have any questions or feedback.
 
 Thanks to Samuel Domdey and chrsiebert for their support in making this model available.
 
-
+# Reference
+
+Please cite the following reference if you use this model. A working paper will be available soon.
+
+```
+@misc{hartmann2022,
+  title={Emotion English DistilRoBERTa-base},
+  author={Hartmann, Jochen},
+  url={https://huggingface.co/j-hartmann/emotion-english-distilroberta-base/},
+  year={2022},
+  note={Online; accessed [INSERT_DATE]}
+}
+```
+
+# Appendix
 
 Please find an overview of the datasets used for training below. All datasets contain English text. The table summarizes which emotions are available in each of the datasets. The datasets represent a diverse collection of text types. Specifically, they contain emotion labels for texts from Twitter, Reddit, student self-reports, and utterances from TV dialogues. As MELD (Multimodal EmotionLines Dataset) extends the popular EmotionLines dataset, EmotionLines itself is not included here.
 
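For context on the "Application" section referenced in this commit, below is a minimal sketch of how the model is typically run with Hugging Face's pipeline, both on a single text (case a) and on a column of a .csv file (case b). The file name texts.csv, its text column, and the top_k argument are illustrative assumptions rather than part of the model card; on older transformers versions the equivalent of top_k=None is return_all_scores=True.

```python
# Minimal sketch of the usage described in the README's "Application" section.
# Assumes the standard Hugging Face transformers text-classification pipeline;
# "texts.csv" and its "text" column are placeholders for illustration only.
from transformers import pipeline
import pandas as pd

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return scores for all 7 classes, not just the top label
)

# a) Single text example: scores for anger, disgust, fear, joy, neutral, sadness, surprise.
print(classifier("I love this!"))

# b) Multiple examples, e.g. one column of a .csv file (placeholder file/column names).
df = pd.read_csv("texts.csv")
scores = classifier(df["text"].tolist())
df["emotion"] = [max(s, key=lambda d: d["score"])["label"] for s in scores]
print(df.head())
```

The Colab notebook linked in the diff walks through these same two cases in more detail.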