murodbek committed
Commit: a3b7dd2
Parent: 5125bfe

Update README.md

Files changed (1):
  1. README.md +13 -7
README.md CHANGED
@@ -20,23 +20,29 @@ model-index:
  results: []
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
  # uzroberta-sentiment-analysis

- This model is a fine-tuned version of [rifkat/uztext-3Gb-BPE-Roberta](https://huggingface.co/rifkat/uztext-3Gb-BPE-Roberta)
- on the [Uzbek App reviews for Sentiment Classification](https://github.com/SanatbekMatlatipov/uzbek-sentiment-analysis) dataset.
+ This is a RoBERTa-base model trained on ~23K reviews (~323K words) and fine-tuned for sentiment analysis of customer reviews. It was built as part of the author's project at the Uz-NLP 2022 Hackathon and is suitable for the Uzbek language.
+
+ <b>Labels</b>:
+ 0 -> Negative;
+ 1 -> Positive
+
  It achieves the following results on the evaluation set:
  - Loss: 0.5718
  - Precision: 0.9113
  - Recall: 0.8869
- - F1: 0.8989
+ - F1 Score: 0.8989
  - Accuracy: 0.896

  ## Model description

- More information needed
+ This model is a fine-tuned version of [rifkat/uztext-3Gb-BPE-Roberta](https://huggingface.co/rifkat/uztext-3Gb-BPE-Roberta) on the [Uzbek App reviews for Sentiment Classification](https://github.com/SanatbekMatlatipov/uzbek-sentiment-analysis) dataset. It achieves the following results on the evaluation set:
+ - Loss: 0.5718
+ - Precision: 0.9113
+ - Recall: 0.8869
+ - F1 Score: 0.8989
+ - Accuracy: 0.896

  ## Intended uses & limitations
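For context (not part of the commit itself), a minimal inference sketch for the updated card, assuming the checkpoint is published on the Hugging Face Hub under the repo id `murodbek/uzroberta-sentiment-analysis` (adjust if it differs) and using the 0 -> Negative, 1 -> Positive mapping described above:

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Load a text-classification pipeline backed by the fine-tuned checkpoint.
# The repo id below is an assumption based on the committer's namespace.
classifier = pipeline(
    "text-classification",
    model="murodbek/uzroberta-sentiment-analysis",
)

# The card maps 0 -> Negative and 1 -> Positive; if the checkpoint's config
# does not carry human-readable label names, the pipeline returns the generic
# "LABEL_0" / "LABEL_1" strings, which we translate here.
label_names = {"LABEL_0": "Negative", "LABEL_1": "Positive"}

review = "Ilova juda yaxshi ishlaydi, rahmat!"  # "The app works very well, thanks!"
result = classifier(review)[0]
print(label_names.get(result["label"], result["label"]), round(result["score"], 3))
```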