jonra1993 committed
Commit 297a442 · 1 Parent(s): 335697e

update model card README.md

Files changed (1):
  1. README.md +22 -21
README.md CHANGED

```diff
@@ -1,7 +1,6 @@
 ---
 license: apache-2.0
 tags:
-- text-classification
 - generated_from_trainer
 datasets:
 - glue
@@ -15,16 +14,18 @@ model-index:
       name: Text Classification
       type: text-classification
     dataset:
-      name: datasetX
+      name: glue
       type: glue
+      config: mrpc
+      split: train
       args: mrpc
     metrics:
     - name: Accuracy
       type: accuracy
-      value: 0.8406862745098039
+      value: 0.8553921568627451
     - name: F1
       type: f1
-      value: 0.8811700182815356
+      value: 0.8991452991452993
 ---
 
@@ -32,11 +33,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 # jrtec-distilroberta-base-mrpc-glue-omar-espejel
 
-This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the datasetX dataset.
+This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.7217
-- Accuracy: 0.8407
-- F1: 0.8812
+- Loss: 1.1849
+- Accuracy: 0.8554
+- F1: 0.8991
 
 ## Model description
 
@@ -67,20 +68,20 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
-| 0.2883        | 1.09  | 500  | 1.0351          | 0.8333   | 0.8790 |
-| 0.3128        | 2.18  | 1000 | 0.7217          | 0.8407   | 0.8812 |
-| 0.1607        | 3.27  | 1500 | 0.9991          | 0.8480   | 0.8946 |
-| 0.1           | 4.36  | 2000 | 1.0454          | 0.8456   | 0.8869 |
-| 0.051         | 5.45  | 2500 | 1.0003          | 0.8824   | 0.9184 |
-| 0.037         | 6.54  | 3000 | 1.1195          | 0.8456   | 0.8948 |
-| 0.028         | 7.63  | 3500 | 1.0448          | 0.8725   | 0.9091 |
-| 0.0189        | 8.71  | 4000 | 1.0478          | 0.8725   | 0.9107 |
-| 0.0099        | 9.8   | 4500 | 1.0468          | 0.8775   | 0.9138 |
+| 0.4845        | 1.09  | 500  | 0.4901          | 0.8162   | 0.8748 |
+| 0.3706        | 2.18  | 1000 | 0.6421          | 0.8162   | 0.8691 |
+| 0.2003        | 3.27  | 1500 | 0.9711          | 0.8162   | 0.8760 |
+| 0.1281        | 4.36  | 2000 | 0.8224          | 0.8480   | 0.8893 |
+| 0.0717        | 5.45  | 2500 | 1.1803          | 0.8113   | 0.8511 |
+| 0.0344        | 6.54  | 3000 | 1.1759          | 0.8480   | 0.8935 |
+| 0.0277        | 7.63  | 3500 | 1.2140          | 0.8456   | 0.8927 |
+| 0.0212        | 8.71  | 4000 | 1.0895          | 0.8554   | 0.8974 |
+| 0.0071        | 9.8   | 4500 | 1.1849          | 0.8554   | 0.8991 |
 
 
 ### Framework versions
 
-- Transformers 4.20.1
-- Pytorch 1.11.0
-- Datasets 2.1.0
-- Tokenizers 0.12.1
+- Transformers 4.24.0
+- Pytorch 1.12.1+cu113
+- Datasets 2.6.1
+- Tokenizers 0.13.1
```
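The high-precision metric values in the updated front matter are plain correct/total and precision/recall ratios. A minimal sketch of how such values arise, assuming (not stated in the card, whose metadata lists `split: train`) that the evaluation set is the GLUE MRPC validation split of 408 sentence pairs — under that assumption the reported accuracy 0.8553921568627451 corresponds to 349/408 correct:

```python
def accuracy(correct: int, total: int) -> float:
    """Fraction of examples predicted correctly."""
    return correct / total

def f1(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall for the positive class."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 349 of 408 assumed correct reproduces the new Accuracy value;
# 343 of 408 reproduces the old one (0.8406862745098039).
print(accuracy(349, 408))
print(accuracy(343, 408))
```

The F1 values cannot be reverse-engineered to unique counts the same way, since F1 ignores true negatives; the `f1` helper above shows only the formula the metric uses.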