SharonTudi committed · Commit 59125f1 · verified · 1 Parent(s): 218d6b5

End of training

README.md CHANGED
@@ -4,10 +4,10 @@ base_model: distilbert-base-uncased
 tags:
 - generated_from_trainer
 metrics:
-- accuracy
 - precision
 - recall
 - f1
+- accuracy
 model-index:
 - name: DIALOGUE_overfit_check
   results: []
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2184
-- Accuracy: 0.9737
+- Loss: 0.1840
 - Precision: 0.9762
 - Recall: 0.9737
 - F1: 0.9736
+- Accuracy: 0.9737
 
 ## Model description
 
@@ -53,56 +53,56 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
-|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
-| 1.0414 | 0.62 | 30 | 0.5042 | 0.9211 | 0.9327 | 0.9211 | 0.9204 |
-| 0.3868 | 1.25 | 60 | 0.1559 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.1218 | 1.88 | 90 | 0.1743 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0363 | 2.5 | 120 | 0.1189 | 0.9474 | 0.9524 | 0.9474 | 0.9472 |
-| 0.0127 | 3.12 | 150 | 0.1455 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0077 | 3.75 | 180 | 0.1457 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.005 | 4.38 | 210 | 0.1587 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0039 | 5.0 | 240 | 0.1620 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0031 | 5.62 | 270 | 0.1667 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0026 | 6.25 | 300 | 0.1696 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0022 | 6.88 | 330 | 0.1768 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0019 | 7.5 | 360 | 0.1802 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0016 | 8.12 | 390 | 0.1811 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0015 | 8.75 | 420 | 0.1834 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0013 | 9.38 | 450 | 0.1872 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0012 | 10.0 | 480 | 0.1890 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0011 | 10.62 | 510 | 0.1924 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.001 | 11.25 | 540 | 0.1940 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0009 | 11.88 | 570 | 0.1967 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0008 | 12.5 | 600 | 0.1982 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0008 | 13.12 | 630 | 0.1995 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0008 | 13.75 | 660 | 0.2009 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0007 | 14.38 | 690 | 0.2023 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0007 | 15.0 | 720 | 0.2039 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0006 | 15.62 | 750 | 0.2049 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0006 | 16.25 | 780 | 0.2064 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0006 | 16.88 | 810 | 0.2075 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0005 | 17.5 | 840 | 0.2087 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0005 | 18.12 | 870 | 0.2101 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0005 | 18.75 | 900 | 0.2110 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0005 | 19.38 | 930 | 0.2116 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0005 | 20.0 | 960 | 0.2122 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 20.62 | 990 | 0.2130 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 21.25 | 1020 | 0.2138 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 21.88 | 1050 | 0.2143 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 22.5 | 1080 | 0.2146 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 23.12 | 1110 | 0.2152 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 23.75 | 1140 | 0.2158 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 24.38 | 1170 | 0.2162 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 25.0 | 1200 | 0.2167 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 25.62 | 1230 | 0.2170 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 26.25 | 1260 | 0.2174 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0003 | 26.88 | 1290 | 0.2177 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0003 | 27.5 | 1320 | 0.2179 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0003 | 28.12 | 1350 | 0.2181 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0003 | 28.75 | 1380 | 0.2183 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0003 | 29.38 | 1410 | 0.2183 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
-| 0.0004 | 30.0 | 1440 | 0.2184 | 0.9737 | 0.9762 | 0.9737 | 0.9736 |
+| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+| 1.0266 | 0.62 | 30 | 0.5087 | 1.0 | 1.0 | 1.0 | 1.0 |
+| 0.4009 | 1.25 | 60 | 0.1389 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.1301 | 1.88 | 90 | 0.1436 | 0.9637 | 0.9605 | 0.9604 | 0.9605 |
+| 0.0342 | 2.5 | 120 | 0.1055 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0288 | 3.12 | 150 | 0.1395 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0099 | 3.75 | 180 | 0.1259 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0057 | 4.38 | 210 | 0.1315 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0042 | 5.0 | 240 | 0.1338 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0033 | 5.62 | 270 | 0.1373 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0027 | 6.25 | 300 | 0.1403 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0024 | 6.88 | 330 | 0.1457 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.002 | 7.5 | 360 | 0.1483 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0017 | 8.12 | 390 | 0.1483 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0016 | 8.75 | 420 | 0.1503 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0014 | 9.38 | 450 | 0.1535 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0013 | 10.0 | 480 | 0.1546 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0012 | 10.62 | 510 | 0.1576 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0011 | 11.25 | 540 | 0.1593 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.001 | 11.88 | 570 | 0.1672 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0009 | 12.5 | 600 | 0.1686 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0008 | 13.12 | 630 | 0.1696 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0008 | 13.75 | 660 | 0.1696 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0007 | 14.38 | 690 | 0.1702 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0007 | 15.0 | 720 | 0.1711 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0006 | 15.62 | 750 | 0.1716 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0006 | 16.25 | 780 | 0.1726 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0006 | 16.88 | 810 | 0.1731 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0006 | 17.5 | 840 | 0.1744 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0006 | 18.12 | 870 | 0.1762 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0005 | 18.75 | 900 | 0.1773 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0005 | 19.38 | 930 | 0.1777 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0005 | 20.0 | 960 | 0.1781 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0005 | 20.62 | 990 | 0.1785 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 21.25 | 1020 | 0.1795 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 21.88 | 1050 | 0.1801 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 22.5 | 1080 | 0.1805 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 23.12 | 1110 | 0.1812 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 23.75 | 1140 | 0.1818 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 24.38 | 1170 | 0.1821 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 25.0 | 1200 | 0.1824 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 25.62 | 1230 | 0.1827 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 26.25 | 1260 | 0.1831 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 26.88 | 1290 | 0.1833 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 27.5 | 1320 | 0.1836 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 28.12 | 1350 | 0.1838 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 28.75 | 1380 | 0.1839 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 29.38 | 1410 | 0.1840 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+| 0.0004 | 30.0 | 1440 | 0.1840 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
 
 
 ### Framework versions
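
The card above documents a DistilBERT text-classification fine-tune. As a minimal inference sketch, the checkpoint could be loaded with the `transformers` pipeline API; note that the repo id `SharonTudi/DIALOGUE_overfit_check` is an assumption inferred from the committer name and the `model-index` entry, and the example input is hypothetical since the card does not state the dataset or label set:

```python
# Minimal inference sketch for the fine-tuned checkpoint.
# Assumption: the model is hosted on the Hub as
# "SharonTudi/DIALOGUE_overfit_check" (inferred, not stated in the diff).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="SharonTudi/DIALOGUE_overfit_check",
)

# Hypothetical dialogue-style input; the labels depend on the unstated dataset.
print(classifier("Hi, I'd like to book a table for two tonight."))
```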
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d8b7b048e6173bffa0feccfe2ca99bd9e8085335bd4fd6bced3f3594aec77210
+oid sha256:0762e4c56734675118584bda80674b880a16381e8a3363eb60e041a924f0c132
 size 267838720
runs/Jan17_17-22-11_38251c6f7091/events.out.tfevents.1705512141.38251c6f7091.441.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e1e03de68573c10a0a4286f98bba917b30e825d3700bef2da79d0f8eb01aae87
+size 34963
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8e206c762958ccd14dd8e02e0b778f3afab4476f3dd156847c274978d4a937d3
-size 4728
+oid sha256:01d55654f54e1149e5d0ae0246749a4a5b172c104ba2d02df269e8fe382c803b
+size 4664
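
The `model.safetensors`, `training_args.bin`, and TensorBoard event files in this commit are Git LFS pointer files: `oid sha256:…` is the SHA-256 digest of the actual file contents and `size` is its byte count. A small sketch for verifying a downloaded artifact against its pointer, where the local path is an illustrative assumption:

```python
# Sketch: check a downloaded file against the sha256 oid in its LFS pointer.
import hashlib

def sha256_of(path: str) -> str:
    """Stream the file in 1 MiB chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# "model.safetensors" is an assumed local download path for illustration.
digest = sha256_of("model.safetensors")
# The new pointer in this commit records:
# 0762e4c56734675118584bda80674b880a16381e8a3363eb60e041a924f0c132
print(digest)
```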