VinsmokeMir committed on
Commit bfe2ed4 · 1 Parent(s): b445acf

End of training

README.md ADDED
@@ -0,0 +1,185 @@
---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: VinsmokeMir/FineTuning_Method_2_SC
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# VinsmokeMir/FineTuning_Method_2_SC

This model is a fine-tuned version of [rafsankabir/Pretrained_E13_Method2](https://huggingface.co/rafsankabir/Pretrained_E13_Method2) on an unknown dataset.
It achieves the following results on the evaluation set (a short sketch of how the two classification metrics are computed follows the list):
- Loss: 2.3223
- Accuracy: 0.6790
- F1 Macro: 0.6487
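
Since the card does not define the metrics, here is a purely illustrative sketch of how accuracy and macro-averaged F1 are typically computed from gold labels and predicted classes; the arrays below are made up, not the actual evaluation data:

```python
# Illustrative only: y_true / y_pred stand in for the real evaluation labels
# and predictions, which are not published with this card.
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 1, 2, 2, 1, 0]   # hypothetical gold labels
y_pred = [0, 1, 2, 1, 1, 0]   # hypothetical model predictions

accuracy = accuracy_score(y_true, y_pred)
# Macro F1 averages the per-class F1 scores without class weighting,
# so rare classes count as much as frequent ones.
f1_macro = f1_score(y_true, y_pred, average="macro")
print(f"accuracy={accuracy:.4f}, f1_macro={f1_macro:.4f}")
```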

## Model description

More information needed

## Intended uses & limitations

More information needed
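
The sections above are unfilled, so as a starting point here is a hedged loading sketch. It assumes the checkpoint exposes a standard `AutoModelForSequenceClassification` head (suggested by the `_SC` suffix and the accuracy / macro-F1 metrics); the task, input language, and label names are not documented, so check the repository's `config.json` before relying on the output:

```python
# Hedged sketch: assumes a standard sequence-classification checkpoint that
# loads via the Auto* classes; task and label names are not documented here.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "VinsmokeMir/FineTuning_Method_2_SC"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example input text", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class_id = int(logits.argmax(dim=-1))
# id2label comes from the checkpoint config and may be generic (LABEL_0, ...).
print(model.config.id2label.get(predicted_class_id, predicted_class_id))
```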

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 40
- mixed_precision_training: Native AMP
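
For anyone trying to reproduce this run, the list above maps onto `transformers.TrainingArguments` roughly as follows. This is a sketch reconstructed from the card, not the original training script; the output directory and the evaluation/logging cadence are assumptions (the results table does suggest evaluation every 500 steps):

```python
# Sketch reconstructed from the hyperparameter list above; not the original
# training script. Unlisted settings (output_dir, eval/logging cadence) are guesses.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="FineTuning_Method_2_SC",   # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=40,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # results table reports eval every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```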

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|
| No log | 0.32 | 500 | 1.0745 | 0.3976 | 0.1896 |
| 1.0543 | 0.64 | 1000 | 0.9059 | 0.5967 | 0.4614 |
| 1.0543 | 0.95 | 1500 | 0.8259 | 0.6414 | 0.5633 |
| 0.8389 | 1.27 | 2000 | 0.8177 | 0.6394 | 0.5715 |
| 0.8389 | 1.59 | 2500 | 0.8269 | 0.6356 | 0.5724 |
| 0.7713 | 1.91 | 3000 | 0.7916 | 0.6631 | 0.6238 |
| 0.7713 | 2.23 | 3500 | 0.7996 | 0.6745 | 0.6155 |
| 0.6734 | 2.54 | 4000 | 0.7921 | 0.6624 | 0.6307 |
| 0.6734 | 2.86 | 4500 | 0.7743 | 0.6726 | 0.6459 |
| 0.6309 | 3.18 | 5000 | 0.8343 | 0.6803 | 0.6382 |
| 0.6309 | 3.5 | 5500 | 0.8233 | 0.6784 | 0.6390 |
| 0.5582 | 3.82 | 6000 | 0.8678 | 0.6631 | 0.6273 |
| 0.5582 | 4.13 | 6500 | 0.8621 | 0.6758 | 0.6368 |
| 0.4988 | 4.45 | 7000 | 0.9389 | 0.6720 | 0.6386 |
| 0.4988 | 4.77 | 7500 | 0.9067 | 0.6918 | 0.6505 |
| 0.4885 | 5.09 | 8000 | 0.9116 | 0.6937 | 0.6583 |
| 0.4885 | 5.41 | 8500 | 1.0357 | 0.6822 | 0.6459 |
| 0.427 | 5.73 | 9000 | 0.9428 | 0.6847 | 0.6479 |
| 0.427 | 6.04 | 9500 | 1.0233 | 0.6752 | 0.6531 |
| 0.4034 | 6.36 | 10000 | 1.1578 | 0.6835 | 0.6515 |
| 0.4034 | 6.68 | 10500 | 1.1870 | 0.6790 | 0.6545 |
| 0.4053 | 7.0 | 11000 | 1.0370 | 0.7007 | 0.6651 |
| 0.4053 | 7.32 | 11500 | 1.2087 | 0.6822 | 0.6497 |
| 0.3545 | 7.63 | 12000 | 1.2255 | 0.6847 | 0.6605 |
| 0.3545 | 7.95 | 12500 | 1.2710 | 0.6905 | 0.6609 |
| 0.3437 | 8.27 | 13000 | 1.3646 | 0.6918 | 0.6618 |
| 0.3437 | 8.59 | 13500 | 1.3767 | 0.6879 | 0.6563 |
| 0.3407 | 8.91 | 14000 | 1.2705 | 0.6796 | 0.6506 |
| 0.3407 | 9.22 | 14500 | 1.4605 | 0.6803 | 0.6496 |
| 0.2876 | 9.54 | 15000 | 1.4202 | 0.6860 | 0.6555 |
| 0.2876 | 9.86 | 15500 | 1.4151 | 0.6847 | 0.6517 |
| 0.3035 | 10.18 | 16000 | 1.4536 | 0.6713 | 0.6514 |
| 0.3035 | 10.5 | 16500 | 1.4806 | 0.6828 | 0.6469 |
| 0.2733 | 10.81 | 17000 | 1.4596 | 0.6899 | 0.6552 |
| 0.2733 | 11.13 | 17500 | 1.6183 | 0.6886 | 0.6557 |
| 0.2562 | 11.45 | 18000 | 1.6054 | 0.6771 | 0.6591 |
| 0.2562 | 11.77 | 18500 | 1.5966 | 0.6701 | 0.6503 |
| 0.2582 | 12.09 | 19000 | 1.5659 | 0.6822 | 0.6531 |
| 0.2582 | 12.4 | 19500 | 1.6146 | 0.6867 | 0.6575 |
| 0.2368 | 12.72 | 20000 | 1.6207 | 0.6899 | 0.6629 |
| 0.2368 | 13.04 | 20500 | 1.5220 | 0.6918 | 0.6640 |
| 0.245 | 13.36 | 21000 | 1.6572 | 0.6720 | 0.6489 |
| 0.245 | 13.68 | 21500 | 1.6443 | 0.6860 | 0.6590 |
| 0.2226 | 13.99 | 22000 | 1.6238 | 0.6847 | 0.6589 |
| 0.2226 | 14.31 | 22500 | 1.7241 | 0.6777 | 0.6521 |
| 0.2117 | 14.63 | 23000 | 1.6134 | 0.6867 | 0.6580 |
| 0.2117 | 14.95 | 23500 | 1.6723 | 0.6911 | 0.6618 |
| 0.2056 | 15.27 | 24000 | 1.6257 | 0.6892 | 0.6529 |
| 0.2056 | 15.59 | 24500 | 1.7072 | 0.6796 | 0.6531 |
| 0.1859 | 15.9 | 25000 | 1.7174 | 0.6771 | 0.6554 |
| 0.1859 | 16.22 | 25500 | 1.6951 | 0.6879 | 0.6555 |
| 0.1725 | 16.54 | 26000 | 1.7240 | 0.6905 | 0.6632 |
| 0.1725 | 16.86 | 26500 | 1.7126 | 0.6879 | 0.6608 |
| 0.1817 | 17.18 | 27000 | 1.7949 | 0.6847 | 0.6520 |
| 0.1817 | 17.49 | 27500 | 1.7694 | 0.6911 | 0.6622 |
| 0.1617 | 17.81 | 28000 | 1.7891 | 0.6828 | 0.6527 |
| 0.1617 | 18.13 | 28500 | 1.7860 | 0.6790 | 0.6526 |
| 0.1628 | 18.45 | 29000 | 1.8127 | 0.6867 | 0.6605 |
| 0.1628 | 18.77 | 29500 | 1.7317 | 0.6892 | 0.6610 |
| 0.1736 | 19.08 | 30000 | 1.7273 | 0.6899 | 0.6569 |
| 0.1736 | 19.4 | 30500 | 1.7853 | 0.6854 | 0.6584 |
| 0.1441 | 19.72 | 31000 | 1.7866 | 0.6918 | 0.6624 |
| 0.1441 | 20.04 | 31500 | 1.7842 | 0.6873 | 0.6580 |
| 0.1392 | 20.36 | 32000 | 1.8669 | 0.6860 | 0.6597 |
| 0.1392 | 20.67 | 32500 | 1.8392 | 0.6899 | 0.6639 |
| 0.159 | 20.99 | 33000 | 1.8412 | 0.6784 | 0.6552 |
| 0.159 | 21.31 | 33500 | 1.8673 | 0.6854 | 0.6584 |
| 0.1275 | 21.63 | 34000 | 1.8622 | 0.6854 | 0.6571 |
| 0.1275 | 21.95 | 34500 | 1.8622 | 0.6796 | 0.6583 |
| 0.1216 | 22.26 | 35000 | 1.9509 | 0.6854 | 0.6604 |
| 0.1216 | 22.58 | 35500 | 1.9425 | 0.6809 | 0.6550 |
| 0.1351 | 22.9 | 36000 | 1.9496 | 0.6784 | 0.6559 |
| 0.1351 | 23.22 | 36500 | 1.9685 | 0.6847 | 0.6582 |
| 0.1221 | 23.54 | 37000 | 1.9112 | 0.6911 | 0.6642 |
| 0.1221 | 23.85 | 37500 | 1.9341 | 0.6726 | 0.6526 |
| 0.1155 | 24.17 | 38000 | 1.9573 | 0.6899 | 0.6614 |
| 0.1155 | 24.49 | 38500 | 1.9853 | 0.6873 | 0.6580 |
| 0.1139 | 24.81 | 39000 | 1.9915 | 0.6790 | 0.6533 |
| 0.1139 | 25.13 | 39500 | 1.9997 | 0.6796 | 0.6539 |
| 0.1166 | 25.45 | 40000 | 1.9994 | 0.6847 | 0.6592 |
| 0.1166 | 25.76 | 40500 | 1.9848 | 0.6745 | 0.6513 |
| 0.1128 | 26.08 | 41000 | 2.0095 | 0.6867 | 0.6578 |
| 0.1128 | 26.4 | 41500 | 2.0585 | 0.6822 | 0.6547 |
| 0.1048 | 26.72 | 42000 | 2.0293 | 0.6777 | 0.6510 |
| 0.1048 | 27.04 | 42500 | 2.0797 | 0.6758 | 0.6512 |
| 0.1 | 27.35 | 43000 | 2.1162 | 0.6822 | 0.6544 |
| 0.1 | 27.67 | 43500 | 2.0569 | 0.6835 | 0.6538 |
| 0.1106 | 27.99 | 44000 | 2.0991 | 0.6828 | 0.6565 |
| 0.1106 | 28.31 | 44500 | 2.0976 | 0.6841 | 0.6563 |
| 0.0886 | 28.63 | 45000 | 2.1305 | 0.6854 | 0.6532 |
| 0.0886 | 28.94 | 45500 | 2.1015 | 0.6867 | 0.6564 |
| 0.1027 | 29.26 | 46000 | 2.1105 | 0.6867 | 0.6559 |
| 0.1027 | 29.58 | 46500 | 2.1396 | 0.6765 | 0.6499 |
| 0.1057 | 29.9 | 47000 | 2.1237 | 0.6790 | 0.6501 |
| 0.1057 | 30.22 | 47500 | 2.1849 | 0.6790 | 0.6518 |
| 0.0876 | 30.53 | 48000 | 2.1346 | 0.6841 | 0.6533 |
| 0.0876 | 30.85 | 48500 | 2.1441 | 0.6828 | 0.6540 |
| 0.0856 | 31.17 | 49000 | 2.1528 | 0.6911 | 0.6600 |
| 0.0856 | 31.49 | 49500 | 2.1725 | 0.6847 | 0.6509 |
| 0.0869 | 31.81 | 50000 | 2.2085 | 0.6771 | 0.6503 |
| 0.0869 | 32.12 | 50500 | 2.2606 | 0.6688 | 0.6434 |
| 0.0848 | 32.44 | 51000 | 2.2510 | 0.6745 | 0.6451 |
| 0.0848 | 32.76 | 51500 | 2.2528 | 0.6739 | 0.6496 |
| 0.0816 | 33.08 | 52000 | 2.2532 | 0.6758 | 0.6503 |
| 0.0816 | 33.4 | 52500 | 2.2356 | 0.6803 | 0.6500 |
| 0.0793 | 33.72 | 53000 | 2.2579 | 0.6745 | 0.6483 |
| 0.0793 | 34.03 | 53500 | 2.2126 | 0.6816 | 0.6520 |
| 0.0767 | 34.35 | 54000 | 2.2504 | 0.6803 | 0.6497 |
| 0.0767 | 34.67 | 54500 | 2.2601 | 0.6803 | 0.6524 |
| 0.0844 | 34.99 | 55000 | 2.2785 | 0.6733 | 0.6470 |
| 0.0844 | 35.31 | 55500 | 2.2756 | 0.6784 | 0.6520 |
| 0.0755 | 35.62 | 56000 | 2.2813 | 0.6816 | 0.6542 |
| 0.0755 | 35.94 | 56500 | 2.2752 | 0.6803 | 0.6518 |
| 0.077 | 36.26 | 57000 | 2.2815 | 0.6796 | 0.6518 |
| 0.077 | 36.58 | 57500 | 2.2861 | 0.6803 | 0.6514 |
| 0.0752 | 36.9 | 58000 | 2.2929 | 0.6771 | 0.6505 |
| 0.0752 | 37.21 | 58500 | 2.2859 | 0.6816 | 0.6537 |
| 0.0698 | 37.53 | 59000 | 2.3117 | 0.6796 | 0.6525 |
| 0.0698 | 37.85 | 59500 | 2.3038 | 0.6816 | 0.6511 |
| 0.0613 | 38.17 | 60000 | 2.3176 | 0.6765 | 0.6477 |
| 0.0613 | 38.49 | 60500 | 2.3131 | 0.6796 | 0.6493 |
| 0.0706 | 38.8 | 61000 | 2.3161 | 0.6777 | 0.6477 |
| 0.0706 | 39.12 | 61500 | 2.3127 | 0.6784 | 0.6484 |
| 0.0678 | 39.44 | 62000 | 2.3174 | 0.6765 | 0.6467 |
| 0.0678 | 39.76 | 62500 | 2.3223 | 0.6790 | 0.6487 |

### Framework versions

- Transformers 4.29.2
- PyTorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
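
If the goal is to reproduce the reported numbers exactly, it is probably safest to match the versions above. A small, optional sanity check (assuming the same four packages are installed locally):

```python
# Optional check that the local environment matches the versions on this card;
# a mismatch is not necessarily a problem, it just makes exact reproduction less likely.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.29.2",
    "torch": "2.0.1+cu118",
    "datasets": "2.12.0",
    "tokenizers": "0.13.3",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    print(f"{name}: {have}" + ("" if have == want else f" (card reports {want})"))
```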
logs/events.out.tfevents.1684850410.5e049d1979f2.194.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:17cfe3697ff06a62099b3ea5da366adac6af7d2ea65bb8e8ee6597c2a8daa13a
-size 61602
+oid sha256:6d0dc271822bb1c731594c7497a2b5b7d2778d9825a0db927bce95c8d139fcb5
+size 61962
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:79f666882754806d493fa953d2a77541dae6040df7c1ebf82558e5d15fa20231
+oid sha256:e2b1d75c166f4cdb9c6f6fa7f73ae58cdf6e8dbbeb94ea9affe0cf913071fefd
 size 83267311