Mediocre-Judge committed
Commit 731fc3e (verified)
1 Parent(s): c1e064f

End of training

Files changed (2)
  1. README.md +104 -54
  2. model.safetensors +1 -1
README.md CHANGED
@@ -16,9 +16,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.7412
- - Exact Match: 80.5714
- - F1 Score: 86.1024
+ - Loss: 0.1743
+ - Exact Match: 96.2857
+ - F1 Score: 97.2732
 
 ## Model description
 
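The Exact Match and F1 columns in this card are the standard SQuAD-style extractive question-answering metrics (the dataset is listed as unknown, so the QA framing is an inference from the metric names, not something the card states). A minimal sketch of how such scores are computed with the `evaluate` library; the ids, texts, and answer offsets are invented placeholders:

```python
# Minimal sketch of SQuAD-style Exact Match / F1 scoring with the
# `evaluate` library. The ids and answer spans below are invented
# placeholders, not data from this model's (unknown) eval set.
import evaluate

squad = evaluate.load("squad")

predictions = [{"id": "q1", "prediction_text": "RoBERTa"}]
references = [{
    "id": "q1",
    "answers": {"text": ["RoBERTa"], "answer_start": [0]},
}]

# Both scores come back on the same 0-100 scale used in the tables
# below, e.g. {"exact_match": 100.0, "f1": 100.0}.
print(squad.compute(predictions=predictions, references=references))
```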
@@ -46,62 +46,112 @@ The following hyperparameters were used during training:
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_ratio: 0.1
- - training_steps: 50
+ - training_steps: 100
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 Score |
 |:-------------:|:------:|:----:|:---------------:|:-----------:|:--------:|
- | 5.9404 | 0.0053 | 1 | 5.9382 | 0.0 | 0.5953 |
- | 5.9215 | 0.0107 | 2 | 5.9105 | 0.0 | 0.6034 |
- | 5.9271 | 0.0160 | 3 | 5.8553 | 0.0 | 0.8934 |
- | 5.8875 | 0.0214 | 4 | 5.7721 | 0.0 | 1.7912 |
- | 5.805 | 0.0267 | 5 | 5.6573 | 0.6767 | 8.4782 |
- | 5.6871 | 0.0321 | 6 | 5.5054 | 23.4586 | 42.2185 |
- | 5.5539 | 0.0374 | 7 | 5.3433 | 43.7594 | 59.1302 |
- | 5.4103 | 0.0428 | 8 | 5.1687 | 48.6466 | 62.0339 |
- | 5.2151 | 0.0481 | 9 | 4.9787 | 49.1729 | 62.6107 |
- | 5.0079 | 0.0535 | 10 | 4.7613 | 48.1955 | 62.6351 |
- | 4.9345 | 0.0588 | 11 | 4.5065 | 47.9699 | 63.1193 |
- | 4.6229 | 0.0641 | 12 | 4.2115 | 47.5188 | 63.2726 |
- | 4.4631 | 0.0695 | 13 | 3.9029 | 52.2556 | 65.5815 |
- | 4.1463 | 0.0748 | 14 | 3.6499 | 58.9474 | 68.4275 |
- | 3.8714 | 0.0802 | 15 | 3.4862 | 54.7368 | 68.5473 |
- | 3.5838 | 0.0855 | 16 | 3.3115 | 49.8496 | 68.1843 |
- | 3.5844 | 0.0909 | 17 | 3.0756 | 50.4511 | 68.9449 |
- | 3.3173 | 0.0962 | 18 | 2.7914 | 52.0301 | 69.3946 |
- | 3.0913 | 0.1016 | 19 | 2.5020 | 54.2857 | 69.9967 |
- | 2.9282 | 0.1069 | 20 | 2.2443 | 55.2632 | 69.9488 |
- | 2.711 | 0.1123 | 21 | 2.0197 | 54.2857 | 69.7818 |
- | 2.5538 | 0.1176 | 22 | 1.8205 | 54.5113 | 70.1570 |
- | 2.1309 | 0.1230 | 23 | 1.6450 | 55.8647 | 71.1004 |
- | 2.1398 | 0.1283 | 24 | 1.4972 | 57.3684 | 71.6500 |
- | 2.0536 | 0.1336 | 25 | 1.3786 | 59.3233 | 72.3145 |
- | 1.8463 | 0.1390 | 26 | 1.2891 | 61.2030 | 73.0502 |
- | 1.7279 | 0.1443 | 27 | 1.2158 | 62.7820 | 73.8093 |
- | 1.6699 | 0.1497 | 28 | 1.1546 | 64.4361 | 74.8352 |
- | 1.516 | 0.1550 | 29 | 1.1015 | 66.6917 | 75.9196 |
- | 1.4137 | 0.1604 | 30 | 1.0558 | 68.3459 | 77.1076 |
- | 1.3907 | 0.1657 | 31 | 1.0158 | 69.0977 | 77.7101 |
- | 1.1152 | 0.1711 | 32 | 0.9823 | 69.7744 | 78.3412 |
- | 1.0033 | 0.1764 | 33 | 0.9529 | 70.4511 | 78.8287 |
- | 1.0707 | 0.1818 | 34 | 0.9263 | 71.5038 | 79.7847 |
- | 1.0864 | 0.1871 | 35 | 0.9025 | 72.3308 | 80.4039 |
- | 1.1194 | 0.1924 | 36 | 0.8810 | 72.9323 | 80.9981 |
- | 0.8841 | 0.1978 | 37 | 0.8609 | 73.6842 | 81.5823 |
- | 1.1392 | 0.2031 | 38 | 0.8420 | 74.6617 | 82.4632 |
- | 1.0528 | 0.2085 | 39 | 0.8252 | 75.1880 | 82.8512 |
- | 1.0969 | 0.2138 | 40 | 0.8095 | 75.6391 | 83.3666 |
- | 1.0979 | 0.2192 | 41 | 0.7943 | 76.0902 | 83.7383 |
- | 0.809 | 0.2245 | 42 | 0.7808 | 76.3158 | 83.7564 |
- | 0.901 | 0.2299 | 43 | 0.7688 | 77.4436 | 84.4723 |
- | 0.8812 | 0.2352 | 44 | 0.7588 | 78.3459 | 85.1076 |
- | 0.8275 | 0.2406 | 45 | 0.7510 | 78.7970 | 85.4737 |
- | 1.0578 | 0.2459 | 46 | 0.7451 | 79.0226 | 85.5693 |
- | 0.8329 | 0.2513 | 47 | 0.7412 | 79.2481 | 85.8848 |
- | 0.8682 | 0.2566 | 48 | 0.7390 | 79.3233 | 85.9460 |
- | 1.0013 | 0.2619 | 49 | 0.7380 | 79.3985 | 86.0087 |
- | 1.0948 | 0.2673 | 50 | 0.7377 | 79.4737 | 86.0459 |
+ | 5.9958 | 0.0053 | 1 | 6.0009 | 0.0 | 14.2511 |
+ | 6.0212 | 0.0107 | 2 | 5.9880 | 0.0 | 14.2134 |
+ | 5.9773 | 0.0160 | 3 | 5.9623 | 0.0 | 14.2530 |
+ | 5.9605 | 0.0214 | 4 | 5.9240 | 0.0 | 14.7064 |
+ | 5.922 | 0.0267 | 5 | 5.8733 | 0.0 | 14.2565 |
+ | 5.8831 | 0.0321 | 6 | 5.8088 | 0.0 | 14.4523 |
+ | 5.8306 | 0.0374 | 7 | 5.7290 | 0.0 | 16.0421 |
+ | 5.7652 | 0.0428 | 8 | 5.6310 | 5.6391 | 34.9089 |
+ | 5.6731 | 0.0481 | 9 | 5.5117 | 17.1429 | 51.1515 |
+ | 5.5294 | 0.0535 | 10 | 5.3626 | 31.5789 | 59.2927 |
+ | 5.4532 | 0.0588 | 11 | 5.1749 | 43.9098 | 64.9926 |
+ | 5.2211 | 0.0641 | 12 | 4.9679 | 49.9248 | 69.2611 |
+ | 5.0949 | 0.0695 | 13 | 4.7511 | 53.0075 | 71.4743 |
+ | 4.8805 | 0.0748 | 14 | 4.5299 | 55.1880 | 72.9875 |
+ | 4.651 | 0.0802 | 15 | 4.2956 | 57.8195 | 74.4621 |
+ | 4.4113 | 0.0855 | 16 | 4.0291 | 60.0752 | 75.8558 |
+ | 4.2577 | 0.0909 | 17 | 3.7318 | 62.2556 | 76.3222 |
+ | 4.0153 | 0.0962 | 18 | 3.4295 | 63.6842 | 77.0091 |
+ | 3.6706 | 0.1016 | 19 | 3.1520 | 61.1278 | 76.5172 |
+ | 3.5342 | 0.1069 | 20 | 2.8936 | 52.7820 | 73.9467 |
+ | 3.2798 | 0.1123 | 21 | 2.6488 | 46.1654 | 71.9439 |
+ | 3.1167 | 0.1176 | 22 | 2.4104 | 47.1429 | 71.3425 |
+ | 2.6525 | 0.1230 | 23 | 2.1745 | 52.2556 | 71.5805 |
+ | 2.497 | 0.1283 | 24 | 1.9401 | 58.4211 | 72.4925 |
+ | 2.3689 | 0.1336 | 25 | 1.7185 | 60.2256 | 72.6269 |
+ | 2.0833 | 0.1390 | 26 | 1.5153 | 60.2256 | 72.8689 |
+ | 1.8679 | 0.1443 | 27 | 1.3483 | 61.2782 | 74.1167 |
+ | 1.7384 | 0.1497 | 28 | 1.2158 | 64.5865 | 76.8679 |
+ | 1.47 | 0.1550 | 29 | 1.1047 | 67.2932 | 78.9366 |
+ | 1.397 | 0.1604 | 30 | 1.0146 | 70.5263 | 81.1621 |
+ | 1.2822 | 0.1657 | 31 | 0.9423 | 73.3083 | 83.6952 |
+ | 0.9928 | 0.1711 | 32 | 0.8767 | 75.2632 | 85.0494 |
+ | 0.7992 | 0.1764 | 33 | 0.8122 | 77.8947 | 86.9631 |
+ | 0.897 | 0.1818 | 34 | 0.7455 | 80.6767 | 89.1149 |
+ | 0.8307 | 0.1871 | 35 | 0.6772 | 83.3835 | 91.3579 |
+ | 0.8469 | 0.1924 | 36 | 0.6040 | 86.2406 | 93.9573 |
+ | 0.6431 | 0.1978 | 37 | 0.5333 | 86.8421 | 94.6721 |
+ | 0.8116 | 0.2031 | 38 | 0.4519 | 87.6692 | 95.6610 |
+ | 0.6474 | 0.2085 | 39 | 0.3950 | 87.6692 | 95.7701 |
+ | 0.6241 | 0.2138 | 40 | 0.3626 | 87.6692 | 95.9608 |
+ | 0.6299 | 0.2192 | 41 | 0.3394 | 87.7444 | 95.9051 |
+ | 0.2552 | 0.2245 | 42 | 0.3260 | 87.7444 | 95.9297 |
+ | 0.3891 | 0.2299 | 43 | 0.3234 | 87.6692 | 95.8513 |
+ | 0.3552 | 0.2352 | 44 | 0.3129 | 87.9699 | 95.6941 |
+ | 0.2864 | 0.2406 | 45 | 0.2998 | 88.0451 | 95.3209 |
+ | 0.4347 | 0.2459 | 46 | 0.2798 | 89.4737 | 95.0850 |
+ | 0.2938 | 0.2513 | 47 | 0.2587 | 90.3759 | 94.9503 |
+ | 0.2821 | 0.2566 | 48 | 0.2445 | 90.9023 | 95.1257 |
+ | 0.3619 | 0.2619 | 49 | 0.2320 | 91.3534 | 94.9029 |
+ | 0.4783 | 0.2673 | 50 | 0.2176 | 91.7293 | 95.0914 |
+ | 0.1834 | 0.2726 | 51 | 0.2116 | 91.8797 | 95.1105 |
+ | 0.3803 | 0.2780 | 52 | 0.2054 | 92.1805 | 94.9606 |
+ | 0.2242 | 0.2833 | 53 | 0.2052 | 92.3308 | 94.9873 |
+ | 0.1771 | 0.2887 | 54 | 0.2033 | 92.4812 | 95.3112 |
+ | 0.3369 | 0.2940 | 55 | 0.1978 | 93.0827 | 95.7403 |
+ | 0.2277 | 0.2994 | 56 | 0.1936 | 93.7594 | 96.3688 |
+ | 0.2296 | 0.3047 | 57 | 0.1947 | 93.8346 | 96.6249 |
+ | 0.2281 | 0.3101 | 58 | 0.1939 | 93.9098 | 96.8548 |
+ | 0.1287 | 0.3154 | 59 | 0.1905 | 94.4361 | 96.9572 |
+ | 0.191 | 0.3207 | 60 | 0.1865 | 95.0376 | 97.2070 |
+ | 0.1435 | 0.3261 | 61 | 0.1868 | 94.9624 | 97.1697 |
+ | 0.1648 | 0.3314 | 62 | 0.1900 | 94.5865 | 96.8381 |
+ | 0.1668 | 0.3368 | 63 | 0.1889 | 94.9624 | 96.9874 |
+ | 0.1634 | 0.3421 | 64 | 0.1850 | 95.3383 | 97.0437 |
+ | 0.2374 | 0.3475 | 65 | 0.1797 | 95.7895 | 97.4394 |
+ | 0.1382 | 0.3528 | 66 | 0.1768 | 96.3910 | 97.6053 |
+ | 0.2683 | 0.3582 | 67 | 0.1736 | 96.5414 | 97.6811 |
+ | 0.1452 | 0.3635 | 68 | 0.1720 | 96.3910 | 97.4557 |
+ | 0.1796 | 0.3689 | 69 | 0.1704 | 96.4662 | 97.4221 |
+ | 0.0786 | 0.3742 | 70 | 0.1686 | 96.5414 | 97.4985 |
+ | 0.2424 | 0.3796 | 71 | 0.1669 | 96.6917 | 97.5989 |
+ | 0.089 | 0.3849 | 72 | 0.1656 | 96.7669 | 97.6242 |
+ | 0.2073 | 0.3902 | 73 | 0.1654 | 96.7669 | 97.6238 |
+ | 0.1657 | 0.3956 | 74 | 0.1663 | 96.5414 | 97.4733 |
+ | 0.0868 | 0.4009 | 75 | 0.1677 | 96.3158 | 97.4407 |
+ | 0.1281 | 0.4063 | 76 | 0.1697 | 96.0150 | 97.1804 |
+ | 0.1729 | 0.4116 | 77 | 0.1705 | 95.8647 | 97.1085 |
+ | 0.1871 | 0.4170 | 78 | 0.1703 | 96.0150 | 97.2090 |
+ | 0.1735 | 0.4223 | 79 | 0.1695 | 96.0150 | 97.2090 |
+ | 0.1239 | 0.4277 | 80 | 0.1700 | 95.9398 | 97.2144 |
+ | 0.0944 | 0.4330 | 81 | 0.1696 | 95.8647 | 97.1392 |
+ | 0.2494 | 0.4384 | 82 | 0.1696 | 96.0150 | 97.2896 |
+ | 0.0746 | 0.4437 | 83 | 0.1689 | 95.8647 | 97.1392 |
+ | 0.1175 | 0.4490 | 84 | 0.1680 | 96.0150 | 97.2090 |
+ | 0.2597 | 0.4544 | 85 | 0.1665 | 96.0902 | 97.2082 |
+ | 0.1567 | 0.4597 | 86 | 0.1656 | 96.0150 | 97.1330 |
+ | 0.0738 | 0.4651 | 87 | 0.1647 | 96.1654 | 97.2834 |
+ | 0.1551 | 0.4704 | 88 | 0.1641 | 96.2406 | 97.3586 |
+ | 0.0965 | 0.4758 | 89 | 0.1634 | 96.0902 | 97.2833 |
+ | 0.1466 | 0.4811 | 90 | 0.1625 | 96.1654 | 97.3085 |
+ | 0.115 | 0.4865 | 91 | 0.1619 | 96.6165 | 97.6096 |
+ | 0.1848 | 0.4918 | 92 | 0.1613 | 96.6165 | 97.5345 |
+ | 0.0955 | 0.4972 | 93 | 0.1607 | 96.6165 | 97.5405 |
+ | 0.1348 | 0.5025 | 94 | 0.1603 | 96.6165 | 97.5405 |
+ | 0.1316 | 0.5079 | 95 | 0.1600 | 96.6165 | 97.4655 |
+ | 0.1544 | 0.5132 | 96 | 0.1598 | 96.6917 | 97.5407 |
+ | 0.1746 | 0.5185 | 97 | 0.1596 | 96.6917 | 97.5407 |
+ | 0.0762 | 0.5239 | 98 | 0.1596 | 96.5414 | 97.3903 |
+ | 0.1685 | 0.5292 | 99 | 0.1595 | 96.5414 | 97.3903 |
+ | 0.1243 | 0.5346 | 100 | 0.1595 | 96.6917 | 97.5407 |
 
 
 ### Framework versions
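The hyperparameter block in the hunk above maps directly onto `transformers.TrainingArguments`. A hedged sketch of equivalent settings follows; the learning rate, batch size, and output directory do not appear in the visible diff, so those values are placeholders rather than the author's actual configuration:

```python
# Sketch of TrainingArguments matching the hyperparameters in the diff.
# ASSUMED (not shown in the visible diff): output_dir, learning_rate,
# per_device_train_batch_size. All other values come from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base-finetuned",  # placeholder
    learning_rate=3e-5,                   # placeholder; not shown in diff
    per_device_train_batch_size=16,       # placeholder; not shown in diff
    optim="adamw_torch",                  # adamw_torch, betas=(0.9, 0.999), eps=1e-08
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,                     # 10 warmup steps out of the 100 total
    max_steps=100,                        # "training_steps: 100" after this commit
)
```

With `lr_scheduler_type="cosine"` and `warmup_ratio=0.1`, the learning rate ramps up over the first 10 of the 100 steps and then decays along a cosine curve, which is consistent with the steep early loss drop in the table above.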
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:f56f4be102c342698828593dd631de9b96ddb1d58e1721fe4d67191f51f9a528
+ oid sha256:f483d20cb76bb19d262dde819f94b4159a1a5ef3de22c8e4b5e37fe6e4968b0c
 size 496250232
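The `model.safetensors` entry is a Git LFS pointer file, so this commit only swaps the `oid sha256:` line; the ~496 MB weight blob itself lives in LFS storage. A small sketch for checking that a downloaded copy matches the new pointer (the local file path is a placeholder):

```python
# Verify a downloaded model.safetensors against the sha256 recorded in
# the Git LFS pointer above. The local file path is a placeholder.
import hashlib

EXPECTED_OID = "f483d20cb76bb19d262dde819f94b4159a1a5ef3de22c8e4b5e37fe6e4968b0c"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so the blob never sits fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

assert sha256_of("model.safetensors") == EXPECTED_OID, "hash mismatch"
```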