Mediocre-Judge committed on End of training
Browse files
- README.md +104 -54
- model.safetensors +1 -1

README.md CHANGED
This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 0.1743
- Exact Match: 96.2857
- F1 Score: 97.2732
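The card does not state how Exact Match and F1 are implemented; a minimal sketch, assuming the usual SQuAD-style convention (normalize answers, then compare exact strings for EM and token overlap for F1):

```python
import collections
import string

# SQuAD-style answer normalization: lowercase, drop punctuation,
# articles ("a", "an", "the"), and extra whitespace.
def normalize(text):
    text = "".join(ch for ch in text.lower() if ch not in string.punctuation)
    return " ".join(w for w in text.split() if w not in ("a", "an", "the"))

def exact_match(prediction, reference):
    # 1.0 when the normalized strings are identical, else 0.0
    return float(normalize(prediction) == normalize(reference))

def f1_score(prediction, reference):
    # Token-level F1: harmonic mean of precision and recall over the
    # multiset overlap of predicted and reference tokens.
    pred = normalize(prediction).split()
    ref = normalize(reference).split()
    overlap = sum((collections.Counter(pred) & collections.Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(pred), overlap / len(ref)
    return 2 * precision * recall / (precision + recall)
```

The table values here are percentages, i.e. these scores scaled by 100 and averaged over the evaluation set.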
## Model description
The following hyperparameters were used during training:

- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 100
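With warmup_ratio=0.1 over training_steps=100, the learning rate warms up linearly for 10 steps and then decays along a cosine curve. A sketch of the schedule as a pure function (an illustration of the shape, assuming linear warmup as in `transformers`' `get_cosine_schedule_with_warmup`; `cosine_warmup_lr` is not part of the training script):

```python
import math

def cosine_warmup_lr(step, base_lr, warmup_steps, total_steps):
    """Learning rate at a given step: linear warmup to base_lr,
    then cosine decay toward zero over the remaining steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The peak learning rate is reached at step 10 and the rate falls to roughly half its peak at the midpoint of the decay phase.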
### Training results

| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 Score |
|:-------------:|:------:|:----:|:---------------:|:-----------:|:--------:|
| 5.9958 | 0.0053 | 1 | 6.0009 | 0.0 | 14.2511 |
| 6.0212 | 0.0107 | 2 | 5.9880 | 0.0 | 14.2134 |
| 5.9773 | 0.0160 | 3 | 5.9623 | 0.0 | 14.2530 |
| 5.9605 | 0.0214 | 4 | 5.9240 | 0.0 | 14.7064 |
| 5.922 | 0.0267 | 5 | 5.8733 | 0.0 | 14.2565 |
| 5.8831 | 0.0321 | 6 | 5.8088 | 0.0 | 14.4523 |
| 5.8306 | 0.0374 | 7 | 5.7290 | 0.0 | 16.0421 |
| 5.7652 | 0.0428 | 8 | 5.6310 | 5.6391 | 34.9089 |
| 5.6731 | 0.0481 | 9 | 5.5117 | 17.1429 | 51.1515 |
| 5.5294 | 0.0535 | 10 | 5.3626 | 31.5789 | 59.2927 |
| 5.4532 | 0.0588 | 11 | 5.1749 | 43.9098 | 64.9926 |
| 5.2211 | 0.0641 | 12 | 4.9679 | 49.9248 | 69.2611 |
| 5.0949 | 0.0695 | 13 | 4.7511 | 53.0075 | 71.4743 |
| 4.8805 | 0.0748 | 14 | 4.5299 | 55.1880 | 72.9875 |
| 4.651 | 0.0802 | 15 | 4.2956 | 57.8195 | 74.4621 |
| 4.4113 | 0.0855 | 16 | 4.0291 | 60.0752 | 75.8558 |
| 4.2577 | 0.0909 | 17 | 3.7318 | 62.2556 | 76.3222 |
| 4.0153 | 0.0962 | 18 | 3.4295 | 63.6842 | 77.0091 |
| 3.6706 | 0.1016 | 19 | 3.1520 | 61.1278 | 76.5172 |
| 3.5342 | 0.1069 | 20 | 2.8936 | 52.7820 | 73.9467 |
| 3.2798 | 0.1123 | 21 | 2.6488 | 46.1654 | 71.9439 |
| 3.1167 | 0.1176 | 22 | 2.4104 | 47.1429 | 71.3425 |
| 2.6525 | 0.1230 | 23 | 2.1745 | 52.2556 | 71.5805 |
| 2.497 | 0.1283 | 24 | 1.9401 | 58.4211 | 72.4925 |
| 2.3689 | 0.1336 | 25 | 1.7185 | 60.2256 | 72.6269 |
| 2.0833 | 0.1390 | 26 | 1.5153 | 60.2256 | 72.8689 |
| 1.8679 | 0.1443 | 27 | 1.3483 | 61.2782 | 74.1167 |
| 1.7384 | 0.1497 | 28 | 1.2158 | 64.5865 | 76.8679 |
| 1.47 | 0.1550 | 29 | 1.1047 | 67.2932 | 78.9366 |
| 1.397 | 0.1604 | 30 | 1.0146 | 70.5263 | 81.1621 |
| 1.2822 | 0.1657 | 31 | 0.9423 | 73.3083 | 83.6952 |
| 0.9928 | 0.1711 | 32 | 0.8767 | 75.2632 | 85.0494 |
| 0.7992 | 0.1764 | 33 | 0.8122 | 77.8947 | 86.9631 |
| 0.897 | 0.1818 | 34 | 0.7455 | 80.6767 | 89.1149 |
| 0.8307 | 0.1871 | 35 | 0.6772 | 83.3835 | 91.3579 |
| 0.8469 | 0.1924 | 36 | 0.6040 | 86.2406 | 93.9573 |
| 0.6431 | 0.1978 | 37 | 0.5333 | 86.8421 | 94.6721 |
| 0.8116 | 0.2031 | 38 | 0.4519 | 87.6692 | 95.6610 |
| 0.6474 | 0.2085 | 39 | 0.3950 | 87.6692 | 95.7701 |
| 0.6241 | 0.2138 | 40 | 0.3626 | 87.6692 | 95.9608 |
| 0.6299 | 0.2192 | 41 | 0.3394 | 87.7444 | 95.9051 |
| 0.2552 | 0.2245 | 42 | 0.3260 | 87.7444 | 95.9297 |
| 0.3891 | 0.2299 | 43 | 0.3234 | 87.6692 | 95.8513 |
| 0.3552 | 0.2352 | 44 | 0.3129 | 87.9699 | 95.6941 |
| 0.2864 | 0.2406 | 45 | 0.2998 | 88.0451 | 95.3209 |
| 0.4347 | 0.2459 | 46 | 0.2798 | 89.4737 | 95.0850 |
| 0.2938 | 0.2513 | 47 | 0.2587 | 90.3759 | 94.9503 |
| 0.2821 | 0.2566 | 48 | 0.2445 | 90.9023 | 95.1257 |
| 0.3619 | 0.2619 | 49 | 0.2320 | 91.3534 | 94.9029 |
| 0.4783 | 0.2673 | 50 | 0.2176 | 91.7293 | 95.0914 |
| 0.1834 | 0.2726 | 51 | 0.2116 | 91.8797 | 95.1105 |
| 0.3803 | 0.2780 | 52 | 0.2054 | 92.1805 | 94.9606 |
| 0.2242 | 0.2833 | 53 | 0.2052 | 92.3308 | 94.9873 |
| 0.1771 | 0.2887 | 54 | 0.2033 | 92.4812 | 95.3112 |
| 0.3369 | 0.2940 | 55 | 0.1978 | 93.0827 | 95.7403 |
| 0.2277 | 0.2994 | 56 | 0.1936 | 93.7594 | 96.3688 |
| 0.2296 | 0.3047 | 57 | 0.1947 | 93.8346 | 96.6249 |
| 0.2281 | 0.3101 | 58 | 0.1939 | 93.9098 | 96.8548 |
| 0.1287 | 0.3154 | 59 | 0.1905 | 94.4361 | 96.9572 |
| 0.191 | 0.3207 | 60 | 0.1865 | 95.0376 | 97.2070 |
| 0.1435 | 0.3261 | 61 | 0.1868 | 94.9624 | 97.1697 |
| 0.1648 | 0.3314 | 62 | 0.1900 | 94.5865 | 96.8381 |
| 0.1668 | 0.3368 | 63 | 0.1889 | 94.9624 | 96.9874 |
| 0.1634 | 0.3421 | 64 | 0.1850 | 95.3383 | 97.0437 |
| 0.2374 | 0.3475 | 65 | 0.1797 | 95.7895 | 97.4394 |
| 0.1382 | 0.3528 | 66 | 0.1768 | 96.3910 | 97.6053 |
| 0.2683 | 0.3582 | 67 | 0.1736 | 96.5414 | 97.6811 |
| 0.1452 | 0.3635 | 68 | 0.1720 | 96.3910 | 97.4557 |
| 0.1796 | 0.3689 | 69 | 0.1704 | 96.4662 | 97.4221 |
| 0.0786 | 0.3742 | 70 | 0.1686 | 96.5414 | 97.4985 |
| 0.2424 | 0.3796 | 71 | 0.1669 | 96.6917 | 97.5989 |
| 0.089 | 0.3849 | 72 | 0.1656 | 96.7669 | 97.6242 |
| 0.2073 | 0.3902 | 73 | 0.1654 | 96.7669 | 97.6238 |
| 0.1657 | 0.3956 | 74 | 0.1663 | 96.5414 | 97.4733 |
| 0.0868 | 0.4009 | 75 | 0.1677 | 96.3158 | 97.4407 |
| 0.1281 | 0.4063 | 76 | 0.1697 | 96.0150 | 97.1804 |
| 0.1729 | 0.4116 | 77 | 0.1705 | 95.8647 | 97.1085 |
| 0.1871 | 0.4170 | 78 | 0.1703 | 96.0150 | 97.2090 |
| 0.1735 | 0.4223 | 79 | 0.1695 | 96.0150 | 97.2090 |
| 0.1239 | 0.4277 | 80 | 0.1700 | 95.9398 | 97.2144 |
| 0.0944 | 0.4330 | 81 | 0.1696 | 95.8647 | 97.1392 |
| 0.2494 | 0.4384 | 82 | 0.1696 | 96.0150 | 97.2896 |
| 0.0746 | 0.4437 | 83 | 0.1689 | 95.8647 | 97.1392 |
| 0.1175 | 0.4490 | 84 | 0.1680 | 96.0150 | 97.2090 |
| 0.2597 | 0.4544 | 85 | 0.1665 | 96.0902 | 97.2082 |
| 0.1567 | 0.4597 | 86 | 0.1656 | 96.0150 | 97.1330 |
| 0.0738 | 0.4651 | 87 | 0.1647 | 96.1654 | 97.2834 |
| 0.1551 | 0.4704 | 88 | 0.1641 | 96.2406 | 97.3586 |
| 0.0965 | 0.4758 | 89 | 0.1634 | 96.0902 | 97.2833 |
| 0.1466 | 0.4811 | 90 | 0.1625 | 96.1654 | 97.3085 |
| 0.115 | 0.4865 | 91 | 0.1619 | 96.6165 | 97.6096 |
| 0.1848 | 0.4918 | 92 | 0.1613 | 96.6165 | 97.5345 |
| 0.0955 | 0.4972 | 93 | 0.1607 | 96.6165 | 97.5405 |
| 0.1348 | 0.5025 | 94 | 0.1603 | 96.6165 | 97.5405 |
| 0.1316 | 0.5079 | 95 | 0.1600 | 96.6165 | 97.4655 |
| 0.1544 | 0.5132 | 96 | 0.1598 | 96.6917 | 97.5407 |
| 0.1746 | 0.5185 | 97 | 0.1596 | 96.6917 | 97.5407 |
| 0.0762 | 0.5239 | 98 | 0.1596 | 96.5414 | 97.3903 |
| 0.1685 | 0.5292 | 99 | 0.1595 | 96.5414 | 97.3903 |
| 0.1243 | 0.5346 | 100 | 0.1595 | 96.6917 | 97.5407 |
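Note that validation loss keeps improving through step 100 while F1 peaks earlier, around step 67, so which checkpoint is "best" depends on the criterion. A hypothetical helper for choosing a step from rows like these (the tuples below are copied from the table; `best_checkpoint` is illustrative and not part of the training script):

```python
# (step, val_loss, exact_match, f1) for three rows from the table above.
rows = [
    (60, 0.1865, 95.0376, 97.2070),
    (67, 0.1736, 96.5414, 97.6811),
    (100, 0.1595, 96.6917, 97.5407),
]

def best_checkpoint(rows, key="loss"):
    """Return the step with the lowest validation loss,
    or the highest value of the chosen metric."""
    idx = {"loss": 1, "exact_match": 2, "f1": 3}[key]
    pick = min if key == "loss" else max
    return pick(rows, key=lambda r: r[idx])[0]
```

Over these three rows, loss favors step 100 while F1 favors step 67.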
### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:f483d20cb76bb19d262dde819f94b4159a1a5ef3de22c8e4b5e37fe6e4968b0c
 size 496250232