SharonTudi committed on
Commit abd10d8 · verified · 1 Parent(s): 0ca5034

End of training

README.md CHANGED
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.2924
- - Precision: 0.9421
- - Recall: 0.9342
- - F1: 0.9313
- - Accuracy: 0.9342
+ - Loss: 0.1605
+ - Precision: 0.9762
+ - Recall: 0.9737
+ - F1: 0.9736
+ - Accuracy: 0.9737
 
  ## Model description
 
@@ -55,54 +55,54 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
  |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | 0.9048 | 0.31 | 15 | 0.3535 | 0.9241 | 0.9079 | 0.9067 | 0.9079 |
- | 0.2596 | 0.62 | 30 | 0.2208 | 0.9421 | 0.9342 | 0.9339 | 0.9342 |
- | 0.3006 | 0.94 | 45 | 0.1396 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.099 | 1.25 | 60 | 0.0830 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0736 | 1.56 | 75 | 0.1756 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0903 | 1.88 | 90 | 0.1981 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0038 | 2.19 | 105 | 0.2455 | 0.9611 | 0.9605 | 0.9605 | 0.9605 |
- | 0.086 | 2.5 | 120 | 0.4242 | 0.9373 | 0.9342 | 0.9341 | 0.9342 |
- | 0.1077 | 2.81 | 135 | 0.2115 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0599 | 3.12 | 150 | 0.0653 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0462 | 3.44 | 165 | 0.1517 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0015 | 3.75 | 180 | 0.2302 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.039 | 4.06 | 195 | 0.2066 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.004 | 4.38 | 210 | 0.2157 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0311 | 4.69 | 225 | 0.5658 | 0.9241 | 0.9079 | 0.9067 | 0.9079 |
- | 0.1224 | 5.0 | 240 | 0.2216 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
- | 0.0006 | 5.31 | 255 | 0.2725 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0006 | 5.62 | 270 | 0.2802 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0005 | 5.94 | 285 | 0.2813 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0004 | 6.25 | 300 | 0.2815 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0004 | 6.56 | 315 | 0.2812 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0003 | 6.88 | 330 | 0.2815 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0004 | 7.19 | 345 | 0.2833 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0003 | 7.5 | 360 | 0.2797 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0003 | 7.81 | 375 | 0.2799 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0003 | 8.12 | 390 | 0.2797 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0003 | 8.44 | 405 | 0.2811 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0003 | 8.75 | 420 | 0.2820 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 9.06 | 435 | 0.2837 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 9.38 | 450 | 0.2853 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 9.69 | 465 | 0.2854 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 10.0 | 480 | 0.2857 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 10.31 | 495 | 0.2864 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 10.62 | 510 | 0.2866 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 10.94 | 525 | 0.2873 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 11.25 | 540 | 0.2881 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 11.56 | 555 | 0.2890 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 11.88 | 570 | 0.2894 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 12.19 | 585 | 0.2903 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 12.5 | 600 | 0.2908 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 12.81 | 615 | 0.2911 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 13.12 | 630 | 0.2914 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 13.44 | 645 | 0.2918 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 13.75 | 660 | 0.2921 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 14.06 | 675 | 0.2923 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 14.38 | 690 | 0.2924 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 14.69 | 705 | 0.2925 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
- | 0.0002 | 15.0 | 720 | 0.2924 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
+ | 0.9264 | 0.31 | 15 | 0.3711 | 0.9241 | 0.9079 | 0.9067 | 0.9079 |
+ | 0.2288 | 0.62 | 30 | 0.1532 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.278 | 0.94 | 45 | 0.1236 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.1255 | 1.25 | 60 | 0.1636 | 0.9421 | 0.9342 | 0.9313 | 0.9342 |
+ | 0.1979 | 1.56 | 75 | 0.2056 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.1794 | 1.88 | 90 | 0.5392 | 0.9241 | 0.9079 | 0.9067 | 0.9079 |
+ | 0.0311 | 2.19 | 105 | 0.1639 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0586 | 2.5 | 120 | 0.3969 | 0.9270 | 0.9211 | 0.9208 | 0.9211 |
+ | 0.134 | 2.81 | 135 | 0.3752 | 0.9534 | 0.9474 | 0.9471 | 0.9474 |
+ | 0.0811 | 3.12 | 150 | 0.4049 | 0.9518 | 0.9474 | 0.9472 | 0.9474 |
+ | 0.0584 | 3.44 | 165 | 0.4468 | 0.9280 | 0.9211 | 0.9182 | 0.9211 |
+ | 0.0027 | 3.75 | 180 | 0.4233 | 0.9518 | 0.9474 | 0.9472 | 0.9474 |
+ | 0.07 | 4.06 | 195 | 0.6047 | 0.9280 | 0.9211 | 0.9182 | 0.9211 |
+ | 0.0037 | 4.38 | 210 | 0.5789 | 0.9280 | 0.9211 | 0.9182 | 0.9211 |
+ | 0.049 | 4.69 | 225 | 0.4404 | 0.9421 | 0.9342 | 0.9339 | 0.9342 |
+ | 0.0728 | 5.0 | 240 | 0.1398 | 0.9637 | 0.9605 | 0.9604 | 0.9605 |
+ | 0.001 | 5.31 | 255 | 0.1255 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0009 | 5.62 | 270 | 0.1263 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0007 | 5.94 | 285 | 0.1291 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0006 | 6.25 | 300 | 0.1319 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0006 | 6.56 | 315 | 0.1340 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0005 | 6.88 | 330 | 0.1367 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0005 | 7.19 | 345 | 0.1390 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0005 | 7.5 | 360 | 0.1420 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0004 | 7.81 | 375 | 0.1437 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0004 | 8.12 | 390 | 0.1449 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0004 | 8.44 | 405 | 0.1459 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0004 | 8.75 | 420 | 0.1477 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 9.06 | 435 | 0.1492 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 9.38 | 450 | 0.1501 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 9.69 | 465 | 0.1513 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 10.0 | 480 | 0.1527 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 10.31 | 495 | 0.1534 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 10.62 | 510 | 0.1544 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 10.94 | 525 | 0.1553 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 11.25 | 540 | 0.1560 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 11.56 | 555 | 0.1565 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 11.88 | 570 | 0.1570 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0003 | 12.19 | 585 | 0.1579 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 12.5 | 600 | 0.1587 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 12.81 | 615 | 0.1590 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 13.12 | 630 | 0.1594 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 13.44 | 645 | 0.1598 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 13.75 | 660 | 0.1601 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 14.06 | 675 | 0.1603 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 14.38 | 690 | 0.1604 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 14.69 | 705 | 0.1605 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
+ | 0.0002 | 15.0 | 720 | 0.1605 | 0.9762 | 0.9737 | 0.9736 | 0.9737 |
 
 
  ### Framework versions
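The precision, recall, F1, and accuracy columns above are reported by the training run but the metric function itself is not part of this diff. The sketch below is an assumption, not the author's code: one common way to get this metric pattern (note recall equals accuracy, which weighted averaging produces) is a `compute_metrics` callback passed to the `Trainer`.

```python
# Hypothetical compute_metrics sketch -- NOT taken from this repository.
# Assumes a single-label classification fine-tune of distilbert-base-uncased
# and weighted averaging over classes.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "accuracy": accuracy_score(labels, preds),
    }
```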
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:5785653d559029a119471d50689ee8e451a3d977d7ec2854874dd180aea63772
+ oid sha256:ebd0a7450bc22451612c82a0c945ed2869eb9aa999660c52cc592d206b26c101
  size 267838720
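The `model.safetensors` entry is a Git LFS pointer: the repo stores only the `version`, `oid sha256:…`, and `size` fields, while the 267,838,720-byte weight file lives in LFS storage. A minimal sketch for checking a downloaded weight file against the new pointer (the local file name is an assumption; the expected values are copied from the pointer above):

```python
# Minimal sketch: verify a downloaded file against a Git LFS pointer.
import hashlib
import os

EXPECTED_OID = "ebd0a7450bc22451612c82a0c945ed2869eb9aa999660c52cc592d206b26c101"
EXPECTED_SIZE = 267838720

def verify_lfs_object(path: str, expected_oid: str, expected_size: int) -> bool:
    # Cheap size check first, then stream the file through SHA-256.
    if os.path.getsize(path) != expected_size:
        return False
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    return sha.hexdigest() == expected_oid

print(verify_lfs_object("model.safetensors", EXPECTED_OID, EXPECTED_SIZE))
```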
runs/Jan18_16-48-52_a40b44969e29/events.out.tfevents.1705596534.a40b44969e29.1174.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:841d981e386dc7cdb773a7a5564e4254ba3b55801cca7f2033a9d073b28223bc
+ size 4618
runs/Jan18_16-49-26_a40b44969e29/events.out.tfevents.1705596568.a40b44969e29.1174.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:306479452a4a06952e1242473604669c30f82e9c0c2bb8e5a39137db5a381979
+ size 7339
runs/Jan18_16-53-56_a40b44969e29/events.out.tfevents.1705596843.a40b44969e29.1174.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5774dd14f37978195244c1aef963d883aa46166c89b7f247bcf0a49c65dca017
+ size 34914
runs/Jan18_16-59-32_a40b44969e29/events.out.tfevents.1705597174.a40b44969e29.1174.3 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1025bd7071fa617adeca5ee189ee54376b59959b25ef938d56e4abb985f1ce27
+ size 4618
runs/Jan18_17-00-37_a40b44969e29/events.out.tfevents.1705597240.a40b44969e29.1174.4 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a94a7f07757905764c6828dc52fb5866c976083b6e86894d12a21724f558dc3d
+ size 34914
runs/Jan18_17-12-52_a40b44969e29/events.out.tfevents.1705597975.a40b44969e29.1174.7 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f3ec52abc48efb912f751b90a09d361491e44abe910a2ad4beed863be149dce9
+ size 34914
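The `runs/…events.out.tfevents.…` entries added above are TensorBoard event logs from the training runs, also stored as LFS pointers. If you download one locally, a hedged way to read the logged scalars back out with the `tensorboard` Python package (the file name is copied from the last pointer; the available tag names depend on how the run was logged):

```python
# Minimal sketch: read scalar metrics out of a downloaded tfevents file.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("events.out.tfevents.1705597975.a40b44969e29.1174.7")
ea.Reload()  # parse the event file

scalar_tags = ea.Tags()["scalars"]
print(scalar_tags)  # e.g. train/loss, eval/loss, eval/f1, ... (assumed names)
for event in ea.Scalars(scalar_tags[0]):
    print(event.step, event.value)
```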
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b5a0f0c1c75cdad616d777dccf93622aaea5b323e14698cae03a9183c4a2038b
+ oid sha256:c69d5b8d8b003c357268567e9562dd940b057cadb53d245af5d8d59228fdb3f4
  size 4664
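`training_args.bin` is the `TrainingArguments` object that transformers' `Trainer` serializes with `torch.save`, so the oid change above means the hyperparameters of the new run differ from the old one. A minimal sketch for inspecting it, assuming the file has been downloaded locally and that you trust its contents (it is a pickle, not plain tensors):

```python
# Minimal sketch: inspect the serialized TrainingArguments.
# Requires torch and transformers installed so the object can be unpickled.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__)  # typically "TrainingArguments"
print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)
```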