Commit 72bfe8b · Parent(s): 92e117a
End of training

README.md ADDED
@@ -0,0 +1,214 @@
---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: LayoutLMv3_1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# LayoutLMv3_1

This model is a fine-tuned version of [microsoft/layoutlmv3-large](https://huggingface.co/microsoft/layoutlmv3-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3772
- Precision: 0.7355
- Recall: 0.7550
- F1: 0.7451
- Accuracy: 0.9035
## Model description

More information needed

## Intended uses & limitations

More information needed
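LayoutLMv3 token-classification checkpoints consume words together with word bounding boxes normalized to a 0–1000 coordinate grid (this is what `LayoutLMv3Processor` / `AutoProcessor` expects when `apply_ocr=False` and you pass your own OCR boxes). A minimal sketch of that preprocessing step, assuming pixel-space boxes; the helper name is hypothetical, not part of the library:

```python
def normalize_bbox(bbox, width, height):
    """Scale a pixel-space (x0, y0, x1, y1) box to LayoutLM's 0-1000 grid."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / width),
        int(1000 * y0 / height),
        int(1000 * x1 / width),
        int(1000 * y1 / height),
    ]

# e.g. a word box on an 850x1100-pixel page:
# normalize_bbox((85, 110, 170, 132), 850, 1100) -> [100, 100, 200, 120]
```

The normalized boxes are then passed alongside the words (and the page image) to the processor before calling the model.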
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1500
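The card does not include the training script; as a hedged sketch, the list above maps onto transformers `TrainingArguments` field names roughly as below. The epoch column of the results table also implies about 27 optimizer steps per epoch (step 10 is logged as epoch 0.37), i.e. roughly 54 training examples at batch size 2:

```python
# Hypothetical reconstruction of the TrainingArguments used; the actual
# training script is not part of this card.
training_args = {
    "learning_rate": 1e-6,
    "per_device_train_batch_size": 2,
    "per_device_eval_batch_size": 2,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "max_steps": 1500,       # "training_steps: 1500" above
    "adam_beta1": 0.9,       # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
}

# Sanity check against the results table: ~27 steps per epoch reproduces
# both the first logged epoch (0.37 at step 10) and the last (55.56 at 1500).
steps_per_epoch = 27
print(round(10 / steps_per_epoch, 2))                          # 0.37
print(round(training_args["max_steps"] / steps_per_epoch, 2))  # 55.56
```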
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 0.37 | 10 | 2.5352 | 0.0089 | 0.0199 | 0.0123 | 0.0503 |
| No log | 0.74 | 20 | 2.1493 | 0.0377 | 0.0397 | 0.0387 | 0.6028 |
| No log | 1.11 | 30 | 1.7234 | 0.0 | 0.0 | 0.0 | 0.7804 |
| No log | 1.48 | 40 | 1.2971 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 1.85 | 50 | 1.0544 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 2.22 | 60 | 1.0210 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 2.59 | 70 | 0.9842 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 2.96 | 80 | 0.9651 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 3.33 | 90 | 0.9402 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 3.7 | 100 | 0.9205 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 4.07 | 110 | 0.9035 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 4.44 | 120 | 0.8807 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 4.81 | 130 | 0.8596 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 5.19 | 140 | 0.8382 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 5.56 | 150 | 0.8151 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 5.93 | 160 | 0.8039 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 6.3 | 170 | 0.7864 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 6.67 | 180 | 0.7588 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 7.04 | 190 | 0.7352 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 7.41 | 200 | 0.7232 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 7.78 | 210 | 0.7246 | 1.0 | 0.0132 | 0.0261 | 0.7846 |
| No log | 8.15 | 220 | 0.6916 | 1.0 | 0.0132 | 0.0261 | 0.7846 |
| No log | 8.52 | 230 | 0.6755 | 1.0 | 0.0199 | 0.0390 | 0.7860 |
| No log | 8.89 | 240 | 0.6686 | 1.0 | 0.0331 | 0.0641 | 0.7888 |
| No log | 9.26 | 250 | 0.6514 | 1.0 | 0.0265 | 0.0516 | 0.7874 |
| No log | 9.63 | 260 | 0.6430 | 1.0 | 0.0596 | 0.1125 | 0.7944 |
| No log | 10.0 | 270 | 0.6280 | 0.9474 | 0.1192 | 0.2118 | 0.8070 |
| No log | 10.37 | 280 | 0.6127 | 0.9286 | 0.1722 | 0.2905 | 0.8196 |
| No log | 10.74 | 290 | 0.6136 | 0.9024 | 0.2450 | 0.3854 | 0.8350 |
| No log | 11.11 | 300 | 0.5886 | 0.8810 | 0.2450 | 0.3834 | 0.8350 |
| No log | 11.48 | 310 | 0.5896 | 0.8909 | 0.3245 | 0.4757 | 0.8517 |
| No log | 11.85 | 320 | 0.5732 | 0.9310 | 0.3576 | 0.5167 | 0.8587 |
| No log | 12.22 | 330 | 0.5770 | 0.8533 | 0.4238 | 0.5664 | 0.8671 |
| No log | 12.59 | 340 | 0.5557 | 0.8649 | 0.4238 | 0.5689 | 0.8671 |
| No log | 12.96 | 350 | 0.5469 | 0.8222 | 0.4901 | 0.6141 | 0.8797 |
| No log | 13.33 | 360 | 0.5412 | 0.8242 | 0.4967 | 0.6198 | 0.8825 |
| No log | 13.7 | 370 | 0.5313 | 0.8454 | 0.5430 | 0.6613 | 0.8923 |
| No log | 14.07 | 380 | 0.5188 | 0.8381 | 0.5828 | 0.6875 | 0.8993 |
| No log | 14.44 | 390 | 0.5190 | 0.8333 | 0.5960 | 0.6950 | 0.8993 |
| No log | 14.81 | 400 | 0.5172 | 0.8165 | 0.5894 | 0.6846 | 0.8993 |
| No log | 15.19 | 410 | 0.5066 | 0.7966 | 0.6225 | 0.6989 | 0.9021 |
| No log | 15.56 | 420 | 0.4879 | 0.8087 | 0.6159 | 0.6992 | 0.9035 |
| No log | 15.93 | 430 | 0.4943 | 0.7833 | 0.6225 | 0.6937 | 0.9021 |
| No log | 16.3 | 440 | 0.4716 | 0.8205 | 0.6358 | 0.7164 | 0.9091 |
| No log | 16.67 | 450 | 0.4594 | 0.8264 | 0.6623 | 0.7353 | 0.9119 |
| No log | 17.04 | 460 | 0.4758 | 0.7761 | 0.6887 | 0.7298 | 0.9105 |
| No log | 17.41 | 470 | 0.4520 | 0.8430 | 0.6755 | 0.75 | 0.9217 |
| No log | 17.78 | 480 | 0.4582 | 0.8244 | 0.7152 | 0.7660 | 0.9245 |
| No log | 18.15 | 490 | 0.4475 | 0.8189 | 0.6887 | 0.7482 | 0.9217 |
| 0.6982 | 18.52 | 500 | 0.4627 | 0.7431 | 0.7086 | 0.7254 | 0.9105 |
| 0.6982 | 18.89 | 510 | 0.4419 | 0.7826 | 0.7152 | 0.7474 | 0.9189 |
| 0.6982 | 19.26 | 520 | 0.4351 | 0.7730 | 0.7219 | 0.7466 | 0.9147 |
| 0.6982 | 19.63 | 530 | 0.4213 | 0.7857 | 0.7285 | 0.7560 | 0.9189 |
| 0.6982 | 20.0 | 540 | 0.4389 | 0.7273 | 0.7417 | 0.7344 | 0.9091 |
| 0.6982 | 20.37 | 550 | 0.4208 | 0.7762 | 0.7351 | 0.7551 | 0.9189 |
| 0.6982 | 20.74 | 560 | 0.4301 | 0.74 | 0.7351 | 0.7375 | 0.9119 |
| 0.6982 | 21.11 | 570 | 0.4199 | 0.7568 | 0.7417 | 0.7492 | 0.9161 |
| 0.6982 | 21.48 | 580 | 0.4283 | 0.7006 | 0.7285 | 0.7143 | 0.9021 |
| 0.6982 | 21.85 | 590 | 0.4068 | 0.7857 | 0.7285 | 0.7560 | 0.9203 |
| 0.6982 | 22.22 | 600 | 0.4241 | 0.7179 | 0.7417 | 0.7296 | 0.9077 |
| 0.6982 | 22.59 | 610 | 0.3988 | 0.8321 | 0.7550 | 0.7917 | 0.9329 |
| 0.6982 | 22.96 | 620 | 0.4005 | 0.7671 | 0.7417 | 0.7542 | 0.9189 |
| 0.6982 | 23.33 | 630 | 0.3939 | 0.7651 | 0.7550 | 0.76 | 0.9189 |
| 0.6982 | 23.7 | 640 | 0.4007 | 0.7278 | 0.7616 | 0.7443 | 0.9119 |
| 0.6982 | 24.07 | 650 | 0.3857 | 0.7973 | 0.7815 | 0.7893 | 0.9217 |
| 0.6982 | 24.44 | 660 | 0.3893 | 0.7682 | 0.7682 | 0.7682 | 0.9175 |
| 0.6982 | 24.81 | 670 | 0.3946 | 0.7516 | 0.7616 | 0.7566 | 0.9147 |
| 0.6982 | 25.19 | 680 | 0.3893 | 0.7516 | 0.7616 | 0.7566 | 0.9161 |
| 0.6982 | 25.56 | 690 | 0.3969 | 0.7419 | 0.7616 | 0.7516 | 0.9119 |
| 0.6982 | 25.93 | 700 | 0.3854 | 0.7852 | 0.7748 | 0.7800 | 0.9217 |
| 0.6982 | 26.3 | 710 | 0.3858 | 0.7973 | 0.7815 | 0.7893 | 0.9231 |
| 0.6982 | 26.67 | 720 | 0.3831 | 0.7867 | 0.7815 | 0.7841 | 0.9217 |
| 0.6982 | 27.04 | 730 | 0.3996 | 0.7267 | 0.7748 | 0.75 | 0.9049 |
| 0.6982 | 27.41 | 740 | 0.3907 | 0.7358 | 0.7748 | 0.7548 | 0.9077 |
| 0.6982 | 27.78 | 750 | 0.3720 | 0.8013 | 0.8013 | 0.8013 | 0.9245 |
| 0.6982 | 28.15 | 760 | 0.3799 | 0.7895 | 0.7947 | 0.7921 | 0.9189 |
| 0.6982 | 28.52 | 770 | 0.3938 | 0.7178 | 0.7748 | 0.7452 | 0.9035 |
| 0.6982 | 28.89 | 780 | 0.3761 | 0.7763 | 0.7815 | 0.7789 | 0.9189 |
| 0.6982 | 29.26 | 790 | 0.3906 | 0.7267 | 0.7748 | 0.75 | 0.9063 |
| 0.6982 | 29.63 | 800 | 0.3780 | 0.7436 | 0.7682 | 0.7557 | 0.9105 |
| 0.6982 | 30.0 | 810 | 0.3773 | 0.7548 | 0.7748 | 0.7647 | 0.9133 |
| 0.6982 | 30.37 | 820 | 0.3716 | 0.7727 | 0.7881 | 0.7803 | 0.9175 |
| 0.6982 | 30.74 | 830 | 0.3747 | 0.7452 | 0.7748 | 0.7597 | 0.9119 |
| 0.6982 | 31.11 | 840 | 0.3747 | 0.7405 | 0.7748 | 0.7573 | 0.9133 |
| 0.6982 | 31.48 | 850 | 0.3821 | 0.7239 | 0.7815 | 0.7516 | 0.9077 |
| 0.6982 | 31.85 | 860 | 0.3649 | 0.7697 | 0.7748 | 0.7723 | 0.9175 |
| 0.6982 | 32.22 | 870 | 0.3804 | 0.7152 | 0.7815 | 0.7468 | 0.9049 |
| 0.6982 | 32.59 | 880 | 0.3715 | 0.75 | 0.7748 | 0.7622 | 0.9105 |
| 0.6982 | 32.96 | 890 | 0.3663 | 0.7632 | 0.7682 | 0.7657 | 0.9161 |
| 0.6982 | 33.33 | 900 | 0.3713 | 0.7516 | 0.7815 | 0.7662 | 0.9133 |
| 0.6982 | 33.7 | 910 | 0.3684 | 0.7597 | 0.7748 | 0.7672 | 0.9133 |
| 0.6982 | 34.07 | 920 | 0.3708 | 0.75 | 0.7748 | 0.7622 | 0.9119 |
| 0.6982 | 34.44 | 930 | 0.3699 | 0.8146 | 0.8146 | 0.8146 | 0.9259 |
| 0.6982 | 34.81 | 940 | 0.3726 | 0.7778 | 0.7881 | 0.7829 | 0.9189 |
| 0.6982 | 35.19 | 950 | 0.3763 | 0.7405 | 0.7748 | 0.7573 | 0.9105 |
| 0.6982 | 35.56 | 960 | 0.3883 | 0.7267 | 0.7748 | 0.75 | 0.9035 |
| 0.6982 | 35.93 | 970 | 0.3729 | 0.7616 | 0.7616 | 0.7616 | 0.9119 |
| 0.6982 | 36.3 | 980 | 0.3654 | 0.8108 | 0.7947 | 0.8027 | 0.9217 |
| 0.6982 | 36.67 | 990 | 0.3795 | 0.7195 | 0.7815 | 0.7492 | 0.9049 |
| 0.2195 | 37.04 | 1000 | 0.3819 | 0.7267 | 0.7748 | 0.75 | 0.9035 |
| 0.2195 | 37.41 | 1010 | 0.3760 | 0.7233 | 0.7616 | 0.7419 | 0.9035 |
| 0.2195 | 37.78 | 1020 | 0.3664 | 0.7468 | 0.7616 | 0.7541 | 0.9105 |
| 0.2195 | 38.15 | 1030 | 0.3753 | 0.7312 | 0.7748 | 0.7524 | 0.9077 |
| 0.2195 | 38.52 | 1040 | 0.3791 | 0.7284 | 0.7815 | 0.7540 | 0.9035 |
| 0.2195 | 38.89 | 1050 | 0.3665 | 0.7933 | 0.7881 | 0.7907 | 0.9203 |
| 0.2195 | 39.26 | 1060 | 0.3655 | 0.7763 | 0.7815 | 0.7789 | 0.9161 |
| 0.2195 | 39.63 | 1070 | 0.3811 | 0.7312 | 0.7748 | 0.7524 | 0.9091 |
| 0.2195 | 40.0 | 1080 | 0.3725 | 0.7342 | 0.7682 | 0.7508 | 0.9063 |
| 0.2195 | 40.37 | 1090 | 0.3639 | 0.7692 | 0.7947 | 0.7818 | 0.9161 |
| 0.2195 | 40.74 | 1100 | 0.3721 | 0.7312 | 0.7748 | 0.7524 | 0.9077 |
| 0.2195 | 41.11 | 1110 | 0.3782 | 0.7143 | 0.7616 | 0.7372 | 0.9021 |
| 0.2195 | 41.48 | 1120 | 0.3654 | 0.7748 | 0.7748 | 0.7748 | 0.9189 |
| 0.2195 | 41.85 | 1130 | 0.3717 | 0.7278 | 0.7616 | 0.7443 | 0.9049 |
| 0.2195 | 42.22 | 1140 | 0.3868 | 0.7195 | 0.7815 | 0.7492 | 0.9021 |
| 0.2195 | 42.59 | 1150 | 0.3913 | 0.7066 | 0.7815 | 0.7421 | 0.8993 |
| 0.2195 | 42.96 | 1160 | 0.3797 | 0.7222 | 0.7748 | 0.7476 | 0.9035 |
| 0.2195 | 43.33 | 1170 | 0.3709 | 0.7405 | 0.7748 | 0.7573 | 0.9105 |
| 0.2195 | 43.7 | 1180 | 0.3736 | 0.7358 | 0.7748 | 0.7548 | 0.9077 |
| 0.2195 | 44.07 | 1190 | 0.3664 | 0.7389 | 0.7682 | 0.7532 | 0.9091 |
| 0.2195 | 44.44 | 1200 | 0.3677 | 0.7358 | 0.7748 | 0.7548 | 0.9063 |
| 0.2195 | 44.81 | 1210 | 0.3805 | 0.7329 | 0.7815 | 0.7564 | 0.9077 |
| 0.2195 | 45.19 | 1220 | 0.3806 | 0.7329 | 0.7815 | 0.7564 | 0.9077 |
| 0.2195 | 45.56 | 1230 | 0.3712 | 0.7372 | 0.7616 | 0.7492 | 0.9035 |
| 0.2195 | 45.93 | 1240 | 0.3746 | 0.7308 | 0.7550 | 0.7427 | 0.9035 |
| 0.2195 | 46.3 | 1250 | 0.3725 | 0.7261 | 0.7550 | 0.7403 | 0.9049 |
| 0.2195 | 46.67 | 1260 | 0.3719 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.2195 | 47.04 | 1270 | 0.3718 | 0.7355 | 0.7550 | 0.7451 | 0.9063 |
| 0.2195 | 47.41 | 1280 | 0.3728 | 0.7355 | 0.7550 | 0.7451 | 0.9063 |
| 0.2195 | 47.78 | 1290 | 0.3740 | 0.7261 | 0.7550 | 0.7403 | 0.9035 |
| 0.2195 | 48.15 | 1300 | 0.3780 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 48.52 | 1310 | 0.3796 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 48.89 | 1320 | 0.3816 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 49.26 | 1330 | 0.3816 | 0.7278 | 0.7616 | 0.7443 | 0.9021 |
| 0.2195 | 49.63 | 1340 | 0.3803 | 0.7278 | 0.7616 | 0.7443 | 0.9021 |
| 0.2195 | 50.0 | 1350 | 0.3777 | 0.7308 | 0.7550 | 0.7427 | 0.9049 |
| 0.2195 | 50.37 | 1360 | 0.3810 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 50.74 | 1370 | 0.3793 | 0.7325 | 0.7616 | 0.7468 | 0.9063 |
| 0.2195 | 51.11 | 1380 | 0.3773 | 0.7308 | 0.7550 | 0.7427 | 0.9049 |
| 0.2195 | 51.48 | 1390 | 0.3791 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 51.85 | 1400 | 0.3822 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 52.22 | 1410 | 0.3830 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 52.59 | 1420 | 0.3797 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 52.96 | 1430 | 0.3791 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 53.33 | 1440 | 0.3790 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 53.7 | 1450 | 0.3792 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 54.07 | 1460 | 0.3786 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 54.44 | 1470 | 0.3778 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.2195 | 54.81 | 1480 | 0.3776 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.2195 | 55.19 | 1490 | 0.3774 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.1305 | 55.56 | 1500 | 0.3772 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
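In the table above, precision, recall, and F1 are entity-level (seqeval-style: a predicted BIO span counts only if its label and boundaries both match), while accuracy is per token. A minimal illustration of that distinction under the IOB2 scheme; these helpers are hypothetical sketches, not the Trainer's actual metric code:

```python
def extract_entities(tags):
    """Collect (label, start, end) spans from an IOB2 tag sequence."""
    entities, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        boundary = tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and tag[2:] != label)
        if boundary:
            if label is not None:
                entities.append((label, start, i))
            start, label = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return entities

def span_metrics(true_tags, pred_tags):
    """Entity-level precision/recall/F1 plus token-level accuracy."""
    gold, pred = set(extract_entities(true_tags)), set(extract_entities(pred_tags))
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = sum(t == p for t, p in zip(true_tags, pred_tags)) / len(true_tags)
    return precision, recall, f1, accuracy

# Gold has a PER span and a LOC span; the prediction recovers only PER:
# span_metrics(["B-PER", "I-PER", "O", "B-LOC"], ["B-PER", "I-PER", "O", "O"])
# -> precision 1.0, recall 0.5, F1 0.667, token accuracy 0.75
```

This is why early rows in the table can show precision 1.0 with near-zero recall: the model predicted almost no spans, but the few it did predict were exact matches.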
### Framework versions

- Transformers 4.29.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:16807b20cc5f17c36f5edf9bad0e9e9f90745348592c16961f6bec38804eb8ce
 size 1428419377
runs/Aug30_18-49-01_73d32e6ae637/events.out.tfevents.1693421366.73d32e6ae637.573.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:cb83ebafcc84376e6fec323b666aadeb399b8eac95f49d9aa91e3e3bce781845
+size 76337