pranaydeeps committed on
Commit
21cd340
1 Parent(s): c94b23c

Upload folder using huggingface_hub

README.md ADDED
@@ -0,0 +1,108 @@
+ ---
+ license: mit
+ tags:
+ - generated_from_trainer
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
+ model-index:
+ - name: pos_final_mono_nl
+ results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # pos_final_mono_nl
+
+ This model is a fine-tuned version of [pdelobelle/robbert-v2-dutch-base](https://huggingface.co/pdelobelle/robbert-v2-dutch-base) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.1115
+ - Precision: 0.9783
+ - Recall: 0.9784
+ - F1: 0.9783
+ - Accuracy: 0.9791
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 256
+ - eval_batch_size: 256
+ - seed: 42
+ - gradient_accumulation_steps: 4
+ - total_train_batch_size: 1024
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - num_epochs: 40.0
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+ | No log | 1.0 | 69 | 3.7703 | 0.2597 | 0.1252 | 0.1689 | 0.2575 |
+ | No log | 2.0 | 138 | 1.0148 | 0.8058 | 0.8008 | 0.8033 | 0.8066 |
+ | No log | 3.0 | 207 | 0.3402 | 0.9302 | 0.9278 | 0.9290 | 0.9299 |
+ | No log | 4.0 | 276 | 0.2016 | 0.9559 | 0.9551 | 0.9555 | 0.9561 |
+ | No log | 5.0 | 345 | 0.1486 | 0.9643 | 0.9638 | 0.9641 | 0.9648 |
+ | No log | 6.0 | 414 | 0.1206 | 0.9697 | 0.9696 | 0.9697 | 0.9702 |
+ | No log | 7.0 | 483 | 0.1063 | 0.9720 | 0.9719 | 0.9720 | 0.9727 |
+ | 1.2192 | 8.0 | 552 | 0.0983 | 0.9734 | 0.9735 | 0.9735 | 0.9742 |
+ | 1.2192 | 9.0 | 621 | 0.0947 | 0.9746 | 0.9747 | 0.9746 | 0.9754 |
+ | 1.2192 | 10.0 | 690 | 0.0913 | 0.9753 | 0.9755 | 0.9754 | 0.9761 |
+ | 1.2192 | 11.0 | 759 | 0.0885 | 0.9761 | 0.9763 | 0.9762 | 0.9770 |
+ | 1.2192 | 12.0 | 828 | 0.0877 | 0.9764 | 0.9765 | 0.9764 | 0.9772 |
+ | 1.2192 | 13.0 | 897 | 0.0878 | 0.9767 | 0.9769 | 0.9768 | 0.9775 |
+ | 1.2192 | 14.0 | 966 | 0.0873 | 0.9767 | 0.9769 | 0.9768 | 0.9776 |
+ | 0.0688 | 15.0 | 1035 | 0.0877 | 0.9771 | 0.9773 | 0.9772 | 0.9779 |
+ | 0.0688 | 16.0 | 1104 | 0.0878 | 0.9773 | 0.9774 | 0.9773 | 0.9781 |
+ | 0.0688 | 17.0 | 1173 | 0.0897 | 0.9772 | 0.9773 | 0.9773 | 0.9781 |
+ | 0.0688 | 18.0 | 1242 | 0.0909 | 0.9775 | 0.9776 | 0.9776 | 0.9783 |
+ | 0.0688 | 19.0 | 1311 | 0.0917 | 0.9776 | 0.9778 | 0.9777 | 0.9785 |
+ | 0.0688 | 20.0 | 1380 | 0.0924 | 0.9778 | 0.9780 | 0.9779 | 0.9787 |
+ | 0.0688 | 21.0 | 1449 | 0.0949 | 0.9777 | 0.9779 | 0.9778 | 0.9785 |
+ | 0.0366 | 22.0 | 1518 | 0.0956 | 0.9776 | 0.9777 | 0.9777 | 0.9784 |
+ | 0.0366 | 23.0 | 1587 | 0.0962 | 0.9778 | 0.9780 | 0.9779 | 0.9786 |
+ | 0.0366 | 24.0 | 1656 | 0.0992 | 0.9777 | 0.9780 | 0.9779 | 0.9786 |
+ | 0.0366 | 25.0 | 1725 | 0.0999 | 0.9779 | 0.9781 | 0.9780 | 0.9787 |
+ | 0.0366 | 26.0 | 1794 | 0.1007 | 0.9780 | 0.9782 | 0.9781 | 0.9789 |
+ | 0.0366 | 27.0 | 1863 | 0.1022 | 0.9781 | 0.9782 | 0.9782 | 0.9789 |
+ | 0.0366 | 28.0 | 1932 | 0.1030 | 0.9781 | 0.9783 | 0.9782 | 0.9790 |
+ | 0.0226 | 29.0 | 2001 | 0.1055 | 0.9781 | 0.9782 | 0.9781 | 0.9789 |
+ | 0.0226 | 30.0 | 2070 | 0.1057 | 0.9780 | 0.9782 | 0.9781 | 0.9789 |
+ | 0.0226 | 31.0 | 2139 | 0.1067 | 0.9780 | 0.9781 | 0.9780 | 0.9788 |
+ | 0.0226 | 32.0 | 2208 | 0.1077 | 0.9780 | 0.9782 | 0.9781 | 0.9789 |
+ | 0.0226 | 33.0 | 2277 | 0.1085 | 0.9780 | 0.9781 | 0.9781 | 0.9789 |
+ | 0.0226 | 34.0 | 2346 | 0.1094 | 0.9781 | 0.9782 | 0.9781 | 0.9789 |
+ | 0.0226 | 35.0 | 2415 | 0.1095 | 0.9783 | 0.9784 | 0.9783 | 0.9791 |
+ | 0.0226 | 36.0 | 2484 | 0.1101 | 0.9780 | 0.9782 | 0.9781 | 0.9789 |
+ | 0.0159 | 37.0 | 2553 | 0.1114 | 0.9782 | 0.9784 | 0.9783 | 0.9791 |
+ | 0.0159 | 38.0 | 2622 | 0.1111 | 0.9782 | 0.9784 | 0.9783 | 0.9791 |
+ | 0.0159 | 39.0 | 2691 | 0.1114 | 0.9782 | 0.9784 | 0.9783 | 0.9791 |
+ | 0.0159 | 40.0 | 2760 | 0.1115 | 0.9783 | 0.9784 | 0.9783 | 0.9791 |
+
+
+ ### Framework versions
+
+ - Transformers 4.25.1
+ - Pytorch 1.12.0
+ - Datasets 2.18.0
+ - Tokenizers 0.13.2
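For reference, the precision/recall/F1 figures in the card above are token-classification metrics computed during evaluation. A minimal, self-contained sketch of how per-label precision, recall and F1 can be computed for flat tag sequences (illustrative only; the actual scores were produced by the Trainer's evaluation loop, and the short tag names below are invented):

```python
from collections import defaultdict

def per_label_prf(gold, pred):
    """Per-label precision, recall and F1 for flat (one tag per token) sequences."""
    tp, fp, fn = defaultdict(int), defaultdict(int), defaultdict(int)
    for g, p in zip(gold, pred):
        if g == p:
            tp[g] += 1
        else:
            fp[p] += 1  # predicted p where the gold tag was something else
            fn[g] += 1  # missed a true occurrence of g
    scores = {}
    for label in set(tp) | set(fp) | set(fn):
        prec = tp[label] / (tp[label] + fp[label]) if tp[label] + fp[label] else 0.0
        rec = tp[label] / (tp[label] + fn[label]) if tp[label] + fn[label] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        scores[label] = (prec, rec, f1)
    return scores
```

Averaging these per-label scores (micro or macro) yields single precision/recall/F1 numbers of the kind reported in the results table.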
all_results.json ADDED
@@ -0,0 +1,17 @@
+ {
+ "epoch": 40.0,
+ "eval_accuracy": 0.9791272496102441,
+ "eval_f1": 0.9783398772157638,
+ "eval_loss": 0.1115424633026123,
+ "eval_precision": 0.9782571951013384,
+ "eval_recall": 0.978422573307924,
+ "eval_runtime": 10.375,
+ "eval_samples": 2619,
+ "eval_samples_per_second": 758.46,
+ "eval_steps_per_second": 2.988,
+ "train_loss": 0.24823836001796998,
+ "train_runtime": 2048.5615,
+ "train_samples": 70812,
+ "train_samples_per_second": 1382.668,
+ "train_steps_per_second": 1.347
+ }
config.json ADDED
@@ -0,0 +1,360 @@
+ {
+ "_name_or_path": "pdelobelle/robbert-v2-dutch-base",
+ "architectures": [
+ "RobertaForTokenClassification"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "bos_token_id": 0,
+ "classifier_dropout": null,
+ "eos_token_id": 2,
+ "finetuning_task": "pos",
+ "gradient_checkpointing": false,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 768,
+ "id2label": {
+ "0": "",
+ "1": "ADJ(postnom,basis,met-s)",
+ "2": "VNW(onbep,grad,basis)",
+ "3": "VNW(pers,pron,3m,ev)",
+ "4": "BW()",
+ "5": "ADJ(nom,sup,met-e,mv-n)",
+ "6": "VNW(pers,pron,3,mv)",
+ "7": "VNW(vb,pron,3v,ev)",
+ "8": "VG(onder)",
+ "9": "N(soort,ev,basis,onz,stan)",
+ "10": "VNW(pers,pron,1,mv)",
+ "11": "VNW(pers,pron,3,ev,masc)",
+ "12": "TW(rang,nom,zonder-n)",
+ "13": "TSW()",
+ "14": "#not\t#",
+ "15": "WW(vd,nom,met-e,mv-n)",
+ "16": "ADJ(postnom,comp,zonder)",
+ "17": "TW(hoofd,nom,mv-n,basis)",
+ "18": "LID(bep)",
+ "19": "VNW(aanw,pron,3o,ev)",
+ "20": "N(eigen,mv,dim)",
+ "21": "SPEC(deeleigen)",
+ "22": "VNW(excl,pron,3,getal)",
+ "23": "WW(vd,prenom,met-e)",
+ "24": "VNW(refl,pron,3,getal)",
+ "25": "VNW(pers,pron,3,ev,onz)",
+ "26": "WW(inf,vrij,zonder)",
+ "27": "VNW(pers,pron,1,ev)",
+ "28": "ADJ(vrij,dim,zonder)",
+ "29": "TW(rang,nom,mv-n)",
+ "30": "VNW(vb,det)",
+ "31": "TW(hoofd,prenom,stan)",
+ "32": "SPEC(symb)",
+ "33": "VNW(betr,pron,3,ev)",
+ "34": "U",
+ "35": "WW(pv,conj,ev)",
+ "36": "N(soort,ev,dim,onz,stan)",
+ "37": "N(soort,ev,basis,zijd,stan)",
+ "38": "ADJ(prenom,comp,met-e,stan)",
+ "39": "zonder-n)",
+ "40": "ADJ(vrij,verder,zonder)",
+ "41": "N(eigen,ev,basis,onz,stan)",
+ "42": "N(eigen,ev,basis,gen)",
+ "43": "VNW(pr,pron,2,getal)",
+ "44": "@",
+ "45": "VNW(vb,pron,3m,ev)",
+ "46": "VNW(pers,pron,2,getal)",
+ "47": "VNW(bez,det,1,mv)",
+ "48": "N(soort,mv,dim)",
+ "49": "VZ(fin)",
+ "50": "WW(pv,tgw,mv)",
+ "51": "ADJ(nom,basis,zonder,zonder-n)",
+ "52": "VNW(aanw,adv-pron,3,getal)",
+ "53": "VNW(bez,det,3v,ev)",
+ "54": "TW(rang,prenom,stan)",
+ "55": "WW(inf,prenom,met-e)",
+ "56": "ADJ(nom,basis,met-e,zonder-n,bijz)",
+ "57": "Boulevard\tN(eigen,ev,basis,genus,stan)",
+ "58": "WW(od,nom,met-e,zonder-n)",
+ "59": "ADJ(nom,sup,met-e,zonder-n,stan)",
+ "60": "VNW(pers,pron,3,getal,fem)",
+ "61": "VNW(bez,det,3,mv)",
+ "62": "VNW(pers,pron,3m,ev,masc)",
+ "63": "VG(neven)",
+ "64": "VNW(recip,pron,persoon,mv)",
+ "65": "ADJ(nom,basis,zonder,mv-n)",
+ "66": "VNW(bez,det,3,ev)",
+ "67": "WW(od,nom,met-e,mv-n)",
+ "68": "VNW(vb,pron,3p,mv)",
+ "69": "VNW(onbep,adv-pron,3o,getal)",
+ "70": "ADJ(nom,comp,met-e,zonder-n,stan)",
+ "71": "ADJ(nom,basis,met-e,mv-n)",
+ "72": "N(eigen,ev,basis,zijd,stan)",
+ "73": "#",
+ "74": "VNW(aanw,adv-pron,3o,getal)",
+ "75": "VNW(bez,det,3p,mv)",
+ "76": "FW",
+ "77": "ADJ(nom,sup,zonder,zonder-n)",
+ "78": "VNW(bez,det,1,ev)",
+ "79": "VNW(pers,pron,3,ev,fem)",
+ "80": "WW(inf,prenom,zonder)",
+ "81": "WW(pv,tgw,met-t)",
+ "82": "VNW(pers,pron,2v,ev)",
+ "83": "VNW(pers,pron,3v,ev,fem)",
+ "84": "ADJ(nom,comp,met-e,mv-n)",
+ "85": "VNW(bez,det,2,getal)",
+ "86": "WW(pv,tegw,ev)",
+ "87": "VNW(pers,pron,3p,ev,masc)",
+ "88": "VNW(pr,pron,1,ev)",
+ "89": "VNW(onbep,det)",
+ "90": "N(eigen,mv,basis)",
+ "91": "VNW(aanw,det)",
+ "92": "VNW(bez,det,3m,ev)",
+ "93": "VZ(versm)",
+ "94": "N(soort,ev,basis,genus,stan)",
+ "95": "VNW(betr,det)",
+ "96": "TW(rang,prenom,bijz)",
+ "97": "TW(hoofd,nom,zonder-n,dim)",
+ "98": "VNW(pr,pron,1,mv)",
+ "99": "SPEC(afgebr)",
+ "100": "VNW(bez,det,2,mv)",
+ "101": "VNW(pers,pron,2v,mv)",
+ "102": "VNW(onbep,adv-pron,3,getal)",
+ "103": "TW(hoofd,nom,zonder,zonder-n)",
+ "104": "ADJ(postnom,basis,zonder)",
+ "105": "WW(pv,verl,mv)",
+ "106": "VNW(vb,pron,3p,getal)",
+ "107": "LET()",
+ "108": "ADJ(prenom,basis,zonder)",
+ "109": "ADJ(vrij,basis,zonder)",
+ "110": "ADJ(prenom,sup,zonder)",
+ "111": "N(soort,mv,basis)",
+ "112": "VNW(onbep,grad,sup)",
+ "113": "#NS\t#",
+ "114": "VNW(aanw,pron,3,getal)",
+ "115": "WW(vd,nom,met-e,zonder-n)",
+ "116": "~",
+ "117": "TW(hoofd,prenom,bijz)",
+ "118": "SPEC(vreemd)",
+ "119": "ADJ(vrij,sup,zonder)",
+ "120": "WW(od,prenom,met-e)",
+ "121": "ADJ(postnom,comp,met-s)",
+ "122": "TW(hoofd,vrij)",
+ "123": "VNW(bez,det,2v,ev)",
+ "124": "ADJ(prenom,basis,met-e,bijz)",
+ "125": "N(eigen,ev,basis,genus,stan)",
+ "126": "ADJ(vrij,comp,zonder)",
+ "127": "N(eigen,ev,dim,onz,stan)",
+ "128": "WW(inf,nom,zonder,zonder-n)",
+ "129": "WW(od,prenom,zonder)",
+ "130": "ADJ(prenom,sup,met-e,stan)",
+ "131": "SPEC(meta)",
+ "132": "VNW(pers,pron,3v,getal,fem)",
+ "133": "SPEC(enof)",
+ "134": "WW(vd,prenom,zonder)",
+ "135": "Jan",
+ "136": "WW(vd,vrij,zonder)",
+ "137": "VNW(aanw,pron,3m,ev)",
+ "138": "TW(hoofd,nom,zonder-n,basis)",
+ "139": "ADJ(prenom,basis,met-e,stan)",
+ "140": "ADJ(nom,basis,met-e,zonder-n,stan)",
+ "141": "SPEC(afk)",
+ "142": "N(soort,mv,basis,zijd,stan)",
+ "143": "VNW(vb,pron,3o,ev)",
+ "144": "LID(onbep)",
+ "145": "ADJ(prenom,comp,zonder)",
+ "146": "VNW(onbep,pron,3p,ev)",
+ "147": "VNW(onbep,pron,3o,ev)",
+ "148": "VGW()",
+ "149": "N(soort,ev,basis,dat)",
+ "150": "WW(pv,tgw,ev)",
+ "151": "VNW(pr,pron,2v,getal)",
+ "152": "WW(pv,verl,ev)",
+ "153": "VNW(vb,adv-pron,3o,getal)",
+ "154": "VNW(pers,pron,3p,mv)",
+ "155": "WW(od,vrij,zonder)",
+ "156": "VNW(pers,pron,2b,getal)",
+ "157": "VZ(init)",
+ "158": "VNW(bez,det,2v,mv)",
+ "159": "ADJ(prenom,basis,zonder,stan)",
+ "160": "VNW(onbep,grad,comp)",
+ "161": "N(soort,ev,basis,gen)",
+ "162": "VNW(betr,pron,persoon,getal)"
+ },
+ "initializer_range": 0.02,
+ "intermediate_size": 3072,
+ "label2id": {
+ "": 0,
+ "#": 73,
+ "#NS\t#": 113,
+ "#not\t#": 14,
+ "@": 44,
+ "ADJ(nom,basis,met-e,mv-n)": 71,
+ "ADJ(nom,basis,met-e,zonder-n,bijz)": 56,
+ "ADJ(nom,basis,met-e,zonder-n,stan)": 140,
+ "ADJ(nom,basis,zonder,mv-n)": 65,
+ "ADJ(nom,basis,zonder,zonder-n)": 51,
+ "ADJ(nom,comp,met-e,mv-n)": 84,
+ "ADJ(nom,comp,met-e,zonder-n,stan)": 70,
+ "ADJ(nom,sup,met-e,mv-n)": 5,
+ "ADJ(nom,sup,met-e,zonder-n,stan)": 59,
+ "ADJ(nom,sup,zonder,zonder-n)": 77,
+ "ADJ(postnom,basis,met-s)": 1,
+ "ADJ(postnom,basis,zonder)": 104,
+ "ADJ(postnom,comp,met-s)": 121,
+ "ADJ(postnom,comp,zonder)": 16,
+ "ADJ(prenom,basis,met-e,bijz)": 124,
+ "ADJ(prenom,basis,met-e,stan)": 139,
+ "ADJ(prenom,basis,zonder)": 108,
+ "ADJ(prenom,basis,zonder,stan)": 159,
+ "ADJ(prenom,comp,met-e,stan)": 38,
+ "ADJ(prenom,comp,zonder)": 145,
+ "ADJ(prenom,sup,met-e,stan)": 130,
+ "ADJ(prenom,sup,zonder)": 110,
+ "ADJ(vrij,basis,zonder)": 109,
+ "ADJ(vrij,comp,zonder)": 126,
+ "ADJ(vrij,dim,zonder)": 28,
+ "ADJ(vrij,sup,zonder)": 119,
+ "ADJ(vrij,verder,zonder)": 40,
+ "BW()": 4,
+ "Boulevard\tN(eigen,ev,basis,genus,stan)": 57,
+ "FW": 76,
+ "Jan": 135,
+ "LET()": 107,
+ "LID(bep)": 18,
+ "LID(onbep)": 144,
+ "N(eigen,ev,basis,gen)": 42,
+ "N(eigen,ev,basis,genus,stan)": 125,
+ "N(eigen,ev,basis,onz,stan)": 41,
+ "N(eigen,ev,basis,zijd,stan)": 72,
+ "N(eigen,ev,dim,onz,stan)": 127,
+ "N(eigen,mv,basis)": 90,
+ "N(eigen,mv,dim)": 20,
+ "N(soort,ev,basis,dat)": 149,
+ "N(soort,ev,basis,gen)": 161,
+ "N(soort,ev,basis,genus,stan)": 94,
+ "N(soort,ev,basis,onz,stan)": 9,
+ "N(soort,ev,basis,zijd,stan)": 37,
+ "N(soort,ev,dim,onz,stan)": 36,
+ "N(soort,mv,basis)": 111,
+ "N(soort,mv,basis,zijd,stan)": 142,
+ "N(soort,mv,dim)": 48,
+ "SPEC(afgebr)": 99,
+ "SPEC(afk)": 141,
+ "SPEC(deeleigen)": 21,
+ "SPEC(enof)": 133,
+ "SPEC(meta)": 131,
+ "SPEC(symb)": 32,
+ "SPEC(vreemd)": 118,
+ "TSW()": 13,
+ "TW(hoofd,nom,mv-n,basis)": 17,
+ "TW(hoofd,nom,zonder,zonder-n)": 103,
+ "TW(hoofd,nom,zonder-n,basis)": 138,
+ "TW(hoofd,nom,zonder-n,dim)": 97,
+ "TW(hoofd,prenom,bijz)": 117,
+ "TW(hoofd,prenom,stan)": 31,
+ "TW(hoofd,vrij)": 122,
+ "TW(rang,nom,mv-n)": 29,
+ "TW(rang,nom,zonder-n)": 12,
+ "TW(rang,prenom,bijz)": 96,
+ "TW(rang,prenom,stan)": 54,
+ "U": 34,
+ "VG(neven)": 63,
+ "VG(onder)": 8,
+ "VGW()": 148,
+ "VNW(aanw,adv-pron,3,getal)": 52,
+ "VNW(aanw,adv-pron,3o,getal)": 74,
+ "VNW(aanw,det)": 91,
+ "VNW(aanw,pron,3,getal)": 114,
+ "VNW(aanw,pron,3m,ev)": 137,
+ "VNW(aanw,pron,3o,ev)": 19,
+ "VNW(betr,det)": 95,
+ "VNW(betr,pron,3,ev)": 33,
+ "VNW(betr,pron,persoon,getal)": 162,
+ "VNW(bez,det,1,ev)": 78,
+ "VNW(bez,det,1,mv)": 47,
+ "VNW(bez,det,2,getal)": 85,
+ "VNW(bez,det,2,mv)": 100,
+ "VNW(bez,det,2v,ev)": 123,
+ "VNW(bez,det,2v,mv)": 158,
+ "VNW(bez,det,3,ev)": 66,
+ "VNW(bez,det,3,mv)": 61,
+ "VNW(bez,det,3m,ev)": 92,
+ "VNW(bez,det,3p,mv)": 75,
+ "VNW(bez,det,3v,ev)": 53,
+ "VNW(excl,pron,3,getal)": 22,
+ "VNW(onbep,adv-pron,3,getal)": 102,
+ "VNW(onbep,adv-pron,3o,getal)": 69,
+ "VNW(onbep,det)": 89,
+ "VNW(onbep,grad,basis)": 2,
+ "VNW(onbep,grad,comp)": 160,
+ "VNW(onbep,grad,sup)": 112,
+ "VNW(onbep,pron,3o,ev)": 147,
+ "VNW(onbep,pron,3p,ev)": 146,
+ "VNW(pers,pron,1,ev)": 27,
+ "VNW(pers,pron,1,mv)": 10,
+ "VNW(pers,pron,2,getal)": 46,
+ "VNW(pers,pron,2b,getal)": 156,
+ "VNW(pers,pron,2v,ev)": 82,
+ "VNW(pers,pron,2v,mv)": 101,
+ "VNW(pers,pron,3,ev,fem)": 79,
+ "VNW(pers,pron,3,ev,masc)": 11,
+ "VNW(pers,pron,3,ev,onz)": 25,
+ "VNW(pers,pron,3,getal,fem)": 60,
+ "VNW(pers,pron,3,mv)": 6,
+ "VNW(pers,pron,3m,ev)": 3,
+ "VNW(pers,pron,3m,ev,masc)": 62,
+ "VNW(pers,pron,3p,ev,masc)": 87,
+ "VNW(pers,pron,3p,mv)": 154,
+ "VNW(pers,pron,3v,ev,fem)": 83,
+ "VNW(pers,pron,3v,getal,fem)": 132,
+ "VNW(pr,pron,1,ev)": 88,
+ "VNW(pr,pron,1,mv)": 98,
+ "VNW(pr,pron,2,getal)": 43,
+ "VNW(pr,pron,2v,getal)": 151,
+ "VNW(recip,pron,persoon,mv)": 64,
+ "VNW(refl,pron,3,getal)": 24,
+ "VNW(vb,adv-pron,3o,getal)": 153,
+ "VNW(vb,det)": 30,
+ "VNW(vb,pron,3m,ev)": 45,
+ "VNW(vb,pron,3o,ev)": 143,
+ "VNW(vb,pron,3p,getal)": 106,
+ "VNW(vb,pron,3p,mv)": 68,
+ "VNW(vb,pron,3v,ev)": 7,
+ "VZ(fin)": 49,
+ "VZ(init)": 157,
+ "VZ(versm)": 93,
+ "WW(inf,nom,zonder,zonder-n)": 128,
+ "WW(inf,prenom,met-e)": 55,
+ "WW(inf,prenom,zonder)": 80,
+ "WW(inf,vrij,zonder)": 26,
+ "WW(od,nom,met-e,mv-n)": 67,
+ "WW(od,nom,met-e,zonder-n)": 58,
+ "WW(od,prenom,met-e)": 120,
+ "WW(od,prenom,zonder)": 129,
+ "WW(od,vrij,zonder)": 155,
+ "WW(pv,conj,ev)": 35,
+ "WW(pv,tegw,ev)": 86,
+ "WW(pv,tgw,ev)": 150,
+ "WW(pv,tgw,met-t)": 81,
+ "WW(pv,tgw,mv)": 50,
+ "WW(pv,verl,ev)": 152,
+ "WW(pv,verl,mv)": 105,
+ "WW(vd,nom,met-e,mv-n)": 15,
+ "WW(vd,nom,met-e,zonder-n)": 115,
+ "WW(vd,prenom,met-e)": 23,
+ "WW(vd,prenom,zonder)": 134,
+ "WW(vd,vrij,zonder)": 136,
+ "zonder-n)": 39,
+ "~": 116
+ },
+ "layer_norm_eps": 1e-05,
+ "max_position_embeddings": 514,
+ "model_type": "roberta",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 12,
+ "output_past": true,
+ "pad_token_id": 1,
+ "position_embedding_type": "absolute",
+ "torch_dtype": "float32",
+ "transformers_version": "4.25.1",
+ "type_vocab_size": 1,
+ "use_cache": true,
+ "vocab_size": 40000
+ }
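The `id2label` table in the config above maps the classifier's 163 output indices to CGN-style POS tags. A toy sketch of how such a mapping turns per-token class scores into tags (the three-entry mapping and the logits below are invented for illustration; the real head has 163 classes):

```python
# Toy version of the config's id2label lookup: take the argmax class per
# token and map it to its tag string. Invented three-entry mapping.
id2label = {0: "LID(bep)", 1: "N(soort,ev,basis,zijd,stan)", 2: "WW(pv,tgw,met-t)"}

def decode(token_logits):
    """Return one POS tag per token by argmax over its class scores."""
    return [id2label[max(range(len(s)), key=s.__getitem__)] for s in token_logits]

# e.g. "de hond blaft" -> determiner, noun, finite verb (made-up scores)
logits = [[2.1, 0.3, -1.0], [0.1, 3.2, 0.0], [-0.5, 0.2, 2.7]]
```

At inference time the model's `label2id` is the inverse mapping, used when aligning training labels to class indices.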
eval_results.json ADDED
@@ -0,0 +1,12 @@
+ {
+ "epoch": 40.0,
+ "eval_accuracy": 0.9791272496102441,
+ "eval_f1": 0.9783398772157638,
+ "eval_loss": 0.1115424633026123,
+ "eval_precision": 0.9782571951013384,
+ "eval_recall": 0.978422573307924,
+ "eval_runtime": 10.375,
+ "eval_samples": 2619,
+ "eval_samples_per_second": 758.46,
+ "eval_steps_per_second": 2.988
+ }
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:79795fe75400a2c8af49b37d5dc1a7b63e6c510916a1999033bee8928df0d779
+ size 465258545
special_tokens_map.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "bos_token": "<s>",
+ "cls_token": "<s>",
+ "eos_token": "</s>",
+ "mask_token": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": "<pad>",
+ "sep_token": "</s>",
+ "unk_token": "<unk>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,67 @@
+ {
+ "add_prefix_space": true,
+ "bos_token": {
+ "__type": "AddedToken",
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "cls_token": {
+ "__type": "AddedToken",
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "__type": "AddedToken",
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "errors": "replace",
+ "mask_token": {
+ "__type": "AddedToken",
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "max_length": 128,
+ "model_max_length": 512,
+ "name_or_path": "pdelobelle/robbert-v2-dutch-base",
+ "pad_token": {
+ "__type": "AddedToken",
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "__type": "AddedToken",
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "special_tokens_map_file": "./robbert-v2-dutch-base/special_tokens_map.json",
+ "token": null,
+ "tokenizer_class": "RobertaTokenizer",
+ "trim_offsets": true,
+ "unk_token": {
+ "__type": "AddedToken",
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
train_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+ "epoch": 40.0,
+ "train_loss": 0.24823836001796998,
+ "train_runtime": 2048.5615,
+ "train_samples": 70812,
+ "train_samples_per_second": 1382.668,
+ "train_steps_per_second": 1.347
+ }
trainer_state.json ADDED
@@ -0,0 +1,535 @@
+ {
+ "best_metric": 0.9783398772157638,
+ "best_model_checkpoint": "models/pos_final_mono_nl/checkpoint-2760",
+ "epoch": 39.99638989169675,
+ "global_step": 2760,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 1.0,
+ "eval_accuracy": 0.2575113142718755,
+ "eval_f1": 0.16891972475874908,
+ "eval_loss": 3.770303726196289,
+ "eval_precision": 0.25972286447785947,
+ "eval_recall": 0.12516136964406466,
+ "eval_runtime": 10.2346,
+ "eval_samples_per_second": 768.865,
+ "eval_steps_per_second": 3.029,
+ "step": 69
+ },
+ {
+ "epoch": 2.0,
+ "eval_accuracy": 0.8066356880136831,
+ "eval_f1": 0.8033051608452517,
+ "eval_loss": 1.0147907733917236,
+ "eval_precision": 0.8057864338897609,
+ "eval_recall": 0.8008391221491363,
+ "eval_runtime": 10.7861,
+ "eval_samples_per_second": 729.548,
+ "eval_steps_per_second": 2.874,
+ "step": 138
+ },
+ {
+ "epoch": 3.0,
+ "eval_accuracy": 0.929904490895606,
+ "eval_f1": 0.9289770104330163,
+ "eval_loss": 0.3402073085308075,
+ "eval_precision": 0.9301506840872673,
+ "eval_recall": 0.9278062949529723,
+ "eval_runtime": 10.421,
+ "eval_samples_per_second": 755.108,
+ "eval_steps_per_second": 2.975,
+ "step": 207
+ },
+ {
+ "epoch": 4.0,
+ "eval_accuracy": 0.9560597575188824,
+ "eval_f1": 0.9555069516784352,
+ "eval_loss": 0.20157238841056824,
+ "eval_precision": 0.9559442226785728,
+ "eval_recall": 0.9550700805311366,
+ "eval_runtime": 10.4533,
+ "eval_samples_per_second": 752.775,
+ "eval_steps_per_second": 2.966,
+ "step": 276
+ },
+ {
+ "epoch": 5.0,
+ "eval_accuracy": 0.964831156250473,
+ "eval_f1": 0.964063779010111,
+ "eval_loss": 0.14858682453632355,
+ "eval_precision": 0.9642898327887757,
+ "eval_recall": 0.9638378311919837,
+ "eval_runtime": 11.0648,
+ "eval_samples_per_second": 711.175,
+ "eval_steps_per_second": 2.802,
+ "step": 345
+ },
+ {
+ "epoch": 6.0,
+ "eval_accuracy": 0.9701515128581591,
+ "eval_f1": 0.9696606418295832,
+ "eval_loss": 0.12055634707212448,
+ "eval_precision": 0.9697202582231786,
+ "eval_recall": 0.969601032765722,
+ "eval_runtime": 10.8285,
+ "eval_samples_per_second": 726.696,
+ "eval_steps_per_second": 2.863,
+ "step": 414
+ },
+ {
+ "epoch": 7.0,
+ "eval_accuracy": 0.972701954076922,
+ "eval_f1": 0.9719634068091613,
+ "eval_loss": 0.10631231963634491,
+ "eval_precision": 0.9719820795967141,
+ "eval_recall": 0.9719447347390422,
+ "eval_runtime": 10.4365,
+ "eval_samples_per_second": 753.985,
+ "eval_steps_per_second": 2.97,
+ "step": 483
+ },
+ {
+ "epoch": 7.25,
+ "learning_rate": 5e-05,
+ "loss": 1.2192,
+ "step": 500
+ },
+ {
+ "epoch": 8.0,
+ "eval_accuracy": 0.9742231371183798,
+ "eval_f1": 0.9734790710183069,
+ "eval_loss": 0.09831023961305618,
+ "eval_precision": 0.9734304527887268,
+ "eval_recall": 0.973527694104629,
+ "eval_runtime": 10.9399,
+ "eval_samples_per_second": 719.291,
+ "eval_steps_per_second": 2.834,
+ "step": 552
+ },
+ {
+ "epoch": 9.0,
+ "eval_accuracy": 0.975388620642681,
+ "eval_f1": 0.9746202254443189,
+ "eval_loss": 0.09469176828861237,
+ "eval_precision": 0.974567806377257,
+ "eval_recall": 0.9746726501506117,
+ "eval_runtime": 10.9888,
+ "eval_samples_per_second": 716.095,
+ "eval_steps_per_second": 2.821,
+ "step": 621
+ },
+ {
+ "epoch": 10.0,
+ "eval_accuracy": 0.9761454281259934,
+ "eval_f1": 0.9753816837883316,
+ "eval_loss": 0.09128155559301376,
+ "eval_precision": 0.9752992516787289,
+ "eval_recall": 0.975464129833405,
+ "eval_runtime": 10.3458,
+ "eval_samples_per_second": 760.6,
+ "eval_steps_per_second": 2.996,
+ "step": 690
+ },
+ {
+ "epoch": 11.0,
+ "eval_accuracy": 0.9770081886569695,
+ "eval_f1": 0.9762231314470121,
+ "eval_loss": 0.08845613151788712,
+ "eval_precision": 0.9761368787406173,
+ "eval_recall": 0.9763093993975533,
+ "eval_runtime": 10.591,
+ "eval_samples_per_second": 742.993,
+ "eval_steps_per_second": 2.927,
+ "step": 759
+ },
+ {
+ "epoch": 12.0,
+ "eval_accuracy": 0.9772200947522969,
+ "eval_f1": 0.9764375477837924,
+ "eval_loss": 0.08773986995220184,
+ "eval_precision": 0.9763812802053,
+ "eval_recall": 0.976493821847913,
+ "eval_runtime": 10.1848,
+ "eval_samples_per_second": 772.621,
+ "eval_steps_per_second": 3.044,
+ "step": 828
+ },
+ {
+ "epoch": 13.0,
+ "eval_accuracy": 0.977530385820455,
+ "eval_f1": 0.9767615183960107,
+ "eval_loss": 0.0878407210111618,
+ "eval_precision": 0.9766527100218952,
+ "eval_recall": 0.9768703510173972,
+ "eval_runtime": 10.414,
+ "eval_samples_per_second": 755.619,
+ "eval_steps_per_second": 2.977,
+ "step": 897
+ },
+ {
+ "epoch": 14.0,
+ "eval_accuracy": 0.9775909304191199,
+ "eval_f1": 0.9767761714683069,
+ "eval_loss": 0.08732089400291443,
+ "eval_precision": 0.9766973731723995,
+ "eval_recall": 0.9768549824798672,
+ "eval_runtime": 10.2517,
+ "eval_samples_per_second": 767.579,
+ "eval_steps_per_second": 3.024,
+ "step": 966
+ },
+ {
+ "epoch": 14.49,
+ "learning_rate": 3.893805309734514e-05,
+ "loss": 0.0688,
+ "step": 1000
+ },
+ {
+ "epoch": 15.0,
+ "eval_accuracy": 0.9779466299362768,
+ "eval_f1": 0.9771878601613523,
+ "eval_loss": 0.08772371709346771,
+ "eval_precision": 0.9770827571371501,
+ "eval_recall": 0.9772929857994713,
+ "eval_runtime": 10.4702,
+ "eval_samples_per_second": 751.563,
+ "eval_steps_per_second": 2.961,
+ "step": 1035
+ },
+ {
+ "epoch": 16.0,
+ "eval_accuracy": 0.9781282637322718,
+ "eval_f1": 0.9773326264186318,
+ "eval_loss": 0.08782745897769928,
+ "eval_precision": 0.9772800614675374,
+ "eval_recall": 0.9773851970246511,
+ "eval_runtime": 10.0217,
+ "eval_samples_per_second": 785.196,
+ "eval_steps_per_second": 3.093,
+ "step": 1104
+ },
+ {
+ "epoch": 17.0,
+ "eval_accuracy": 0.9780601510587736,
+ "eval_f1": 0.9772529072559422,
+ "eval_loss": 0.08969255536794662,
+ "eval_precision": 0.9771590568603499,
+ "eval_recall": 0.9773467756808262,
+ "eval_runtime": 10.2237,
+ "eval_samples_per_second": 769.681,
+ "eval_steps_per_second": 3.032,
+ "step": 1173
+ },
+ {
+ "epoch": 18.0,
+ "eval_accuracy": 0.978325033677933,
+ "eval_f1": 0.9775826100987749,
+ "eval_loss": 0.09088694304227829,
+ "eval_precision": 0.9775187663749587,
+ "eval_recall": 0.9776464621626606,
+ "eval_runtime": 10.2448,
+ "eval_samples_per_second": 768.099,
+ "eval_steps_per_second": 3.026,
+ "step": 1242
+ },
+ {
+ "epoch": 19.0,
+ "eval_accuracy": 0.9784536909500962,
+ "eval_f1": 0.9776833564477773,
+ "eval_loss": 0.09170977026224136,
+ "eval_precision": 0.9775819549334296,
+ "eval_recall": 0.9777847790004304,
+ "eval_runtime": 10.5329,
+ "eval_samples_per_second": 747.089,
+ "eval_steps_per_second": 2.943,
+ "step": 1311
+ },
+ {
+ "epoch": 20.0,
+ "eval_accuracy": 0.978695869344756,
+ "eval_f1": 0.977932645393286,
+ "eval_loss": 0.09237655997276306,
+ "eval_precision": 0.977849997695109,
+ "eval_recall": 0.9780153070633798,
+ "eval_runtime": 10.1755,
+ "eval_samples_per_second": 773.331,
+ "eval_steps_per_second": 3.047,
+ "step": 1380
+ },
+ {
+ "epoch": 21.0,
+ "eval_accuracy": 0.9785369397732605,
+ "eval_f1": 0.9777796559381027,
+ "eval_loss": 0.09489051252603531,
+ "eval_precision": 0.9776669790882412,
+ "eval_recall": 0.9778923587631401,
+ "eval_runtime": 10.3245,
+ "eval_samples_per_second": 762.165,
+ "eval_steps_per_second": 3.003,
+ "step": 1449
+ },
+ {
+ "epoch": 21.74,
+ "learning_rate": 2.7876106194690264e-05,
+ "loss": 0.0366,
+ "step": 1500
+ },
+ {
+ "epoch": 22.0,
+ "eval_accuracy": 0.9783931463514312,
+ "eval_f1": 0.977655520039341,
+ "eval_loss": 0.09559858590364456,
+ "eval_precision": 0.9775954268854877,
+ "eval_recall": 0.9777156205815455,
+ "eval_runtime": 10.2656,
+ "eval_samples_per_second": 766.54,
+ "eval_steps_per_second": 3.02,
+ "step": 1518
+ },
+ {
+ "epoch": 23.0,
+ "eval_accuracy": 0.9785899162970924,
+ "eval_f1": 0.9778600762968741,
+ "eval_loss": 0.0962114930152893,
+ "eval_precision": 0.977758656453831,
+ "eval_recall": 0.977961517182025,
+ "eval_runtime": 10.2847,
+ "eval_samples_per_second": 765.116,
+ "eval_steps_per_second": 3.014,
+ "step": 1587
+ },
+ {
+ "epoch": 24.0,
+ "eval_accuracy": 0.978627756671258,
+ "eval_f1": 0.9778800497871751,
+ "eval_loss": 0.09919747710227966,
+ "eval_precision": 0.9777448299173401,
+ "eval_recall": 0.9780153070633798,
+ "eval_runtime": 10.2497,
+ "eval_samples_per_second": 767.73,
+ "eval_steps_per_second": 3.024,
+ "step": 1656
+ },
+ {
+ "epoch": 25.0,
+ "eval_accuracy": 0.9787488458685879,
+ "eval_f1": 0.9779865998709163,
+ "eval_loss": 0.09993624687194824,
+ "eval_precision": 0.9778964351567302,
+ "eval_recall": 0.9780767812134997,
+ "eval_runtime": 10.4654,
+ "eval_samples_per_second": 751.908,
+ "eval_steps_per_second": 2.962,
+ "step": 1725
+ },
+ {
+ "epoch": 26.0,
+ "eval_accuracy": 0.9788699350659179,
+ "eval_f1": 0.9781095368349878,
+ "eval_loss": 0.10065959393978119,
+ "eval_precision": 0.978019360786724,
+ "eval_recall": 0.9781997295137395,
+ "eval_runtime": 10.3069,
+ "eval_samples_per_second": 763.466,
+ "eval_steps_per_second": 3.008,
+ "step": 1794
+ },
+ {
+ "epoch": 27.0,
+ "eval_accuracy": 0.9789229115897498,
+ "eval_f1": 0.9781592282543133,
+ "eval_loss": 0.10217240452766418,
+ "eval_precision": 0.97808031838472,
+ "eval_recall": 0.9782381508575644,
+ "eval_runtime": 10.3246,
+ "eval_samples_per_second": 762.158,
+ "eval_steps_per_second": 3.003,
+ "step": 1863
+ },
+ {
+ "epoch": 28.0,
+ "eval_accuracy": 0.9790061604129142,
+ "eval_f1": 0.9782287156594198,
+ "eval_loss": 0.10301286727190018,
+ "eval_precision": 0.9781347715521547,
+ "eval_recall": 0.9783226778139792,
+ "eval_runtime": 10.6685,
+ "eval_samples_per_second": 737.591,
+ "eval_steps_per_second": 2.906,
+ "step": 1932
+ },
+ {
+ "epoch": 28.98,
+ "learning_rate": 1.6814159292035402e-05
367
+ "loss": 0.0226,
368
+ "step": 2000
369
+ },
370
+ {
371
+ "epoch": 29.0,
372
+ "eval_accuracy": 0.9789456158142492,
373
+ "eval_f1": 0.9781276533619175,
374
+ "eval_loss": 0.10546565800905228,
375
+ "eval_precision": 0.9780863177791267,
376
+ "eval_recall": 0.9781689924386795,
377
+ "eval_runtime": 10.2106,
378
+ "eval_samples_per_second": 770.668,
379
+ "eval_steps_per_second": 3.036,
380
+ "step": 2001
381
+ },
382
+ {
383
+ "epoch": 30.0,
384
+ "eval_accuracy": 0.9788775031407511,
385
+ "eval_f1": 0.9781016850177108,
386
+ "eval_loss": 0.10569430887699127,
387
+ "eval_precision": 0.9780190230335438,
388
+ "eval_recall": 0.9781843609762095,
389
+ "eval_runtime": 10.1623,
390
+ "eval_samples_per_second": 774.335,
391
+ "eval_steps_per_second": 3.051,
392
+ "step": 2070
393
+ },
394
+ {
395
+ "epoch": 31.0,
396
+ "eval_accuracy": 0.9788169585420861,
397
+ "eval_f1": 0.9780400473314586,
398
+ "eval_loss": 0.10669872909784317,
399
+ "eval_precision": 0.9779649036540766,
400
+ "eval_recall": 0.9781152025573246,
401
+ "eval_runtime": 10.228,
402
+ "eval_samples_per_second": 769.357,
403
+ "eval_steps_per_second": 3.031,
404
+ "step": 2139
405
+ },
406
+ {
407
+ "epoch": 32.0,
408
+ "eval_accuracy": 0.9788850712155842,
409
+ "eval_f1": 0.9781134626983792,
410
+ "eval_loss": 0.10771454125642776,
411
+ "eval_precision": 0.9780195296594217,
412
+ "eval_recall": 0.9782074137825044,
413
+ "eval_runtime": 10.2465,
414
+ "eval_samples_per_second": 767.969,
415
+ "eval_steps_per_second": 3.025,
416
+ "step": 2208
417
+ },
418
+ {
419
+ "epoch": 33.0,
420
+ "eval_accuracy": 0.9788547989162517,
421
+ "eval_f1": 0.9780702765419577,
422
+ "eval_loss": 0.10846679657697678,
423
+ "eval_precision": 0.9780176719170188,
424
+ "eval_recall": 0.9781228868260896,
425
+ "eval_runtime": 10.262,
426
+ "eval_samples_per_second": 766.809,
427
+ "eval_steps_per_second": 3.021,
428
+ "step": 2277
429
+ },
430
+ {
431
+ "epoch": 34.0,
432
+ "eval_accuracy": 0.9789153435149167,
433
+ "eval_f1": 0.9781402710760058,
434
+ "eval_loss": 0.10942833125591278,
435
+ "eval_precision": 0.9780500921942225,
436
+ "eval_recall": 0.9782304665887994,
437
+ "eval_runtime": 10.2289,
438
+ "eval_samples_per_second": 769.29,
439
+ "eval_steps_per_second": 3.031,
440
+ "step": 2346
441
+ },
442
+ {
443
+ "epoch": 35.0,
444
+ "eval_accuracy": 0.9791272496102441,
445
+ "eval_f1": 0.9783318606170041,
446
+ "eval_loss": 0.10954407602548599,
447
+ "eval_precision": 0.9782642100895862,
448
+ "eval_recall": 0.978399520501629,
449
+ "eval_runtime": 10.2671,
450
+ "eval_samples_per_second": 766.432,
451
+ "eval_steps_per_second": 3.019,
452
+ "step": 2415
453
+ },
454
+ {
455
+ "epoch": 36.0,
456
+ "eval_accuracy": 0.9788775031407511,
457
+ "eval_f1": 0.9780747081173908,
458
+ "eval_loss": 0.11010610312223434,
459
+ "eval_precision": 0.9779958050661893,
460
+ "eval_recall": 0.9781536239011496,
461
+ "eval_runtime": 10.1692,
462
+ "eval_samples_per_second": 773.807,
463
+ "eval_steps_per_second": 3.048,
464
+ "step": 2484
465
+ },
466
+ {
467
+ "epoch": 36.23,
468
+ "learning_rate": 5.752212389380531e-06,
469
+ "loss": 0.0159,
470
+ "step": 2500
471
+ },
472
+ {
473
+ "epoch": 37.0,
474
+ "eval_accuracy": 0.9791045453857448,
475
+ "eval_f1": 0.9783088094048946,
476
+ "eval_loss": 0.11143232136964798,
477
+ "eval_precision": 0.9782411604714415,
478
+ "eval_recall": 0.9783764676953342,
479
+ "eval_runtime": 10.184,
480
+ "eval_samples_per_second": 772.684,
481
+ "eval_steps_per_second": 3.044,
482
+ "step": 2553
483
+ },
484
+ {
485
+ "epoch": 38.0,
486
+ "eval_accuracy": 0.9791272496102441,
487
+ "eval_f1": 0.9783286015589884,
488
+ "eval_loss": 0.11111290007829666,
489
+ "eval_precision": 0.9782346478591898,
490
+ "eval_recall": 0.978422573307924,
491
+ "eval_runtime": 10.011,
492
+ "eval_samples_per_second": 786.035,
493
+ "eval_steps_per_second": 3.097,
494
+ "step": 2622
495
+ },
496
+ {
497
+ "epoch": 39.0,
498
+ "eval_accuracy": 0.9790894092360786,
499
+ "eval_f1": 0.9782975339329141,
500
+ "eval_loss": 0.11137838661670685,
501
+ "eval_precision": 0.978218612905952,
502
+ "eval_recall": 0.9783764676953342,
503
+ "eval_runtime": 10.1814,
504
+ "eval_samples_per_second": 772.882,
505
+ "eval_steps_per_second": 3.045,
506
+ "step": 2691
507
+ },
508
+ {
509
+ "epoch": 40.0,
510
+ "eval_accuracy": 0.9791272496102441,
511
+ "eval_f1": 0.9783398772157638,
512
+ "eval_loss": 0.1115424633026123,
513
+ "eval_precision": 0.9782571951013384,
514
+ "eval_recall": 0.978422573307924,
515
+ "eval_runtime": 10.2449,
516
+ "eval_samples_per_second": 768.087,
517
+ "eval_steps_per_second": 3.026,
518
+ "step": 2760
519
+ },
520
+ {
521
+ "epoch": 40.0,
522
+ "step": 2760,
523
+ "total_flos": 1.3845487354146643e+17,
524
+ "train_loss": 0.24823836001796998,
525
+ "train_runtime": 2048.5615,
526
+ "train_samples_per_second": 1382.668,
527
+ "train_steps_per_second": 1.347
528
+ }
529
+ ],
530
+ "max_steps": 2760,
531
+ "num_train_epochs": 40,
532
+ "total_flos": 1.3845487354146643e+17,
533
+ "trial_name": null,
534
+ "trial_params": null
535
+ }
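The `log_history` above mixes two kinds of entries: periodic training logs (with `loss` and `learning_rate`) and per-epoch evaluation logs (with `eval_f1`, `eval_loss`, etc.). A minimal sketch of scanning the uploaded `trainer_state.json` for the best-scoring checkpoint, assuming the standard Hugging Face Trainer layout; the helper name `best_eval_entry` and the small inline `history` sample are illustrative only:

```python
import json  # for loading a real trainer_state.json, e.g.:
# state = json.load(open("trainer_state.json"))
# history = state["log_history"]

def best_eval_entry(log_history):
    """Return the log entry with the highest eval_f1.

    Entries without eval metrics (the periodic training-loss
    logs at steps 1500, 2000, 2500 above) are skipped.
    """
    evals = [e for e in log_history if "eval_f1" in e]
    if not evals:
        raise ValueError("no evaluation entries in log_history")
    return max(evals, key=lambda e: e["eval_f1"])

# Tiny sample using values copied from the log above:
history = [
    {"epoch": 36.23, "learning_rate": 5.752212389380531e-06,
     "loss": 0.0159, "step": 2500},
    {"epoch": 39.0, "eval_f1": 0.9782975339329141, "step": 2691},
    {"epoch": 40.0, "eval_f1": 0.9783398772157638, "step": 2760},
]
best = best_eval_entry(history)
print(best["epoch"], best["step"])  # → 40.0 2760
```

On this sample the final epoch wins, which matches the full log: the F1 at epoch 40 (0.97834) is the highest of all eval entries shown.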
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3bf4de53a5ef20802f03f57efada97b2e49b1138c6761b6e66f5b4ae2aed8958
+ size 3439
vocab.json ADDED
The diff for this file is too large to render. See raw diff