2023-10-16 22:55:24,117 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,118 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-16 22:55:24,118 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,118 MultiCorpus: 6183 train + 680 dev + 2113 test sentences
 - NER_HIPE_2022 Corpus: 6183 train + 680 dev + 2113 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/topres19th/en/with_doc_seperator
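As a quick sanity check on the module dump above, the model's parameter count can be recomputed from the printed shapes alone. This is a sketch covering only the weights shown (it ignores buffers), using the usual conventions that `Linear(in, out, bias=True)` holds `in*out + out` parameters and `LayerNorm((768,))` holds `2*768`:

```python
# Recompute parameter counts from the shapes printed in the model dump.
H, FF, LAYERS = 768, 3072, 12  # hidden size, intermediate size, encoder depth

def linear(n_in, n_out):
    """Parameters of Linear(n_in, n_out, bias=True)."""
    return n_in * n_out + n_out

# word + position + token-type embeddings + embedding LayerNorm
embeddings = 32001 * H + 512 * H + 2 * H + 2 * H

per_layer = (
    3 * linear(H, H)   # query, key, value projections
    + linear(H, H)     # attention output dense
    + 2 * 2 * H        # two LayerNorms per layer
    + linear(H, FF)    # intermediate dense
    + linear(FF, H)    # output dense
)

pooler = linear(H, H)
bert_total = embeddings + LAYERS * per_layer + pooler
tagger_head = linear(H, 13)  # the (linear) classifier over the 13-tag dictionary

print(f"BertModel parameters: {bert_total:,}")   # 110,618,112
print(f"tagger head:          {tagger_head:,}")  # 9,997
```

This lands at roughly 110.6M backbone parameters, consistent with a BERT-base encoder with a 32001-token vocabulary.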
2023-10-16 22:55:24,118 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,118 Train:  6183 sentences
2023-10-16 22:55:24,118 (train_with_dev=False, train_with_test=False)
2023-10-16 22:55:24,118 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,118 Training Params:
2023-10-16 22:55:24,118  - learning_rate: "3e-05"
2023-10-16 22:55:24,118  - mini_batch_size: "4"
2023-10-16 22:55:24,119  - max_epochs: "10"
2023-10-16 22:55:24,119  - shuffle: "True"
2023-10-16 22:55:24,119 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,119 Plugins:
2023-10-16 22:55:24,119  - LinearScheduler | warmup_fraction: '0.1'
2023-10-16 22:55:24,119 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,119 Final evaluation on model from best epoch (best-model.pt)
2023-10-16 22:55:24,119  - metric: "('micro avg', 'f1-score')"
2023-10-16 22:55:24,119 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,119 Computation:
2023-10-16 22:55:24,119  - compute on device: cuda:0
2023-10-16 22:55:24,119  - embedding storage: none
2023-10-16 22:55:24,119 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,119 Model training base path: "hmbench-topres19th/en-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-16 22:55:24,119 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:24,119 ----------------------------------------------------------------------------------------------------
2023-10-16 22:55:30,952 epoch 1 - iter 154/1546 - loss 2.02538090 - time (sec): 6.83 - samples/sec: 1748.19 - lr: 0.000003 - momentum: 0.000000
2023-10-16 22:55:37,944 epoch 1 - iter 308/1546 - loss 1.10084678 - time (sec): 13.82 - samples/sec: 1753.65 - lr: 0.000006 - momentum: 0.000000
2023-10-16 22:55:45,034 epoch 1 - iter 462/1546 - loss 0.78647526 - time (sec): 20.91 - samples/sec: 1743.25 - lr: 0.000009 - momentum: 0.000000
2023-10-16 22:55:51,980 epoch 1 - iter 616/1546 - loss 0.62446099 - time (sec): 27.86 - samples/sec: 1741.14 - lr: 0.000012 - momentum: 0.000000
2023-10-16 22:55:58,942 epoch 1 - iter 770/1546 - loss 0.52442561 - time (sec): 34.82 - samples/sec: 1740.59 - lr: 0.000015 - momentum: 0.000000
2023-10-16 22:56:06,056 epoch 1 - iter 924/1546 - loss 0.45595178 - time (sec): 41.94 - samples/sec: 1747.95 - lr: 0.000018 - momentum: 0.000000
2023-10-16 22:56:12,979 epoch 1 - iter 1078/1546 - loss 0.40906220 - time (sec): 48.86 - samples/sec: 1748.23 - lr: 0.000021 - momentum: 0.000000
2023-10-16 22:56:19,931 epoch 1 - iter 1232/1546 - loss 0.36522487 - time (sec): 55.81 - samples/sec: 1782.15 - lr: 0.000024 - momentum: 0.000000
2023-10-16 22:56:26,757 epoch 1 - iter 1386/1546 - loss 0.33735166 - time (sec): 62.64 - samples/sec: 1778.01 - lr: 0.000027 - momentum: 0.000000
2023-10-16 22:56:33,780 epoch 1 - iter 1540/1546 - loss 0.31510388 - time (sec): 69.66 - samples/sec: 1778.34 - lr: 0.000030 - momentum: 0.000000
2023-10-16 22:56:34,038 ----------------------------------------------------------------------------------------------------
2023-10-16 22:56:34,038 EPOCH 1 done: loss 0.3141 - lr: 0.000030
2023-10-16 22:56:35,849 DEV : loss 0.09308421611785889 - f1-score (micro avg)  0.6875
2023-10-16 22:56:35,862 saving best model
2023-10-16 22:56:36,248 ----------------------------------------------------------------------------------------------------
2023-10-16 22:56:43,180 epoch 2 - iter 154/1546 - loss 0.08398905 - time (sec): 6.93 - samples/sec: 1878.88 - lr: 0.000030 - momentum: 0.000000
2023-10-16 22:56:50,219 epoch 2 - iter 308/1546 - loss 0.09542073 - time (sec): 13.97 - samples/sec: 1876.03 - lr: 0.000029 - momentum: 0.000000
2023-10-16 22:56:57,201 epoch 2 - iter 462/1546 - loss 0.08471087 - time (sec): 20.95 - samples/sec: 1892.41 - lr: 0.000029 - momentum: 0.000000
2023-10-16 22:57:04,205 epoch 2 - iter 616/1546 - loss 0.08071299 - time (sec): 27.95 - samples/sec: 1880.42 - lr: 0.000029 - momentum: 0.000000
2023-10-16 22:57:11,180 epoch 2 - iter 770/1546 - loss 0.08054955 - time (sec): 34.93 - samples/sec: 1852.22 - lr: 0.000028 - momentum: 0.000000
2023-10-16 22:57:18,042 epoch 2 - iter 924/1546 - loss 0.08042291 - time (sec): 41.79 - samples/sec: 1822.50 - lr: 0.000028 - momentum: 0.000000
2023-10-16 22:57:24,993 epoch 2 - iter 1078/1546 - loss 0.07999482 - time (sec): 48.74 - samples/sec: 1795.97 - lr: 0.000028 - momentum: 0.000000
2023-10-16 22:57:31,929 epoch 2 - iter 1232/1546 - loss 0.08098848 - time (sec): 55.68 - samples/sec: 1795.29 - lr: 0.000027 - momentum: 0.000000
2023-10-16 22:57:38,721 epoch 2 - iter 1386/1546 - loss 0.08104893 - time (sec): 62.47 - samples/sec: 1792.03 - lr: 0.000027 - momentum: 0.000000
2023-10-16 22:57:45,660 epoch 2 - iter 1540/1546 - loss 0.08159740 - time (sec): 69.41 - samples/sec: 1785.43 - lr: 0.000027 - momentum: 0.000000
2023-10-16 22:57:45,929 ----------------------------------------------------------------------------------------------------
2023-10-16 22:57:45,930 EPOCH 2 done: loss 0.0815 - lr: 0.000027
2023-10-16 22:57:48,443 DEV : loss 0.07893380522727966 - f1-score (micro avg)  0.7179
2023-10-16 22:57:48,456 saving best model
2023-10-16 22:57:48,916 ----------------------------------------------------------------------------------------------------
2023-10-16 22:57:55,989 epoch 3 - iter 154/1546 - loss 0.05779724 - time (sec): 7.07 - samples/sec: 1866.59 - lr: 0.000026 - momentum: 0.000000
2023-10-16 22:58:02,957 epoch 3 - iter 308/1546 - loss 0.05329053 - time (sec): 14.04 - samples/sec: 1877.95 - lr: 0.000026 - momentum: 0.000000
2023-10-16 22:58:09,919 epoch 3 - iter 462/1546 - loss 0.05290610 - time (sec): 21.00 - samples/sec: 1827.68 - lr: 0.000026 - momentum: 0.000000
2023-10-16 22:58:16,734 epoch 3 - iter 616/1546 - loss 0.05251651 - time (sec): 27.82 - samples/sec: 1852.75 - lr: 0.000025 - momentum: 0.000000
2023-10-16 22:58:23,802 epoch 3 - iter 770/1546 - loss 0.05620252 - time (sec): 34.88 - samples/sec: 1830.94 - lr: 0.000025 - momentum: 0.000000
2023-10-16 22:58:30,728 epoch 3 - iter 924/1546 - loss 0.05327156 - time (sec): 41.81 - samples/sec: 1810.74 - lr: 0.000025 - momentum: 0.000000
2023-10-16 22:58:37,749 epoch 3 - iter 1078/1546 - loss 0.05179250 - time (sec): 48.83 - samples/sec: 1801.70 - lr: 0.000024 - momentum: 0.000000
2023-10-16 22:58:44,683 epoch 3 - iter 1232/1546 - loss 0.05111602 - time (sec): 55.76 - samples/sec: 1787.00 - lr: 0.000024 - momentum: 0.000000
2023-10-16 22:58:51,659 epoch 3 - iter 1386/1546 - loss 0.05503167 - time (sec): 62.74 - samples/sec: 1784.02 - lr: 0.000024 - momentum: 0.000000
2023-10-16 22:58:58,674 epoch 3 - iter 1540/1546 - loss 0.05356182 - time (sec): 69.76 - samples/sec: 1775.63 - lr: 0.000023 - momentum: 0.000000
2023-10-16 22:58:58,969 ----------------------------------------------------------------------------------------------------
2023-10-16 22:58:58,969 EPOCH 3 done: loss 0.0534 - lr: 0.000023
2023-10-16 22:59:01,104 DEV : loss 0.07748623192310333 - f1-score (micro avg)  0.7722
2023-10-16 22:59:01,123 saving best model
2023-10-16 22:59:01,558 ----------------------------------------------------------------------------------------------------
2023-10-16 22:59:08,656 epoch 4 - iter 154/1546 - loss 0.02829162 - time (sec): 7.10 - samples/sec: 1872.64 - lr: 0.000023 - momentum: 0.000000
2023-10-16 22:59:15,863 epoch 4 - iter 308/1546 - loss 0.03022453 - time (sec): 14.30 - samples/sec: 1765.31 - lr: 0.000023 - momentum: 0.000000
2023-10-16 22:59:23,032 epoch 4 - iter 462/1546 - loss 0.03061131 - time (sec): 21.47 - samples/sec: 1784.48 - lr: 0.000022 - momentum: 0.000000
2023-10-16 22:59:29,960 epoch 4 - iter 616/1546 - loss 0.03200286 - time (sec): 28.40 - samples/sec: 1768.39 - lr: 0.000022 - momentum: 0.000000
2023-10-16 22:59:36,962 epoch 4 - iter 770/1546 - loss 0.03054981 - time (sec): 35.40 - samples/sec: 1754.04 - lr: 0.000022 - momentum: 0.000000
2023-10-16 22:59:44,121 epoch 4 - iter 924/1546 - loss 0.03188868 - time (sec): 42.56 - samples/sec: 1770.99 - lr: 0.000021 - momentum: 0.000000
2023-10-16 22:59:51,180 epoch 4 - iter 1078/1546 - loss 0.03290334 - time (sec): 49.62 - samples/sec: 1762.82 - lr: 0.000021 - momentum: 0.000000
2023-10-16 22:59:58,373 epoch 4 - iter 1232/1546 - loss 0.03474654 - time (sec): 56.81 - samples/sec: 1764.27 - lr: 0.000021 - momentum: 0.000000
2023-10-16 23:00:05,304 epoch 4 - iter 1386/1546 - loss 0.03409150 - time (sec): 63.74 - samples/sec: 1761.62 - lr: 0.000020 - momentum: 0.000000
2023-10-16 23:00:12,109 epoch 4 - iter 1540/1546 - loss 0.03479952 - time (sec): 70.55 - samples/sec: 1757.31 - lr: 0.000020 - momentum: 0.000000
2023-10-16 23:00:12,360 ----------------------------------------------------------------------------------------------------
2023-10-16 23:00:12,360 EPOCH 4 done: loss 0.0349 - lr: 0.000020
2023-10-16 23:00:14,474 DEV : loss 0.096384696662426 - f1-score (micro avg)  0.7439
2023-10-16 23:00:14,487 ----------------------------------------------------------------------------------------------------
2023-10-16 23:00:21,420 epoch 5 - iter 154/1546 - loss 0.02586135 - time (sec): 6.93 - samples/sec: 1708.87 - lr: 0.000020 - momentum: 0.000000
2023-10-16 23:00:28,521 epoch 5 - iter 308/1546 - loss 0.02697778 - time (sec): 14.03 - samples/sec: 1731.12 - lr: 0.000019 - momentum: 0.000000
2023-10-16 23:00:35,515 epoch 5 - iter 462/1546 - loss 0.02633540 - time (sec): 21.03 - samples/sec: 1744.38 - lr: 0.000019 - momentum: 0.000000
2023-10-16 23:00:42,412 epoch 5 - iter 616/1546 - loss 0.02506817 - time (sec): 27.92 - samples/sec: 1769.86 - lr: 0.000019 - momentum: 0.000000
2023-10-16 23:00:49,241 epoch 5 - iter 770/1546 - loss 0.02333240 - time (sec): 34.75 - samples/sec: 1771.95 - lr: 0.000018 - momentum: 0.000000
2023-10-16 23:00:56,162 epoch 5 - iter 924/1546 - loss 0.02299940 - time (sec): 41.67 - samples/sec: 1782.02 - lr: 0.000018 - momentum: 0.000000
2023-10-16 23:01:03,068 epoch 5 - iter 1078/1546 - loss 0.02391515 - time (sec): 48.58 - samples/sec: 1787.02 - lr: 0.000018 - momentum: 0.000000
2023-10-16 23:01:09,932 epoch 5 - iter 1232/1546 - loss 0.02530811 - time (sec): 55.44 - samples/sec: 1782.95 - lr: 0.000017 - momentum: 0.000000
2023-10-16 23:01:16,872 epoch 5 - iter 1386/1546 - loss 0.02533599 - time (sec): 62.38 - samples/sec: 1780.89 - lr: 0.000017 - momentum: 0.000000
2023-10-16 23:01:23,709 epoch 5 - iter 1540/1546 - loss 0.02481519 - time (sec): 69.22 - samples/sec: 1791.49 - lr: 0.000017 - momentum: 0.000000
2023-10-16 23:01:23,966 ----------------------------------------------------------------------------------------------------
2023-10-16 23:01:23,966 EPOCH 5 done: loss 0.0248 - lr: 0.000017
2023-10-16 23:01:26,008 DEV : loss 0.09776511788368225 - f1-score (micro avg)  0.7849
2023-10-16 23:01:26,022 saving best model
2023-10-16 23:01:26,478 ----------------------------------------------------------------------------------------------------
2023-10-16 23:01:33,699 epoch 6 - iter 154/1546 - loss 0.01891335 - time (sec): 7.22 - samples/sec: 1656.83 - lr: 0.000016 - momentum: 0.000000
2023-10-16 23:01:40,520 epoch 6 - iter 308/1546 - loss 0.01505548 - time (sec): 14.04 - samples/sec: 1735.36 - lr: 0.000016 - momentum: 0.000000
2023-10-16 23:01:47,391 epoch 6 - iter 462/1546 - loss 0.01687048 - time (sec): 20.91 - samples/sec: 1770.14 - lr: 0.000016 - momentum: 0.000000
2023-10-16 23:01:54,201 epoch 6 - iter 616/1546 - loss 0.01676100 - time (sec): 27.72 - samples/sec: 1795.31 - lr: 0.000015 - momentum: 0.000000
2023-10-16 23:02:01,043 epoch 6 - iter 770/1546 - loss 0.01743799 - time (sec): 34.56 - samples/sec: 1783.52 - lr: 0.000015 - momentum: 0.000000
2023-10-16 23:02:07,867 epoch 6 - iter 924/1546 - loss 0.01719557 - time (sec): 41.39 - samples/sec: 1786.16 - lr: 0.000015 - momentum: 0.000000
2023-10-16 23:02:14,832 epoch 6 - iter 1078/1546 - loss 0.01686034 - time (sec): 48.35 - samples/sec: 1810.74 - lr: 0.000014 - momentum: 0.000000
2023-10-16 23:02:21,723 epoch 6 - iter 1232/1546 - loss 0.01742908 - time (sec): 55.24 - samples/sec: 1811.68 - lr: 0.000014 - momentum: 0.000000
2023-10-16 23:02:28,514 epoch 6 - iter 1386/1546 - loss 0.01704174 - time (sec): 62.03 - samples/sec: 1803.99 - lr: 0.000014 - momentum: 0.000000
2023-10-16 23:02:35,311 epoch 6 - iter 1540/1546 - loss 0.01782016 - time (sec): 68.83 - samples/sec: 1797.96 - lr: 0.000013 - momentum: 0.000000
2023-10-16 23:02:35,582 ----------------------------------------------------------------------------------------------------
2023-10-16 23:02:35,582 EPOCH 6 done: loss 0.0177 - lr: 0.000013
2023-10-16 23:02:37,681 DEV : loss 0.10087885707616806 - f1-score (micro avg)  0.7967
2023-10-16 23:02:37,694 saving best model
2023-10-16 23:02:38,137 ----------------------------------------------------------------------------------------------------
2023-10-16 23:02:45,059 epoch 7 - iter 154/1546 - loss 0.00882572 - time (sec): 6.92 - samples/sec: 1801.10 - lr: 0.000013 - momentum: 0.000000
2023-10-16 23:02:51,970 epoch 7 - iter 308/1546 - loss 0.00959269 - time (sec): 13.83 - samples/sec: 1837.87 - lr: 0.000013 - momentum: 0.000000
2023-10-16 23:02:58,728 epoch 7 - iter 462/1546 - loss 0.01074153 - time (sec): 20.59 - samples/sec: 1826.45 - lr: 0.000012 - momentum: 0.000000
2023-10-16 23:03:05,614 epoch 7 - iter 616/1546 - loss 0.01103381 - time (sec): 27.48 - samples/sec: 1815.78 - lr: 0.000012 - momentum: 0.000000
2023-10-16 23:03:12,330 epoch 7 - iter 770/1546 - loss 0.01089398 - time (sec): 34.19 - samples/sec: 1795.14 - lr: 0.000012 - momentum: 0.000000
2023-10-16 23:03:19,062 epoch 7 - iter 924/1546 - loss 0.01051405 - time (sec): 40.92 - samples/sec: 1791.40 - lr: 0.000011 - momentum: 0.000000
2023-10-16 23:03:25,860 epoch 7 - iter 1078/1546 - loss 0.01078117 - time (sec): 47.72 - samples/sec: 1789.95 - lr: 0.000011 - momentum: 0.000000
2023-10-16 23:03:32,683 epoch 7 - iter 1232/1546 - loss 0.01046656 - time (sec): 54.55 - samples/sec: 1783.28 - lr: 0.000011 - momentum: 0.000000
2023-10-16 23:03:39,483 epoch 7 - iter 1386/1546 - loss 0.01062877 - time (sec): 61.34 - samples/sec: 1781.78 - lr: 0.000010 - momentum: 0.000000
2023-10-16 23:03:46,278 epoch 7 - iter 1540/1546 - loss 0.01097604 - time (sec): 68.14 - samples/sec: 1816.28 - lr: 0.000010 - momentum: 0.000000
2023-10-16 23:03:46,535 ----------------------------------------------------------------------------------------------------
2023-10-16 23:03:46,535 EPOCH 7 done: loss 0.0109 - lr: 0.000010
2023-10-16 23:03:48,934 DEV : loss 0.1023801639676094 - f1-score (micro avg)  0.7807
2023-10-16 23:03:48,947 ----------------------------------------------------------------------------------------------------
2023-10-16 23:03:55,722 epoch 8 - iter 154/1546 - loss 0.00276108 - time (sec): 6.77 - samples/sec: 1772.81 - lr: 0.000010 - momentum: 0.000000
2023-10-16 23:04:02,248 epoch 8 - iter 308/1546 - loss 0.00679671 - time (sec): 13.30 - samples/sec: 1827.75 - lr: 0.000009 - momentum: 0.000000
2023-10-16 23:04:08,788 epoch 8 - iter 462/1546 - loss 0.00528715 - time (sec): 19.84 - samples/sec: 1852.41 - lr: 0.000009 - momentum: 0.000000
2023-10-16 23:04:15,408 epoch 8 - iter 616/1546 - loss 0.00664770 - time (sec): 26.46 - samples/sec: 1867.03 - lr: 0.000009 - momentum: 0.000000
2023-10-16 23:04:22,270 epoch 8 - iter 770/1546 - loss 0.00694105 - time (sec): 33.32 - samples/sec: 1846.72 - lr: 0.000008 - momentum: 0.000000
2023-10-16 23:04:29,131 epoch 8 - iter 924/1546 - loss 0.00772621 - time (sec): 40.18 - samples/sec: 1837.91 - lr: 0.000008 - momentum: 0.000000
2023-10-16 23:04:35,982 epoch 8 - iter 1078/1546 - loss 0.00795917 - time (sec): 47.03 - samples/sec: 1842.25 - lr: 0.000008 - momentum: 0.000000
2023-10-16 23:04:42,806 epoch 8 - iter 1232/1546 - loss 0.00817950 - time (sec): 53.86 - samples/sec: 1845.29 - lr: 0.000007 - momentum: 0.000000
2023-10-16 23:04:49,669 epoch 8 - iter 1386/1546 - loss 0.00800634 - time (sec): 60.72 - samples/sec: 1838.42 - lr: 0.000007 - momentum: 0.000000
2023-10-16 23:04:56,447 epoch 8 - iter 1540/1546 - loss 0.00792214 - time (sec): 67.50 - samples/sec: 1834.35 - lr: 0.000007 - momentum: 0.000000
2023-10-16 23:04:56,713 ----------------------------------------------------------------------------------------------------
2023-10-16 23:04:56,713 EPOCH 8 done: loss 0.0079 - lr: 0.000007
2023-10-16 23:04:58,765 DEV : loss 0.1137828379869461 - f1-score (micro avg)  0.7879
2023-10-16 23:04:58,779 ----------------------------------------------------------------------------------------------------
2023-10-16 23:05:05,640 epoch 9 - iter 154/1546 - loss 0.00625578 - time (sec): 6.86 - samples/sec: 1822.09 - lr: 0.000006 - momentum: 0.000000
2023-10-16 23:05:12,362 epoch 9 - iter 308/1546 - loss 0.00569991 - time (sec): 13.58 - samples/sec: 1786.38 - lr: 0.000006 - momentum: 0.000000
2023-10-16 23:05:19,230 epoch 9 - iter 462/1546 - loss 0.00580179 - time (sec): 20.45 - samples/sec: 1833.99 - lr: 0.000006 - momentum: 0.000000
2023-10-16 23:05:25,993 epoch 9 - iter 616/1546 - loss 0.00507301 - time (sec): 27.21 - samples/sec: 1813.16 - lr: 0.000005 - momentum: 0.000000
2023-10-16 23:05:32,807 epoch 9 - iter 770/1546 - loss 0.00469977 - time (sec): 34.03 - samples/sec: 1799.76 - lr: 0.000005 - momentum: 0.000000
2023-10-16 23:05:39,691 epoch 9 - iter 924/1546 - loss 0.00605868 - time (sec): 40.91 - samples/sec: 1821.49 - lr: 0.000005 - momentum: 0.000000
2023-10-16 23:05:46,528 epoch 9 - iter 1078/1546 - loss 0.00553326 - time (sec): 47.75 - samples/sec: 1821.57 - lr: 0.000004 - momentum: 0.000000
2023-10-16 23:05:53,539 epoch 9 - iter 1232/1546 - loss 0.00506175 - time (sec): 54.76 - samples/sec: 1815.38 - lr: 0.000004 - momentum: 0.000000
2023-10-16 23:06:00,472 epoch 9 - iter 1386/1546 - loss 0.00474763 - time (sec): 61.69 - samples/sec: 1817.48 - lr: 0.000004 - momentum: 0.000000
2023-10-16 23:06:07,322 epoch 9 - iter 1540/1546 - loss 0.00470313 - time (sec): 68.54 - samples/sec: 1802.45 - lr: 0.000003 - momentum: 0.000000
2023-10-16 23:06:07,604 ----------------------------------------------------------------------------------------------------
2023-10-16 23:06:07,604 EPOCH 9 done: loss 0.0047 - lr: 0.000003
2023-10-16 23:06:09,632 DEV : loss 0.11162678897380829 - f1-score (micro avg)  0.8065
2023-10-16 23:06:09,645 saving best model
2023-10-16 23:06:10,098 ----------------------------------------------------------------------------------------------------
2023-10-16 23:06:16,992 epoch 10 - iter 154/1546 - loss 0.00029590 - time (sec): 6.89 - samples/sec: 1868.58 - lr: 0.000003 - momentum: 0.000000
2023-10-16 23:06:23,924 epoch 10 - iter 308/1546 - loss 0.00287854 - time (sec): 13.82 - samples/sec: 1847.28 - lr: 0.000003 - momentum: 0.000000
2023-10-16 23:06:30,870 epoch 10 - iter 462/1546 - loss 0.00262317 - time (sec): 20.77 - samples/sec: 1847.50 - lr: 0.000002 - momentum: 0.000000
2023-10-16 23:06:37,720 epoch 10 - iter 616/1546 - loss 0.00288842 - time (sec): 27.62 - samples/sec: 1830.00 - lr: 0.000002 - momentum: 0.000000
2023-10-16 23:06:44,747 epoch 10 - iter 770/1546 - loss 0.00280402 - time (sec): 34.65 - samples/sec: 1827.44 - lr: 0.000002 - momentum: 0.000000
2023-10-16 23:06:51,575 epoch 10 - iter 924/1546 - loss 0.00318869 - time (sec): 41.48 - samples/sec: 1806.38 - lr: 0.000001 - momentum: 0.000000
2023-10-16 23:06:58,391 epoch 10 - iter 1078/1546 - loss 0.00298211 - time (sec): 48.29 - samples/sec: 1804.58 - lr: 0.000001 - momentum: 0.000000
2023-10-16 23:07:05,298 epoch 10 - iter 1232/1546 - loss 0.00271568 - time (sec): 55.20 - samples/sec: 1808.08 - lr: 0.000001 - momentum: 0.000000
2023-10-16 23:07:12,158 epoch 10 - iter 1386/1546 - loss 0.00309708 - time (sec): 62.06 - samples/sec: 1794.54 - lr: 0.000000 - momentum: 0.000000
2023-10-16 23:07:19,086 epoch 10 - iter 1540/1546 - loss 0.00330551 - time (sec): 68.99 - samples/sec: 1793.54 - lr: 0.000000 - momentum: 0.000000
2023-10-16 23:07:19,363 ----------------------------------------------------------------------------------------------------
2023-10-16 23:07:19,363 EPOCH 10 done: loss 0.0033 - lr: 0.000000
2023-10-16 23:07:21,467 DEV : loss 0.11447464674711227 - f1-score (micro avg)  0.8
2023-10-16 23:07:21,847 ----------------------------------------------------------------------------------------------------
2023-10-16 23:07:21,848 Loading model from best epoch ...
2023-10-16 23:07:23,353 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-BUILDING, B-BUILDING, E-BUILDING, I-BUILDING, S-STREET, B-STREET, E-STREET, I-STREET
2023-10-16 23:07:29,624 Results:
- F-score (micro) 0.8094
- F-score (macro) 0.7264
- Accuracy 0.7028

By class:
              precision    recall  f1-score   support

         LOC     0.8396    0.8742    0.8566       946
    BUILDING     0.5812    0.6000    0.5904       185
      STREET     0.7321    0.7321    0.7321        56

   micro avg     0.7946    0.8248    0.8094      1187
   macro avg     0.7176    0.7355    0.7264      1187
weighted avg     0.7942    0.8248    0.8092      1187

2023-10-16 23:07:29,625 ----------------------------------------------------------------------------------------------------
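The averages in the final "By class" block can be cross-checked by hand: each per-class F1 is the harmonic mean of its precision and recall, macro avg is their unweighted mean, and weighted avg is the support-weighted mean. A sketch using the figures from the log (because the logged precision/recall are rounded to 4 digits, a recomputed per-class F1 can drift by ~1e-4 from the logged value, e.g. BUILDING):

```python
# Cross-check the evaluation table printed at the end of the training log.
classes = {  # class: (precision, recall, support) as logged
    "LOC":      (0.8396, 0.8742, 946),
    "BUILDING": (0.5812, 0.6000, 185),
    "STREET":   (0.7321, 0.7321, 56),
}

def f1(p, r):
    """F1 = harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

per_class = {c: f1(p, r) for c, (p, r, _) in classes.items()}
macro = sum(per_class.values()) / len(per_class)               # unweighted mean
total_support = sum(s for _, _, s in classes.values())
weighted = sum(f1(p, r) * s for p, r, s in classes.values()) / total_support
micro = f1(0.7946, 0.8248)  # from the logged micro-avg precision/recall

print(round(macro, 4), round(weighted, 4), round(micro, 4))
# → 0.7264 0.8092 0.8094, matching the logged macro / weighted / micro F1
```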