|
2023-10-25 18:37:38,397 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,398 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
|
2023-10-25 18:37:38,398 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,398 MultiCorpus: 7142 train + 698 dev + 2570 test sentences
 - NER_HIPE_2022 Corpus: 7142 train + 698 dev + 2570 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/fr/with_doc_seperator
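
The corpus logged above is the HIPE-2022 NewsEye French split. A minimal sketch of how it could be loaded and inspected with Flair — the `NER_HIPE_2022` loader and its keyword arguments are assumptions and may differ between Flair versions:

```python
from flair.datasets import NER_HIPE_2022

# Load the NewsEye French split of HIPE-2022 (downloads to ~/.flair/datasets on first use).
# dataset_name/language are assumed argument names for Flair's HIPE-2022 loader.
corpus = NER_HIPE_2022(dataset_name="newseye", language="fr")

# Should report roughly the 7142 train / 698 dev / 2570 test sentences shown above.
print(corpus)

# Build the NER label dictionary the tagger is trained on (17 tags in this run,
# see the "SequenceTagger predicts" line near the end of the log).
label_dict = corpus.make_label_dictionary(label_type="ner")
print(label_dict)
```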
|
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Train: 7142 sentences
2023-10-25 18:37:38,399 (train_with_dev=False, train_with_test=False)
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Training Params:
2023-10-25 18:37:38,399 - learning_rate: "5e-05"
2023-10-25 18:37:38,399 - mini_batch_size: "4"
2023-10-25 18:37:38,399 - max_epochs: "10"
2023-10-25 18:37:38,399 - shuffle: "True"
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Plugins:
2023-10-25 18:37:38,399 - TensorboardLogger
2023-10-25 18:37:38,399 - LinearScheduler | warmup_fraction: '0.1'
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Final evaluation on model from best epoch (best-model.pt)
2023-10-25 18:37:38,399 - metric: "('micro avg', 'f1-score')"
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Computation:
2023-10-25 18:37:38,399 - compute on device: cuda:0
2023-10-25 18:37:38,399 - embedding storage: none
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Model training base path: "hmbench-newseye/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 ----------------------------------------------------------------------------------------------------
2023-10-25 18:37:38,399 Logging anything other than scalars to TensorBoard is currently not supported.
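
The configuration logged above (last transformer layer only, first-subtoken pooling, no CRF, learning rate 5e-05, mini-batch size 4, 10 epochs, linear schedule with warmup fraction 0.1) corresponds roughly to the fine-tuning sketch below. The embedding model name is inferred from the training base path, and constructor/argument names are assumptions that may vary across Flair versions; the TensorBoard plugin is omitted.

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Corpus and label dictionary, as in the corpus sketch above.
corpus = NER_HIPE_2022(dataset_name="newseye", language="fr")
label_dict = corpus.make_label_dictionary(label_type="ner")

# Transformer embeddings matching the architecture printout: last layer only ("-1"),
# first-subtoken pooling, fine-tuned together with the tagger.
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",  # inferred from the base path
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Plain linear tag head on top of the embeddings: no CRF, no RNN ("crfFalse" in the base path).
# hidden_size is required by the constructor but unused when use_rnn=False.
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# fine_tune() trains with AdamW and a linear learning-rate schedule with warmup,
# which matches the LinearScheduler plugin (warmup_fraction 0.1) reported above.
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-newseye/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5",
    learning_rate=5e-05,
    mini_batch_size=4,
    max_epochs=10,
)
```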
|
2023-10-25 18:37:47,542 epoch 1 - iter 178/1786 - loss 1.40453799 - time (sec): 9.14 - samples/sec: 2496.26 - lr: 0.000005 - momentum: 0.000000
2023-10-25 18:37:56,903 epoch 1 - iter 356/1786 - loss 0.90807035 - time (sec): 18.50 - samples/sec: 2525.74 - lr: 0.000010 - momentum: 0.000000
2023-10-25 18:38:06,238 epoch 1 - iter 534/1786 - loss 0.69820556 - time (sec): 27.84 - samples/sec: 2559.06 - lr: 0.000015 - momentum: 0.000000
2023-10-25 18:38:15,506 epoch 1 - iter 712/1786 - loss 0.56410304 - time (sec): 37.11 - samples/sec: 2650.91 - lr: 0.000020 - momentum: 0.000000
2023-10-25 18:38:24,825 epoch 1 - iter 890/1786 - loss 0.48580068 - time (sec): 46.42 - samples/sec: 2644.52 - lr: 0.000025 - momentum: 0.000000
2023-10-25 18:38:34,115 epoch 1 - iter 1068/1786 - loss 0.43280563 - time (sec): 55.71 - samples/sec: 2626.42 - lr: 0.000030 - momentum: 0.000000
2023-10-25 18:38:43,163 epoch 1 - iter 1246/1786 - loss 0.39270958 - time (sec): 64.76 - samples/sec: 2638.36 - lr: 0.000035 - momentum: 0.000000
2023-10-25 18:38:52,184 epoch 1 - iter 1424/1786 - loss 0.36098717 - time (sec): 73.78 - samples/sec: 2661.58 - lr: 0.000040 - momentum: 0.000000
2023-10-25 18:39:01,242 epoch 1 - iter 1602/1786 - loss 0.33608692 - time (sec): 82.84 - samples/sec: 2688.79 - lr: 0.000045 - momentum: 0.000000
2023-10-25 18:39:10,573 epoch 1 - iter 1780/1786 - loss 0.31930380 - time (sec): 92.17 - samples/sec: 2689.56 - lr: 0.000050 - momentum: 0.000000
2023-10-25 18:39:10,863 ----------------------------------------------------------------------------------------------------
2023-10-25 18:39:10,864 EPOCH 1 done: loss 0.3186 - lr: 0.000050
2023-10-25 18:39:15,004 DEV : loss 0.10816145688295364 - f1-score (micro avg) 0.7071
2023-10-25 18:39:15,026 saving best model
2023-10-25 18:39:15,535 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:39:25,230 epoch 2 - iter 178/1786 - loss 0.11968307 - time (sec): 9.69 - samples/sec: 2666.97 - lr: 0.000049 - momentum: 0.000000
2023-10-25 18:39:34,400 epoch 2 - iter 356/1786 - loss 0.12356003 - time (sec): 18.86 - samples/sec: 2568.18 - lr: 0.000049 - momentum: 0.000000
2023-10-25 18:39:43,423 epoch 2 - iter 534/1786 - loss 0.12183286 - time (sec): 27.89 - samples/sec: 2629.03 - lr: 0.000048 - momentum: 0.000000
2023-10-25 18:39:52,372 epoch 2 - iter 712/1786 - loss 0.12132289 - time (sec): 36.83 - samples/sec: 2689.32 - lr: 0.000048 - momentum: 0.000000
2023-10-25 18:40:01,037 epoch 2 - iter 890/1786 - loss 0.12094300 - time (sec): 45.50 - samples/sec: 2683.34 - lr: 0.000047 - momentum: 0.000000
2023-10-25 18:40:09,856 epoch 2 - iter 1068/1786 - loss 0.12136206 - time (sec): 54.32 - samples/sec: 2703.63 - lr: 0.000047 - momentum: 0.000000
2023-10-25 18:40:18,988 epoch 2 - iter 1246/1786 - loss 0.12150787 - time (sec): 63.45 - samples/sec: 2703.34 - lr: 0.000046 - momentum: 0.000000
2023-10-25 18:40:28,126 epoch 2 - iter 1424/1786 - loss 0.12125528 - time (sec): 72.59 - samples/sec: 2734.94 - lr: 0.000046 - momentum: 0.000000
2023-10-25 18:40:37,434 epoch 2 - iter 1602/1786 - loss 0.12087874 - time (sec): 81.90 - samples/sec: 2723.53 - lr: 0.000045 - momentum: 0.000000
2023-10-25 18:40:46,385 epoch 2 - iter 1780/1786 - loss 0.12089949 - time (sec): 90.85 - samples/sec: 2730.30 - lr: 0.000044 - momentum: 0.000000
2023-10-25 18:40:46,664 ----------------------------------------------------------------------------------------------------
2023-10-25 18:40:46,665 EPOCH 2 done: loss 0.1209 - lr: 0.000044
2023-10-25 18:40:50,825 DEV : loss 0.11196932196617126 - f1-score (micro avg) 0.7568
2023-10-25 18:40:50,846 saving best model
2023-10-25 18:40:51,544 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:41:00,933 epoch 3 - iter 178/1786 - loss 0.07216058 - time (sec): 9.39 - samples/sec: 2668.52 - lr: 0.000044 - momentum: 0.000000
2023-10-25 18:41:11,563 epoch 3 - iter 356/1786 - loss 0.08332331 - time (sec): 20.02 - samples/sec: 2458.48 - lr: 0.000043 - momentum: 0.000000
2023-10-25 18:41:21,065 epoch 3 - iter 534/1786 - loss 0.08767583 - time (sec): 29.52 - samples/sec: 2493.08 - lr: 0.000043 - momentum: 0.000000
2023-10-25 18:41:30,647 epoch 3 - iter 712/1786 - loss 0.08415718 - time (sec): 39.10 - samples/sec: 2524.84 - lr: 0.000042 - momentum: 0.000000
2023-10-25 18:41:40,016 epoch 3 - iter 890/1786 - loss 0.08739794 - time (sec): 48.47 - samples/sec: 2552.95 - lr: 0.000042 - momentum: 0.000000
2023-10-25 18:41:49,509 epoch 3 - iter 1068/1786 - loss 0.08687276 - time (sec): 57.96 - samples/sec: 2572.12 - lr: 0.000041 - momentum: 0.000000
2023-10-25 18:41:59,111 epoch 3 - iter 1246/1786 - loss 0.08597967 - time (sec): 67.57 - samples/sec: 2587.39 - lr: 0.000041 - momentum: 0.000000
2023-10-25 18:42:08,462 epoch 3 - iter 1424/1786 - loss 0.08636941 - time (sec): 76.92 - samples/sec: 2559.26 - lr: 0.000040 - momentum: 0.000000
2023-10-25 18:42:17,442 epoch 3 - iter 1602/1786 - loss 0.08593164 - time (sec): 85.90 - samples/sec: 2587.64 - lr: 0.000039 - momentum: 0.000000
2023-10-25 18:42:26,319 epoch 3 - iter 1780/1786 - loss 0.08507425 - time (sec): 94.77 - samples/sec: 2616.24 - lr: 0.000039 - momentum: 0.000000
2023-10-25 18:42:26,608 ----------------------------------------------------------------------------------------------------
2023-10-25 18:42:26,608 EPOCH 3 done: loss 0.0850 - lr: 0.000039
2023-10-25 18:42:30,919 DEV : loss 0.13416369259357452 - f1-score (micro avg) 0.7734
2023-10-25 18:42:30,941 saving best model
2023-10-25 18:42:31,602 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:42:40,853 epoch 4 - iter 178/1786 - loss 0.05392007 - time (sec): 9.25 - samples/sec: 2660.38 - lr: 0.000038 - momentum: 0.000000
2023-10-25 18:42:50,351 epoch 4 - iter 356/1786 - loss 0.05706668 - time (sec): 18.75 - samples/sec: 2686.77 - lr: 0.000038 - momentum: 0.000000
2023-10-25 18:42:59,263 epoch 4 - iter 534/1786 - loss 0.06080001 - time (sec): 27.66 - samples/sec: 2693.61 - lr: 0.000037 - momentum: 0.000000
2023-10-25 18:43:08,040 epoch 4 - iter 712/1786 - loss 0.06411629 - time (sec): 36.44 - samples/sec: 2714.05 - lr: 0.000037 - momentum: 0.000000
2023-10-25 18:43:17,331 epoch 4 - iter 890/1786 - loss 0.06293852 - time (sec): 45.73 - samples/sec: 2693.11 - lr: 0.000036 - momentum: 0.000000
2023-10-25 18:43:26,753 epoch 4 - iter 1068/1786 - loss 0.06522331 - time (sec): 55.15 - samples/sec: 2696.19 - lr: 0.000036 - momentum: 0.000000
2023-10-25 18:43:35,821 epoch 4 - iter 1246/1786 - loss 0.06635384 - time (sec): 64.22 - samples/sec: 2691.82 - lr: 0.000035 - momentum: 0.000000
2023-10-25 18:43:44,420 epoch 4 - iter 1424/1786 - loss 0.06654842 - time (sec): 72.82 - samples/sec: 2724.98 - lr: 0.000034 - momentum: 0.000000
2023-10-25 18:43:53,437 epoch 4 - iter 1602/1786 - loss 0.06525833 - time (sec): 81.83 - samples/sec: 2730.35 - lr: 0.000034 - momentum: 0.000000
2023-10-25 18:44:02,280 epoch 4 - iter 1780/1786 - loss 0.06417141 - time (sec): 90.68 - samples/sec: 2735.96 - lr: 0.000033 - momentum: 0.000000
2023-10-25 18:44:02,561 ----------------------------------------------------------------------------------------------------
2023-10-25 18:44:02,561 EPOCH 4 done: loss 0.0641 - lr: 0.000033
2023-10-25 18:44:07,999 DEV : loss 0.1622055321931839 - f1-score (micro avg) 0.7949
2023-10-25 18:44:08,021 saving best model
2023-10-25 18:44:08,751 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:44:18,039 epoch 5 - iter 178/1786 - loss 0.05453094 - time (sec): 9.29 - samples/sec: 2544.86 - lr: 0.000033 - momentum: 0.000000
2023-10-25 18:44:27,561 epoch 5 - iter 356/1786 - loss 0.04577426 - time (sec): 18.81 - samples/sec: 2676.28 - lr: 0.000032 - momentum: 0.000000
2023-10-25 18:44:37,038 epoch 5 - iter 534/1786 - loss 0.04807378 - time (sec): 28.28 - samples/sec: 2655.55 - lr: 0.000032 - momentum: 0.000000
2023-10-25 18:44:46,571 epoch 5 - iter 712/1786 - loss 0.04717079 - time (sec): 37.82 - samples/sec: 2653.43 - lr: 0.000031 - momentum: 0.000000
2023-10-25 18:44:56,134 epoch 5 - iter 890/1786 - loss 0.04803019 - time (sec): 47.38 - samples/sec: 2655.16 - lr: 0.000031 - momentum: 0.000000
2023-10-25 18:45:05,721 epoch 5 - iter 1068/1786 - loss 0.04817761 - time (sec): 56.97 - samples/sec: 2618.31 - lr: 0.000030 - momentum: 0.000000
2023-10-25 18:45:15,218 epoch 5 - iter 1246/1786 - loss 0.04776400 - time (sec): 66.46 - samples/sec: 2624.83 - lr: 0.000029 - momentum: 0.000000
2023-10-25 18:45:24,767 epoch 5 - iter 1424/1786 - loss 0.04756135 - time (sec): 76.01 - samples/sec: 2611.00 - lr: 0.000029 - momentum: 0.000000
2023-10-25 18:45:34,208 epoch 5 - iter 1602/1786 - loss 0.04690501 - time (sec): 85.46 - samples/sec: 2596.86 - lr: 0.000028 - momentum: 0.000000
2023-10-25 18:45:43,226 epoch 5 - iter 1780/1786 - loss 0.04701536 - time (sec): 94.47 - samples/sec: 2622.82 - lr: 0.000028 - momentum: 0.000000
2023-10-25 18:45:43,541 ----------------------------------------------------------------------------------------------------
2023-10-25 18:45:43,541 EPOCH 5 done: loss 0.0471 - lr: 0.000028
2023-10-25 18:45:47,551 DEV : loss 0.17839400470256805 - f1-score (micro avg) 0.7641
2023-10-25 18:45:47,571 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:45:57,008 epoch 6 - iter 178/1786 - loss 0.04739676 - time (sec): 9.44 - samples/sec: 2669.74 - lr: 0.000027 - momentum: 0.000000
2023-10-25 18:46:06,535 epoch 6 - iter 356/1786 - loss 0.03785218 - time (sec): 18.96 - samples/sec: 2692.43 - lr: 0.000027 - momentum: 0.000000
2023-10-25 18:46:15,980 epoch 6 - iter 534/1786 - loss 0.03634755 - time (sec): 28.41 - samples/sec: 2618.52 - lr: 0.000026 - momentum: 0.000000
2023-10-25 18:46:25,654 epoch 6 - iter 712/1786 - loss 0.04044514 - time (sec): 38.08 - samples/sec: 2607.08 - lr: 0.000026 - momentum: 0.000000
2023-10-25 18:46:35,381 epoch 6 - iter 890/1786 - loss 0.03838047 - time (sec): 47.81 - samples/sec: 2615.00 - lr: 0.000025 - momentum: 0.000000
2023-10-25 18:46:44,873 epoch 6 - iter 1068/1786 - loss 0.03827037 - time (sec): 57.30 - samples/sec: 2616.02 - lr: 0.000024 - momentum: 0.000000
2023-10-25 18:46:54,684 epoch 6 - iter 1246/1786 - loss 0.03744695 - time (sec): 67.11 - samples/sec: 2607.36 - lr: 0.000024 - momentum: 0.000000
2023-10-25 18:47:03,908 epoch 6 - iter 1424/1786 - loss 0.03713239 - time (sec): 76.34 - samples/sec: 2598.15 - lr: 0.000023 - momentum: 0.000000
2023-10-25 18:47:13,637 epoch 6 - iter 1602/1786 - loss 0.03788438 - time (sec): 86.06 - samples/sec: 2617.86 - lr: 0.000023 - momentum: 0.000000
2023-10-25 18:47:22,668 epoch 6 - iter 1780/1786 - loss 0.03775238 - time (sec): 95.10 - samples/sec: 2609.10 - lr: 0.000022 - momentum: 0.000000
2023-10-25 18:47:22,974 ----------------------------------------------------------------------------------------------------
2023-10-25 18:47:22,975 EPOCH 6 done: loss 0.0377 - lr: 0.000022
2023-10-25 18:47:27,370 DEV : loss 0.17886780202388763 - f1-score (micro avg) 0.7886
2023-10-25 18:47:27,392 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:47:37,024 epoch 7 - iter 178/1786 - loss 0.03077925 - time (sec): 9.63 - samples/sec: 2692.33 - lr: 0.000022 - momentum: 0.000000
2023-10-25 18:47:46,517 epoch 7 - iter 356/1786 - loss 0.02932877 - time (sec): 19.12 - samples/sec: 2574.46 - lr: 0.000021 - momentum: 0.000000
2023-10-25 18:47:55,977 epoch 7 - iter 534/1786 - loss 0.02829002 - time (sec): 28.58 - samples/sec: 2597.22 - lr: 0.000021 - momentum: 0.000000
2023-10-25 18:48:05,423 epoch 7 - iter 712/1786 - loss 0.02973574 - time (sec): 38.03 - samples/sec: 2597.81 - lr: 0.000020 - momentum: 0.000000
2023-10-25 18:48:14,909 epoch 7 - iter 890/1786 - loss 0.02980090 - time (sec): 47.52 - samples/sec: 2639.78 - lr: 0.000019 - momentum: 0.000000
2023-10-25 18:48:24,135 epoch 7 - iter 1068/1786 - loss 0.03007659 - time (sec): 56.74 - samples/sec: 2657.82 - lr: 0.000019 - momentum: 0.000000
2023-10-25 18:48:33,402 epoch 7 - iter 1246/1786 - loss 0.02862806 - time (sec): 66.01 - samples/sec: 2678.20 - lr: 0.000018 - momentum: 0.000000
2023-10-25 18:48:42,731 epoch 7 - iter 1424/1786 - loss 0.02826589 - time (sec): 75.34 - samples/sec: 2639.07 - lr: 0.000018 - momentum: 0.000000
2023-10-25 18:48:51,933 epoch 7 - iter 1602/1786 - loss 0.02797061 - time (sec): 84.54 - samples/sec: 2641.63 - lr: 0.000017 - momentum: 0.000000
2023-10-25 18:49:01,510 epoch 7 - iter 1780/1786 - loss 0.02819106 - time (sec): 94.12 - samples/sec: 2632.82 - lr: 0.000017 - momentum: 0.000000
2023-10-25 18:49:01,831 ----------------------------------------------------------------------------------------------------
2023-10-25 18:49:01,831 EPOCH 7 done: loss 0.0281 - lr: 0.000017
2023-10-25 18:49:06,709 DEV : loss 0.19223909080028534 - f1-score (micro avg) 0.7857
2023-10-25 18:49:06,731 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:49:16,260 epoch 8 - iter 178/1786 - loss 0.02174503 - time (sec): 9.53 - samples/sec: 2599.42 - lr: 0.000016 - momentum: 0.000000
2023-10-25 18:49:25,630 epoch 8 - iter 356/1786 - loss 0.01913432 - time (sec): 18.90 - samples/sec: 2564.11 - lr: 0.000016 - momentum: 0.000000
2023-10-25 18:49:35,140 epoch 8 - iter 534/1786 - loss 0.01746195 - time (sec): 28.41 - samples/sec: 2589.15 - lr: 0.000015 - momentum: 0.000000
2023-10-25 18:49:44,521 epoch 8 - iter 712/1786 - loss 0.01693117 - time (sec): 37.79 - samples/sec: 2623.00 - lr: 0.000014 - momentum: 0.000000
2023-10-25 18:49:53,894 epoch 8 - iter 890/1786 - loss 0.01671270 - time (sec): 47.16 - samples/sec: 2621.55 - lr: 0.000014 - momentum: 0.000000
2023-10-25 18:50:03,220 epoch 8 - iter 1068/1786 - loss 0.01731034 - time (sec): 56.49 - samples/sec: 2602.71 - lr: 0.000013 - momentum: 0.000000
2023-10-25 18:50:12,498 epoch 8 - iter 1246/1786 - loss 0.01886582 - time (sec): 65.77 - samples/sec: 2608.65 - lr: 0.000013 - momentum: 0.000000
2023-10-25 18:50:21,455 epoch 8 - iter 1424/1786 - loss 0.01893327 - time (sec): 74.72 - samples/sec: 2631.11 - lr: 0.000012 - momentum: 0.000000
2023-10-25 18:50:30,440 epoch 8 - iter 1602/1786 - loss 0.01901785 - time (sec): 83.71 - samples/sec: 2666.98 - lr: 0.000012 - momentum: 0.000000
2023-10-25 18:50:39,483 epoch 8 - iter 1780/1786 - loss 0.01903471 - time (sec): 92.75 - samples/sec: 2673.79 - lr: 0.000011 - momentum: 0.000000
2023-10-25 18:50:39,785 ----------------------------------------------------------------------------------------------------
2023-10-25 18:50:39,785 EPOCH 8 done: loss 0.0190 - lr: 0.000011
2023-10-25 18:50:43,730 DEV : loss 0.22295665740966797 - f1-score (micro avg) 0.7927
2023-10-25 18:50:43,754 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:50:53,379 epoch 9 - iter 178/1786 - loss 0.00894777 - time (sec): 9.62 - samples/sec: 2649.14 - lr: 0.000011 - momentum: 0.000000
2023-10-25 18:51:02,883 epoch 9 - iter 356/1786 - loss 0.01120950 - time (sec): 19.13 - samples/sec: 2527.08 - lr: 0.000010 - momentum: 0.000000
2023-10-25 18:51:12,372 epoch 9 - iter 534/1786 - loss 0.01367811 - time (sec): 28.62 - samples/sec: 2533.14 - lr: 0.000009 - momentum: 0.000000
2023-10-25 18:51:21,536 epoch 9 - iter 712/1786 - loss 0.01359022 - time (sec): 37.78 - samples/sec: 2567.40 - lr: 0.000009 - momentum: 0.000000
2023-10-25 18:51:30,710 epoch 9 - iter 890/1786 - loss 0.01350802 - time (sec): 46.95 - samples/sec: 2560.14 - lr: 0.000008 - momentum: 0.000000
2023-10-25 18:51:39,721 epoch 9 - iter 1068/1786 - loss 0.01419765 - time (sec): 55.97 - samples/sec: 2623.59 - lr: 0.000008 - momentum: 0.000000
2023-10-25 18:51:48,814 epoch 9 - iter 1246/1786 - loss 0.01486049 - time (sec): 65.06 - samples/sec: 2629.79 - lr: 0.000007 - momentum: 0.000000
2023-10-25 18:51:57,499 epoch 9 - iter 1424/1786 - loss 0.01410719 - time (sec): 73.74 - samples/sec: 2666.03 - lr: 0.000007 - momentum: 0.000000
2023-10-25 18:52:06,116 epoch 9 - iter 1602/1786 - loss 0.01367858 - time (sec): 82.36 - samples/sec: 2689.33 - lr: 0.000006 - momentum: 0.000000
2023-10-25 18:52:15,069 epoch 9 - iter 1780/1786 - loss 0.01309891 - time (sec): 91.31 - samples/sec: 2717.19 - lr: 0.000006 - momentum: 0.000000
2023-10-25 18:52:15,368 ----------------------------------------------------------------------------------------------------
2023-10-25 18:52:15,369 EPOCH 9 done: loss 0.0132 - lr: 0.000006
2023-10-25 18:52:20,439 DEV : loss 0.22367699444293976 - f1-score (micro avg) 0.7949
2023-10-25 18:52:20,459 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:52:29,298 epoch 10 - iter 178/1786 - loss 0.00880098 - time (sec): 8.84 - samples/sec: 2842.17 - lr: 0.000005 - momentum: 0.000000
2023-10-25 18:52:37,837 epoch 10 - iter 356/1786 - loss 0.00764413 - time (sec): 17.38 - samples/sec: 2803.77 - lr: 0.000004 - momentum: 0.000000
2023-10-25 18:52:46,910 epoch 10 - iter 534/1786 - loss 0.00803735 - time (sec): 26.45 - samples/sec: 2802.88 - lr: 0.000004 - momentum: 0.000000
2023-10-25 18:52:56,296 epoch 10 - iter 712/1786 - loss 0.00807527 - time (sec): 35.84 - samples/sec: 2775.80 - lr: 0.000003 - momentum: 0.000000
2023-10-25 18:53:05,399 epoch 10 - iter 890/1786 - loss 0.00741347 - time (sec): 44.94 - samples/sec: 2772.21 - lr: 0.000003 - momentum: 0.000000
2023-10-25 18:53:14,850 epoch 10 - iter 1068/1786 - loss 0.00783141 - time (sec): 54.39 - samples/sec: 2736.83 - lr: 0.000002 - momentum: 0.000000
2023-10-25 18:53:24,330 epoch 10 - iter 1246/1786 - loss 0.00870824 - time (sec): 63.87 - samples/sec: 2711.41 - lr: 0.000002 - momentum: 0.000000
2023-10-25 18:53:33,732 epoch 10 - iter 1424/1786 - loss 0.00816998 - time (sec): 73.27 - samples/sec: 2716.97 - lr: 0.000001 - momentum: 0.000000
2023-10-25 18:53:43,352 epoch 10 - iter 1602/1786 - loss 0.00820551 - time (sec): 82.89 - samples/sec: 2698.61 - lr: 0.000001 - momentum: 0.000000
2023-10-25 18:53:52,998 epoch 10 - iter 1780/1786 - loss 0.00861252 - time (sec): 92.54 - samples/sec: 2679.83 - lr: 0.000000 - momentum: 0.000000
2023-10-25 18:53:53,317 ----------------------------------------------------------------------------------------------------
2023-10-25 18:53:53,317 EPOCH 10 done: loss 0.0086 - lr: 0.000000
2023-10-25 18:53:58,167 DEV : loss 0.22940541803836823 - f1-score (micro avg) 0.792
2023-10-25 18:53:58,699 ----------------------------------------------------------------------------------------------------
|
2023-10-25 18:53:58,701 Loading model from best epoch ...
2023-10-25 18:54:00,626 SequenceTagger predicts: Dictionary with 17 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-25 18:54:13,345
Results:
- F-score (micro) 0.6526
- F-score (macro) 0.5814
- Accuracy 0.5009

By class:
              precision    recall  f1-score   support

         LOC     0.6415    0.6521    0.6467      1095
         PER     0.7798    0.7312    0.7547      1012
         ORG     0.3913    0.5294    0.4500       357
   HumanProd     0.3594    0.6970    0.4742        33

   micro avg     0.6386    0.6672    0.6526      2497
   macro avg     0.5430    0.6524    0.5814      2497
weighted avg     0.6580    0.6672    0.6601      2497

2023-10-25 18:54:13,345 ----------------------------------------------------------------------------------------------------
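
The best checkpoint saved during this run ("saving best model" above) can be reused for inference. A minimal sketch using Flair's standard loading and prediction API; the path follows the model training base path from this log, and the example sentence is illustrative only:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load the checkpoint written whenever the dev micro-F1 improved.
tagger = SequenceTagger.load(
    "hmbench-newseye/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5/best-model.pt"
)

# Tag a sentence and print the predicted PER/LOC/ORG/HumanProd spans.
sentence = Sentence("Victor Hugo est né à Besançon.")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    print(span)
```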
|
|