2023-10-18 16:38:24,525 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,525 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-18 16:38:24,525 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,525 MultiCorpus: 966 train + 219 dev + 204 test sentences
 - NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-18 16:38:24,525 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,525 Train: 966 sentences
2023-10-18 16:38:24,526 (train_with_dev=False, train_with_test=False)
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 Training Params:
2023-10-18 16:38:24,526 - learning_rate: "3e-05"
2023-10-18 16:38:24,526 - mini_batch_size: "4"
2023-10-18 16:38:24,526 - max_epochs: "10"
2023-10-18 16:38:24,526 - shuffle: "True"
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 Plugins:
2023-10-18 16:38:24,526 - TensorboardLogger
2023-10-18 16:38:24,526 - LinearScheduler | warmup_fraction: '0.1'
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 Final evaluation on model from best epoch (best-model.pt)
2023-10-18 16:38:24,526 - metric: "('micro avg', 'f1-score')"
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 Computation:
2023-10-18 16:38:24,526 - compute on device: cuda:0
2023-10-18 16:38:24,526 - embedding storage: none
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 Model training base path: "hmbench-ajmc/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:24,526 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-18 16:38:24,884 epoch 1 - iter 24/242 - loss 3.35126374 - time (sec): 0.36 - samples/sec: 6765.63 - lr: 0.000003 - momentum: 0.000000
2023-10-18 16:38:25,247 epoch 1 - iter 48/242 - loss 3.31047904 - time (sec): 0.72 - samples/sec: 6553.45 - lr: 0.000006 - momentum: 0.000000
2023-10-18 16:38:25,599 epoch 1 - iter 72/242 - loss 3.26248870 - time (sec): 1.07 - samples/sec: 6667.17 - lr: 0.000009 - momentum: 0.000000
2023-10-18 16:38:25,959 epoch 1 - iter 96/242 - loss 3.14797316 - time (sec): 1.43 - samples/sec: 6576.40 - lr: 0.000012 - momentum: 0.000000
2023-10-18 16:38:26,351 epoch 1 - iter 120/242 - loss 2.99012673 - time (sec): 1.82 - samples/sec: 6774.33 - lr: 0.000015 - momentum: 0.000000
2023-10-18 16:38:26,734 epoch 1 - iter 144/242 - loss 2.81396511 - time (sec): 2.21 - samples/sec: 6888.46 - lr: 0.000018 - momentum: 0.000000
2023-10-18 16:38:27,097 epoch 1 - iter 168/242 - loss 2.64336521 - time (sec): 2.57 - samples/sec: 6846.40 - lr: 0.000021 - momentum: 0.000000
2023-10-18 16:38:27,476 epoch 1 - iter 192/242 - loss 2.45460226 - time (sec): 2.95 - samples/sec: 6850.33 - lr: 0.000024 - momentum: 0.000000
2023-10-18 16:38:27,831 epoch 1 - iter 216/242 - loss 2.29481844 - time (sec): 3.30 - samples/sec: 6783.82 - lr: 0.000027 - momentum: 0.000000
2023-10-18 16:38:28,189 epoch 1 - iter 240/242 - loss 2.16749783 - time (sec): 3.66 - samples/sec: 6722.50 - lr: 0.000030 - momentum: 0.000000
2023-10-18 16:38:28,219 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:28,219 EPOCH 1 done: loss 2.1607 - lr: 0.000030
2023-10-18 16:38:28,602 DEV : loss 0.6777684092521667 - f1-score (micro avg)  0.0
2023-10-18 16:38:28,606 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:28,967 epoch 2 - iter 24/242 - loss 0.72505823 - time (sec): 0.36 - samples/sec: 6433.79 - lr: 0.000030 - momentum: 0.000000
2023-10-18 16:38:29,326 epoch 2 - iter 48/242 - loss 0.72322092 - time (sec): 0.72 - samples/sec: 6766.25 - lr: 0.000029 - momentum: 0.000000
2023-10-18 16:38:29,697 epoch 2 - iter 72/242 - loss 0.69582817 - time (sec): 1.09 - samples/sec: 6837.35 - lr: 0.000029 - momentum: 0.000000
2023-10-18 16:38:30,067 epoch 2 - iter 96/242 - loss 0.72312025 - time (sec): 1.46 - samples/sec: 6853.36 - lr: 0.000029 - momentum: 0.000000
2023-10-18 16:38:30,435 epoch 2 - iter 120/242 - loss 0.72195112 - time (sec): 1.83 - samples/sec: 6864.77 - lr: 0.000028 - momentum: 0.000000
2023-10-18 16:38:30,789 epoch 2 - iter 144/242 - loss 0.70636652 - time (sec): 2.18 - samples/sec: 6897.08 - lr: 0.000028 - momentum: 0.000000
2023-10-18 16:38:31,165 epoch 2 - iter 168/242 - loss 0.68918084 - time (sec): 2.56 - samples/sec: 6810.98 - lr: 0.000028 - momentum: 0.000000
2023-10-18 16:38:31,540 epoch 2 - iter 192/242 - loss 0.67373862 - time (sec): 2.93 - samples/sec: 6690.09 - lr: 0.000027 - momentum: 0.000000
2023-10-18 16:38:31,911 epoch 2 - iter 216/242 - loss 0.66394003 - time (sec): 3.30 - samples/sec: 6674.49 - lr: 0.000027 - momentum: 0.000000
2023-10-18 16:38:32,287 epoch 2 - iter 240/242 - loss 0.66079545 - time (sec): 3.68 - samples/sec: 6688.51 - lr: 0.000027 - momentum: 0.000000
2023-10-18 16:38:32,312 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:32,312 EPOCH 2 done: loss 0.6610 - lr: 0.000027
2023-10-18 16:38:32,742 DEV : loss 0.49618253111839294 - f1-score (micro avg)  0.0495
2023-10-18 16:38:32,748 saving best model
2023-10-18 16:38:32,777 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:33,199 epoch 3 - iter 24/242 - loss 0.55177418 - time (sec): 0.42 - samples/sec: 6274.20 - lr: 0.000026 - momentum: 0.000000
2023-10-18 16:38:33,625 epoch 3 - iter 48/242 - loss 0.56260011 - time (sec): 0.85 - samples/sec: 5982.17 - lr: 0.000026 - momentum: 0.000000
2023-10-18 16:38:34,049 epoch 3 - iter 72/242 - loss 0.58426111 - time (sec): 1.27 - samples/sec: 6167.74 - lr: 0.000026 - momentum: 0.000000
2023-10-18 16:38:34,430 epoch 3 - iter 96/242 - loss 0.58380777 - time (sec): 1.65 - samples/sec: 6025.95 - lr: 0.000025 - momentum: 0.000000
2023-10-18 16:38:34,801 epoch 3 - iter 120/242 - loss 0.58477258 - time (sec): 2.02 - samples/sec: 6067.56 - lr: 0.000025 - momentum: 0.000000
2023-10-18 16:38:35,180 epoch 3 - iter 144/242 - loss 0.56738685 - time (sec): 2.40 - samples/sec: 6191.59 - lr: 0.000025 - momentum: 0.000000
2023-10-18 16:38:35,538 epoch 3 - iter 168/242 - loss 0.56615814 - time (sec): 2.76 - samples/sec: 6265.47 - lr: 0.000024 - momentum: 0.000000
2023-10-18 16:38:35,913 epoch 3 - iter 192/242 - loss 0.53753547 - time (sec): 3.14 - samples/sec: 6363.20 - lr: 0.000024 - momentum: 0.000000
2023-10-18 16:38:36,296 epoch 3 - iter 216/242 - loss 0.52927563 - time (sec): 3.52 - samples/sec: 6335.26 - lr: 0.000024 - momentum: 0.000000
2023-10-18 16:38:36,662 epoch 3 - iter 240/242 - loss 0.52255138 - time (sec): 3.88 - samples/sec: 6348.60 - lr: 0.000023 - momentum: 0.000000
2023-10-18 16:38:36,688 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:36,688 EPOCH 3 done: loss 0.5233 - lr: 0.000023
2023-10-18 16:38:37,107 DEV : loss 0.4014001190662384 - f1-score (micro avg)  0.4295
2023-10-18 16:38:37,112 saving best model
2023-10-18 16:38:37,145 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:37,537 epoch 4 - iter 24/242 - loss 0.59597425 - time (sec): 0.39 - samples/sec: 6593.50 - lr: 0.000023 - momentum: 0.000000
2023-10-18 16:38:37,922 epoch 4 - iter 48/242 - loss 0.50289897 - time (sec): 0.78 - samples/sec: 6712.54 - lr: 0.000023 - momentum: 0.000000
2023-10-18 16:38:38,319 epoch 4 - iter 72/242 - loss 0.49797384 - time (sec): 1.17 - samples/sec: 6518.76 - lr: 0.000022 - momentum: 0.000000
2023-10-18 16:38:38,686 epoch 4 - iter 96/242 - loss 0.47126303 - time (sec): 1.54 - samples/sec: 6375.18 - lr: 0.000022 - momentum: 0.000000
2023-10-18 16:38:39,078 epoch 4 - iter 120/242 - loss 0.46859855 - time (sec): 1.93 - samples/sec: 6440.85 - lr: 0.000022 - momentum: 0.000000
2023-10-18 16:38:39,444 epoch 4 - iter 144/242 - loss 0.45417748 - time (sec): 2.30 - samples/sec: 6436.19 - lr: 0.000021 - momentum: 0.000000
2023-10-18 16:38:39,814 epoch 4 - iter 168/242 - loss 0.44506682 - time (sec): 2.67 - samples/sec: 6384.32 - lr: 0.000021 - momentum: 0.000000
2023-10-18 16:38:40,182 epoch 4 - iter 192/242 - loss 0.44527360 - time (sec): 3.04 - samples/sec: 6383.45 - lr: 0.000021 - momentum: 0.000000
2023-10-18 16:38:40,558 epoch 4 - iter 216/242 - loss 0.44824678 - time (sec): 3.41 - samples/sec: 6454.56 - lr: 0.000020 - momentum: 0.000000
2023-10-18 16:38:40,931 epoch 4 - iter 240/242 - loss 0.44635771 - time (sec): 3.78 - samples/sec: 6505.73 - lr: 0.000020 - momentum: 0.000000
2023-10-18 16:38:40,956 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:40,956 EPOCH 4 done: loss 0.4451 - lr: 0.000020
2023-10-18 16:38:41,378 DEV : loss 0.3599375784397125 - f1-score (micro avg)  0.4697
2023-10-18 16:38:41,383 saving best model
2023-10-18 16:38:41,416 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:41,793 epoch 5 - iter 24/242 - loss 0.41121412 - time (sec): 0.38 - samples/sec: 6822.90 - lr: 0.000020 - momentum: 0.000000
2023-10-18 16:38:42,173 epoch 5 - iter 48/242 - loss 0.41252866 - time (sec): 0.76 - samples/sec: 6853.59 - lr: 0.000019 - momentum: 0.000000
2023-10-18 16:38:42,542 epoch 5 - iter 72/242 - loss 0.39242649 - time (sec): 1.12 - samples/sec: 6664.38 - lr: 0.000019 - momentum: 0.000000
2023-10-18 16:38:42,915 epoch 5 - iter 96/242 - loss 0.40193281 - time (sec): 1.50 - samples/sec: 6589.78 - lr: 0.000019 - momentum: 0.000000
2023-10-18 16:38:43,292 epoch 5 - iter 120/242 - loss 0.41923305 - time (sec): 1.87 - samples/sec: 6651.81 - lr: 0.000018 - momentum: 0.000000
2023-10-18 16:38:43,662 epoch 5 - iter 144/242 - loss 0.41653879 - time (sec): 2.25 - samples/sec: 6573.59 - lr: 0.000018 - momentum: 0.000000
2023-10-18 16:38:44,032 epoch 5 - iter 168/242 - loss 0.41609853 - time (sec): 2.62 - samples/sec: 6599.92 - lr: 0.000018 - momentum: 0.000000
2023-10-18 16:38:44,405 epoch 5 - iter 192/242 - loss 0.41237405 - time (sec): 2.99 - samples/sec: 6527.73 - lr: 0.000017 - momentum: 0.000000
2023-10-18 16:38:44,774 epoch 5 - iter 216/242 - loss 0.40993176 - time (sec): 3.36 - samples/sec: 6546.29 - lr: 0.000017 - momentum: 0.000000
2023-10-18 16:38:45,141 epoch 5 - iter 240/242 - loss 0.40875524 - time (sec): 3.72 - samples/sec: 6591.33 - lr: 0.000017 - momentum: 0.000000
2023-10-18 16:38:45,169 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:45,170 EPOCH 5 done: loss 0.4074 - lr: 0.000017
2023-10-18 16:38:45,596 DEV : loss 0.31972187757492065 - f1-score (micro avg)  0.4691
2023-10-18 16:38:45,600 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:45,959 epoch 6 - iter 24/242 - loss 0.43297691 - time (sec): 0.36 - samples/sec: 5599.64 - lr: 0.000016 - momentum: 0.000000
2023-10-18 16:38:46,337 epoch 6 - iter 48/242 - loss 0.42272237 - time (sec): 0.74 - samples/sec: 6003.79 - lr: 0.000016 - momentum: 0.000000
2023-10-18 16:38:46,710 epoch 6 - iter 72/242 - loss 0.39624650 - time (sec): 1.11 - samples/sec: 6225.39 - lr: 0.000016 - momentum: 0.000000
2023-10-18 16:38:47,108 epoch 6 - iter 96/242 - loss 0.37477953 - time (sec): 1.51 - samples/sec: 6336.24 - lr: 0.000015 - momentum: 0.000000
2023-10-18 16:38:47,479 epoch 6 - iter 120/242 - loss 0.39971700 - time (sec): 1.88 - samples/sec: 6470.88 - lr: 0.000015 - momentum: 0.000000
2023-10-18 16:38:47,863 epoch 6 - iter 144/242 - loss 0.40296553 - time (sec): 2.26 - samples/sec: 6436.84 - lr: 0.000015 - momentum: 0.000000
2023-10-18 16:38:48,247 epoch 6 - iter 168/242 - loss 0.40258962 - time (sec): 2.65 - samples/sec: 6438.95 - lr: 0.000014 - momentum: 0.000000
2023-10-18 16:38:48,635 epoch 6 - iter 192/242 - loss 0.39503026 - time (sec): 3.03 - samples/sec: 6501.61 - lr: 0.000014 - momentum: 0.000000
2023-10-18 16:38:49,006 epoch 6 - iter 216/242 - loss 0.38761649 - time (sec): 3.41 - samples/sec: 6483.88 - lr: 0.000014 - momentum: 0.000000
2023-10-18 16:38:49,372 epoch 6 - iter 240/242 - loss 0.38737518 - time (sec): 3.77 - samples/sec: 6523.07 - lr: 0.000013 - momentum: 0.000000
2023-10-18 16:38:49,402 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:49,402 EPOCH 6 done: loss 0.3859 - lr: 0.000013
2023-10-18 16:38:49,838 DEV : loss 0.3061554729938507 - f1-score (micro avg)  0.4675
2023-10-18 16:38:49,843 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:50,208 epoch 7 - iter 24/242 - loss 0.37232522 - time (sec): 0.36 - samples/sec: 6043.85 - lr: 0.000013 - momentum: 0.000000
2023-10-18 16:38:50,576 epoch 7 - iter 48/242 - loss 0.34670638 - time (sec): 0.73 - samples/sec: 6194.59 - lr: 0.000013 - momentum: 0.000000
2023-10-18 16:38:50,955 epoch 7 - iter 72/242 - loss 0.35643935 - time (sec): 1.11 - samples/sec: 6332.64 - lr: 0.000012 - momentum: 0.000000
2023-10-18 16:38:51,343 epoch 7 - iter 96/242 - loss 0.34505139 - time (sec): 1.50 - samples/sec: 6448.29 - lr: 0.000012 - momentum: 0.000000
2023-10-18 16:38:51,728 epoch 7 - iter 120/242 - loss 0.34860073 - time (sec): 1.88 - samples/sec: 6357.43 - lr: 0.000012 - momentum: 0.000000
2023-10-18 16:38:52,073 epoch 7 - iter 144/242 - loss 0.34249545 - time (sec): 2.23 - samples/sec: 6513.96 - lr: 0.000011 - momentum: 0.000000
2023-10-18 16:38:52,389 epoch 7 - iter 168/242 - loss 0.33903240 - time (sec): 2.55 - samples/sec: 6619.48 - lr: 0.000011 - momentum: 0.000000
2023-10-18 16:38:52,728 epoch 7 - iter 192/242 - loss 0.33492442 - time (sec): 2.88 - samples/sec: 6711.55 - lr: 0.000011 - momentum: 0.000000
2023-10-18 16:38:53,117 epoch 7 - iter 216/242 - loss 0.34168901 - time (sec): 3.27 - samples/sec: 6759.54 - lr: 0.000010 - momentum: 0.000000
2023-10-18 16:38:53,505 epoch 7 - iter 240/242 - loss 0.35207099 - time (sec): 3.66 - samples/sec: 6728.77 - lr: 0.000010 - momentum: 0.000000
2023-10-18 16:38:53,532 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:53,532 EPOCH 7 done: loss 0.3516 - lr: 0.000010
2023-10-18 16:38:54,038 DEV : loss 0.2853192090988159 - f1-score (micro avg)  0.4732
2023-10-18 16:38:54,045 saving best model
2023-10-18 16:38:54,087 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:54,455 epoch 8 - iter 24/242 - loss 0.39458741 - time (sec): 0.37 - samples/sec: 7104.32 - lr: 0.000010 - momentum: 0.000000
2023-10-18 16:38:54,766 epoch 8 - iter 48/242 - loss 0.38229970 - time (sec): 0.68 - samples/sec: 7009.39 - lr: 0.000009 - momentum: 0.000000
2023-10-18 16:38:55,061 epoch 8 - iter 72/242 - loss 0.36407237 - time (sec): 0.97 - samples/sec: 7455.99 - lr: 0.000009 - momentum: 0.000000
2023-10-18 16:38:55,383 epoch 8 - iter 96/242 - loss 0.36252774 - time (sec): 1.30 - samples/sec: 7317.55 - lr: 0.000009 - momentum: 0.000000
2023-10-18 16:38:55,766 epoch 8 - iter 120/242 - loss 0.35725645 - time (sec): 1.68 - samples/sec: 6985.47 - lr: 0.000008 - momentum: 0.000000
2023-10-18 16:38:56,159 epoch 8 - iter 144/242 - loss 0.36044754 - time (sec): 2.07 - samples/sec: 6984.32 - lr: 0.000008 - momentum: 0.000000
2023-10-18 16:38:56,546 epoch 8 - iter 168/242 - loss 0.35849973 - time (sec): 2.46 - samples/sec: 6962.73 - lr: 0.000008 - momentum: 0.000000
2023-10-18 16:38:56,909 epoch 8 - iter 192/242 - loss 0.35091677 - time (sec): 2.82 - samples/sec: 6877.49 - lr: 0.000007 - momentum: 0.000000
2023-10-18 16:38:57,277 epoch 8 - iter 216/242 - loss 0.34416714 - time (sec): 3.19 - samples/sec: 6851.10 - lr: 0.000007 - momentum: 0.000000
2023-10-18 16:38:57,671 epoch 8 - iter 240/242 - loss 0.35298905 - time (sec): 3.58 - samples/sec: 6876.09 - lr: 0.000007 - momentum: 0.000000
2023-10-18 16:38:57,698 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:57,698 EPOCH 8 done: loss 0.3526 - lr: 0.000007
2023-10-18 16:38:58,135 DEV : loss 0.290330708026886 - f1-score (micro avg)  0.4788
2023-10-18 16:38:58,140 saving best model
2023-10-18 16:38:58,175 ----------------------------------------------------------------------------------------------------
2023-10-18 16:38:58,550 epoch 9 - iter 24/242 - loss 0.26842047 - time (sec): 0.37 - samples/sec: 6509.42 - lr: 0.000006 - momentum: 0.000000
2023-10-18 16:38:58,927 epoch 9 - iter 48/242 - loss 0.28082515 - time (sec): 0.75 - samples/sec: 6674.88 - lr: 0.000006 - momentum: 0.000000
2023-10-18 16:38:59,321 epoch 9 - iter 72/242 - loss 0.31753212 - time (sec): 1.14 - samples/sec: 6767.22 - lr: 0.000006 - momentum: 0.000000
2023-10-18 16:38:59,693 epoch 9 - iter 96/242 - loss 0.34317593 - time (sec): 1.52 - samples/sec: 6633.11 - lr: 0.000005 - momentum: 0.000000
2023-10-18 16:39:00,082 epoch 9 - iter 120/242 - loss 0.33649528 - time (sec): 1.91 - samples/sec: 6632.01 - lr: 0.000005 - momentum: 0.000000
2023-10-18 16:39:00,461 epoch 9 - iter 144/242 - loss 0.34597919 - time (sec): 2.29 - samples/sec: 6641.47 - lr: 0.000005 - momentum: 0.000000
2023-10-18 16:39:00,832 epoch 9 - iter 168/242 - loss 0.34031526 - time (sec): 2.66 - samples/sec: 6599.02 - lr: 0.000004 - momentum: 0.000000
2023-10-18 16:39:01,219 epoch 9 - iter 192/242 - loss 0.34914127 - time (sec): 3.04 - samples/sec: 6514.33 - lr: 0.000004 - momentum: 0.000000
2023-10-18 16:39:01,612 epoch 9 - iter 216/242 - loss 0.34255318 - time (sec): 3.44 - samples/sec: 6490.47 - lr: 0.000004 - momentum: 0.000000
2023-10-18 16:39:01,980 epoch 9 - iter 240/242 - loss 0.34012786 - time (sec): 3.80 - samples/sec: 6468.92 - lr: 0.000003 - momentum: 0.000000
2023-10-18 16:39:02,008 ----------------------------------------------------------------------------------------------------
2023-10-18 16:39:02,008 EPOCH 9 done: loss 0.3404 - lr: 0.000003
2023-10-18 16:39:02,443 DEV : loss 0.2772204279899597 - f1-score (micro avg)  0.4894
2023-10-18 16:39:02,447 saving best model
2023-10-18 16:39:02,484 ----------------------------------------------------------------------------------------------------
2023-10-18 16:39:02,859 epoch 10 - iter 24/242 - loss 0.25690412 - time (sec): 0.37 - samples/sec: 6386.28 - lr: 0.000003 - momentum: 0.000000
2023-10-18 16:39:03,226 epoch 10 - iter 48/242 - loss 0.29962382 - time (sec): 0.74 - samples/sec: 6449.68 - lr: 0.000003 - momentum: 0.000000
2023-10-18 16:39:03,588 epoch 10 - iter 72/242 - loss 0.31711743 - time (sec): 1.10 - samples/sec: 6600.08 - lr: 0.000002 - momentum: 0.000000
2023-10-18 16:39:03,976 epoch 10 - iter 96/242 - loss 0.32832182 - time (sec): 1.49 - samples/sec: 6491.54 - lr: 0.000002 - momentum: 0.000000
2023-10-18 16:39:04,343 epoch 10 - iter 120/242 - loss 0.34138987 - time (sec): 1.86 - samples/sec: 6451.83 - lr: 0.000002 - momentum: 0.000000
2023-10-18 16:39:04,694 epoch 10 - iter 144/242 - loss 0.34136384 - time (sec): 2.21 - samples/sec: 6419.23 - lr: 0.000001 - momentum: 0.000000
2023-10-18 16:39:05,039 epoch 10 - iter 168/242 - loss 0.33997769 - time (sec): 2.55 - samples/sec: 6610.63 - lr: 0.000001 - momentum: 0.000000
2023-10-18 16:39:05,398 epoch 10 - iter 192/242 - loss 0.33163190 - time (sec): 2.91 - samples/sec: 6665.11 - lr: 0.000001 - momentum: 0.000000
2023-10-18 16:39:05,781 epoch 10 - iter 216/242 - loss 0.33569214 - time (sec): 3.30 - samples/sec: 6637.06 - lr: 0.000000 - momentum: 0.000000
2023-10-18 16:39:06,160 epoch 10 - iter 240/242 - loss 0.33344942 - time (sec): 3.68 - samples/sec: 6676.39 - lr: 0.000000 - momentum: 0.000000
2023-10-18 16:39:06,192 ----------------------------------------------------------------------------------------------------
2023-10-18 16:39:06,192 EPOCH 10 done: loss 0.3351 - lr: 0.000000
2023-10-18 16:39:06,623 DEV : loss 0.2765982151031494 - f1-score (micro avg)  0.4944
2023-10-18 16:39:06,628 saving best model
2023-10-18 16:39:06,692 ----------------------------------------------------------------------------------------------------
2023-10-18 16:39:06,692 Loading model from best epoch ...
2023-10-18 16:39:06,777 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-18 16:39:07,201
Results:
- F-score (micro) 0.4472
- F-score (macro) 0.2097
- Accuracy 0.3044

By class:
              precision    recall  f1-score   support

        pers     0.5137    0.6763    0.5839       139
       scope     0.3723    0.5426    0.4416       129
        work     0.1429    0.0125    0.0230        80
         loc     0.0000    0.0000    0.0000         9
        date     0.0000    0.0000    0.0000         3

   micro avg     0.4365    0.4583    0.4472       360
   macro avg     0.2058    0.2463    0.2097       360
weighted avg     0.3635    0.4583    0.3888       360

2023-10-18 16:39:07,201 ----------------------------------------------------------------------------------------------------