2023-10-24 13:12:30,385 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,386 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (1): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (2): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (3): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (4): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (5): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (6): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (7): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (8): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (9): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (10): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (11): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-24 13:12:30,386 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,386 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
- NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
2023-10-24 13:12:30,386 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,386 Train: 5901 sentences
2023-10-24 13:12:30,386 (train_with_dev=False, train_with_test=False)
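(In Flair, train_with_dev=True would fold the development split into the training data; with both flags False, the 1287 dev sentences are held out for the per-epoch DEV evaluations and best-model selection below, and the 1505 test sentences for the final evaluation.)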
2023-10-24 13:12:30,386 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,386 Training Params:
2023-10-24 13:12:30,386 - learning_rate: "3e-05"
2023-10-24 13:12:30,386 - mini_batch_size: "8"
2023-10-24 13:12:30,386 - max_epochs: "10"
2023-10-24 13:12:30,386 - shuffle: "True"
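These four parameters map directly onto Flair's ModelTrainer.fine_tune() call; a sketch, assuming corpus and tagger are the objects from the earlier sketch (fine_tune() also sets up AdamW and the linear warmup/decay schedule reported under Plugins below):

    from flair.trainers import ModelTrainer

    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased"
        "-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5",  # base path, see below
        learning_rate=3e-5,
        mini_batch_size=8,
        max_epochs=10,
        shuffle=True,
    )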
2023-10-24 13:12:30,386 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,386 Plugins:
2023-10-24 13:12:30,387 - TensorboardLogger
2023-10-24 13:12:30,387 - LinearScheduler | warmup_fraction: '0.1'
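The LinearScheduler raises the learning rate from 0 to the 3e-05 peak over the first 10% of all mini-batches (here 738 of 7380 steps, i.e. exactly epoch 1) and then decays it linearly to 0, which matches the per-iteration lr trace below. For illustration only, the schedule in plain Python:

    def linear_schedule_lr(step, total_steps=7380, peak_lr=3e-5, warmup_fraction=0.1):
        """Learning rate after `step` mini-batches (linear warmup, then linear decay)."""
        warmup_steps = int(total_steps * warmup_fraction)
        if step < warmup_steps:
            return peak_lr * step / warmup_steps
        return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

    print(linear_schedule_lr(730))   # ~2.97e-05, cf. lr 0.000030 at epoch 1, iter 730
    print(linear_schedule_lr(7380))  # 0.0, cf. lr 0.000000 at the end of epoch 10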
2023-10-24 13:12:30,387 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,387 Final evaluation on model from best epoch (best-model.pt)
2023-10-24 13:12:30,387 - metric: "('micro avg', 'f1-score')"
2023-10-24 13:12:30,387 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,387 Computation:
2023-10-24 13:12:30,387 - compute on device: cuda:0
2023-10-24 13:12:30,387 - embedding storage: none
2023-10-24 13:12:30,387 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,387 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5"
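The directory name encodes the run configuration: bs8 (mini-batch size 8), e10 (10 epochs), lr3e-05 (learning rate), poolingfirst and layers-1 (the embedding settings), crfFalse (no CRF head); wsFalse and the trailing -5 are presumably a script flag and a run or seed index.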
2023-10-24 13:12:30,387 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,387 ----------------------------------------------------------------------------------------------------
2023-10-24 13:12:30,387 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-24 13:12:37,067 epoch 1 - iter 73/738 - loss 2.18161987 - time (sec): 6.68 - samples/sec: 2354.92 - lr: 0.000003 - momentum: 0.000000
2023-10-24 13:12:44,149 epoch 1 - iter 146/738 - loss 1.42985226 - time (sec): 13.76 - samples/sec: 2283.70 - lr: 0.000006 - momentum: 0.000000
2023-10-24 13:12:50,831 epoch 1 - iter 219/738 - loss 1.10599054 - time (sec): 20.44 - samples/sec: 2291.46 - lr: 0.000009 - momentum: 0.000000
2023-10-24 13:12:57,060 epoch 1 - iter 292/738 - loss 0.91769116 - time (sec): 26.67 - samples/sec: 2322.65 - lr: 0.000012 - momentum: 0.000000
2023-10-24 13:13:05,191 epoch 1 - iter 365/738 - loss 0.77452000 - time (sec): 34.80 - samples/sec: 2327.75 - lr: 0.000015 - momentum: 0.000000
2023-10-24 13:13:11,963 epoch 1 - iter 438/738 - loss 0.68331020 - time (sec): 41.58 - samples/sec: 2359.17 - lr: 0.000018 - momentum: 0.000000
2023-10-24 13:13:19,168 epoch 1 - iter 511/738 - loss 0.60818039 - time (sec): 48.78 - samples/sec: 2365.23 - lr: 0.000021 - momentum: 0.000000
2023-10-24 13:13:26,029 epoch 1 - iter 584/738 - loss 0.55818973 - time (sec): 55.64 - samples/sec: 2359.43 - lr: 0.000024 - momentum: 0.000000
2023-10-24 13:13:33,467 epoch 1 - iter 657/738 - loss 0.51160952 - time (sec): 63.08 - samples/sec: 2353.91 - lr: 0.000027 - momentum: 0.000000
2023-10-24 13:13:39,918 epoch 1 - iter 730/738 - loss 0.47724150 - time (sec): 69.53 - samples/sec: 2357.48 - lr: 0.000030 - momentum: 0.000000
2023-10-24 13:13:40,938 ----------------------------------------------------------------------------------------------------
2023-10-24 13:13:40,938 EPOCH 1 done: loss 0.4728 - lr: 0.000030
2023-10-24 13:13:47,154 DEV : loss 0.10554392635822296 - f1-score (micro avg) 0.7528
2023-10-24 13:13:47,175 saving best model
2023-10-24 13:13:47,725 ----------------------------------------------------------------------------------------------------
2023-10-24 13:13:54,266 epoch 2 - iter 73/738 - loss 0.13369869 - time (sec): 6.54 - samples/sec: 2400.65 - lr: 0.000030 - momentum: 0.000000
2023-10-24 13:14:01,134 epoch 2 - iter 146/738 - loss 0.13182120 - time (sec): 13.41 - samples/sec: 2353.28 - lr: 0.000029 - momentum: 0.000000
2023-10-24 13:14:07,956 epoch 2 - iter 219/738 - loss 0.13189987 - time (sec): 20.23 - samples/sec: 2362.32 - lr: 0.000029 - momentum: 0.000000
2023-10-24 13:14:14,726 epoch 2 - iter 292/738 - loss 0.12551263 - time (sec): 27.00 - samples/sec: 2339.53 - lr: 0.000029 - momentum: 0.000000
2023-10-24 13:14:21,473 epoch 2 - iter 365/738 - loss 0.12297520 - time (sec): 33.75 - samples/sec: 2346.93 - lr: 0.000028 - momentum: 0.000000
2023-10-24 13:14:28,306 epoch 2 - iter 438/738 - loss 0.12082661 - time (sec): 40.58 - samples/sec: 2341.88 - lr: 0.000028 - momentum: 0.000000
2023-10-24 13:14:35,634 epoch 2 - iter 511/738 - loss 0.12078839 - time (sec): 47.91 - samples/sec: 2359.93 - lr: 0.000028 - momentum: 0.000000
2023-10-24 13:14:43,365 epoch 2 - iter 584/738 - loss 0.11688290 - time (sec): 55.64 - samples/sec: 2357.67 - lr: 0.000027 - momentum: 0.000000
2023-10-24 13:14:50,105 epoch 2 - iter 657/738 - loss 0.11634259 - time (sec): 62.38 - samples/sec: 2356.32 - lr: 0.000027 - momentum: 0.000000
2023-10-24 13:14:57,762 epoch 2 - iter 730/738 - loss 0.11522082 - time (sec): 70.04 - samples/sec: 2349.93 - lr: 0.000027 - momentum: 0.000000
2023-10-24 13:14:58,512 ----------------------------------------------------------------------------------------------------
2023-10-24 13:14:58,512 EPOCH 2 done: loss 0.1151 - lr: 0.000027
2023-10-24 13:15:07,005 DEV : loss 0.09679369628429413 - f1-score (micro avg) 0.7871
2023-10-24 13:15:07,026 saving best model
2023-10-24 13:15:07,769 ----------------------------------------------------------------------------------------------------
2023-10-24 13:15:13,874 epoch 3 - iter 73/738 - loss 0.06119096 - time (sec): 6.10 - samples/sec: 2527.55 - lr: 0.000026 - momentum: 0.000000
2023-10-24 13:15:21,101 epoch 3 - iter 146/738 - loss 0.06385970 - time (sec): 13.33 - samples/sec: 2408.60 - lr: 0.000026 - momentum: 0.000000
2023-10-24 13:15:28,695 epoch 3 - iter 219/738 - loss 0.06608306 - time (sec): 20.93 - samples/sec: 2350.06 - lr: 0.000026 - momentum: 0.000000
2023-10-24 13:15:36,070 epoch 3 - iter 292/738 - loss 0.06260299 - time (sec): 28.30 - samples/sec: 2348.85 - lr: 0.000025 - momentum: 0.000000
2023-10-24 13:15:43,242 epoch 3 - iter 365/738 - loss 0.06272309 - time (sec): 35.47 - samples/sec: 2337.28 - lr: 0.000025 - momentum: 0.000000
2023-10-24 13:15:50,403 epoch 3 - iter 438/738 - loss 0.06424382 - time (sec): 42.63 - samples/sec: 2337.43 - lr: 0.000025 - momentum: 0.000000
2023-10-24 13:15:57,263 epoch 3 - iter 511/738 - loss 0.06412265 - time (sec): 49.49 - samples/sec: 2339.66 - lr: 0.000024 - momentum: 0.000000
2023-10-24 13:16:03,619 epoch 3 - iter 584/738 - loss 0.06489306 - time (sec): 55.85 - samples/sec: 2349.80 - lr: 0.000024 - momentum: 0.000000
2023-10-24 13:16:10,250 epoch 3 - iter 657/738 - loss 0.06455833 - time (sec): 62.48 - samples/sec: 2347.50 - lr: 0.000024 - momentum: 0.000000
2023-10-24 13:16:17,422 epoch 3 - iter 730/738 - loss 0.06554966 - time (sec): 69.65 - samples/sec: 2355.53 - lr: 0.000023 - momentum: 0.000000
2023-10-24 13:16:18,577 ----------------------------------------------------------------------------------------------------
2023-10-24 13:16:18,577 EPOCH 3 done: loss 0.0656 - lr: 0.000023
2023-10-24 13:16:27,094 DEV : loss 0.1195509284734726 - f1-score (micro avg) 0.8074
2023-10-24 13:16:27,115 saving best model
2023-10-24 13:16:27,857 ----------------------------------------------------------------------------------------------------
2023-10-24 13:16:34,341 epoch 4 - iter 73/738 - loss 0.03826326 - time (sec): 6.48 - samples/sec: 2323.82 - lr: 0.000023 - momentum: 0.000000
2023-10-24 13:16:40,745 epoch 4 - iter 146/738 - loss 0.03920578 - time (sec): 12.89 - samples/sec: 2352.56 - lr: 0.000023 - momentum: 0.000000
2023-10-24 13:16:47,380 epoch 4 - iter 219/738 - loss 0.04269634 - time (sec): 19.52 - samples/sec: 2347.37 - lr: 0.000022 - momentum: 0.000000
2023-10-24 13:16:53,782 epoch 4 - iter 292/738 - loss 0.03941502 - time (sec): 25.92 - samples/sec: 2350.87 - lr: 0.000022 - momentum: 0.000000
2023-10-24 13:17:01,518 epoch 4 - iter 365/738 - loss 0.04336450 - time (sec): 33.66 - samples/sec: 2339.39 - lr: 0.000022 - momentum: 0.000000
2023-10-24 13:17:09,372 epoch 4 - iter 438/738 - loss 0.04463978 - time (sec): 41.51 - samples/sec: 2330.02 - lr: 0.000021 - momentum: 0.000000
2023-10-24 13:17:17,052 epoch 4 - iter 511/738 - loss 0.04335848 - time (sec): 49.19 - samples/sec: 2333.73 - lr: 0.000021 - momentum: 0.000000
2023-10-24 13:17:24,654 epoch 4 - iter 584/738 - loss 0.04416947 - time (sec): 56.80 - samples/sec: 2342.98 - lr: 0.000021 - momentum: 0.000000
2023-10-24 13:17:31,897 epoch 4 - iter 657/738 - loss 0.04422596 - time (sec): 64.04 - samples/sec: 2338.84 - lr: 0.000020 - momentum: 0.000000
2023-10-24 13:17:38,254 epoch 4 - iter 730/738 - loss 0.04340328 - time (sec): 70.40 - samples/sec: 2341.64 - lr: 0.000020 - momentum: 0.000000
2023-10-24 13:17:38,892 ----------------------------------------------------------------------------------------------------
2023-10-24 13:17:38,893 EPOCH 4 done: loss 0.0434 - lr: 0.000020
2023-10-24 13:17:47,395 DEV : loss 0.14306315779685974 - f1-score (micro avg) 0.8255
2023-10-24 13:17:47,416 saving best model
2023-10-24 13:17:48,112 ----------------------------------------------------------------------------------------------------
2023-10-24 13:17:54,842 epoch 5 - iter 73/738 - loss 0.03252486 - time (sec): 6.73 - samples/sec: 2414.59 - lr: 0.000020 - momentum: 0.000000
2023-10-24 13:18:02,107 epoch 5 - iter 146/738 - loss 0.02695551 - time (sec): 13.99 - samples/sec: 2423.52 - lr: 0.000019 - momentum: 0.000000
2023-10-24 13:18:09,085 epoch 5 - iter 219/738 - loss 0.02469070 - time (sec): 20.97 - samples/sec: 2355.55 - lr: 0.000019 - momentum: 0.000000
2023-10-24 13:18:15,961 epoch 5 - iter 292/738 - loss 0.02799215 - time (sec): 27.85 - samples/sec: 2360.35 - lr: 0.000019 - momentum: 0.000000
2023-10-24 13:18:23,556 epoch 5 - iter 365/738 - loss 0.03089862 - time (sec): 35.44 - samples/sec: 2369.17 - lr: 0.000018 - momentum: 0.000000
2023-10-24 13:18:30,272 epoch 5 - iter 438/738 - loss 0.02992995 - time (sec): 42.16 - samples/sec: 2370.04 - lr: 0.000018 - momentum: 0.000000
2023-10-24 13:18:36,758 epoch 5 - iter 511/738 - loss 0.03020870 - time (sec): 48.65 - samples/sec: 2361.43 - lr: 0.000018 - momentum: 0.000000
2023-10-24 13:18:44,656 epoch 5 - iter 584/738 - loss 0.02894484 - time (sec): 56.54 - samples/sec: 2341.48 - lr: 0.000017 - momentum: 0.000000
2023-10-24 13:18:51,234 epoch 5 - iter 657/738 - loss 0.02917966 - time (sec): 63.12 - samples/sec: 2354.37 - lr: 0.000017 - momentum: 0.000000
2023-10-24 13:18:58,515 epoch 5 - iter 730/738 - loss 0.02891546 - time (sec): 70.40 - samples/sec: 2342.45 - lr: 0.000017 - momentum: 0.000000
2023-10-24 13:18:59,256 ----------------------------------------------------------------------------------------------------
2023-10-24 13:18:59,256 EPOCH 5 done: loss 0.0290 - lr: 0.000017
2023-10-24 13:19:07,778 DEV : loss 0.17278100550174713 - f1-score (micro avg) 0.8353
2023-10-24 13:19:07,800 saving best model
2023-10-24 13:19:08,554 ----------------------------------------------------------------------------------------------------
2023-10-24 13:19:15,847 epoch 6 - iter 73/738 - loss 0.02161928 - time (sec): 7.29 - samples/sec: 2358.73 - lr: 0.000016 - momentum: 0.000000
2023-10-24 13:19:21,872 epoch 6 - iter 146/738 - loss 0.02705650 - time (sec): 13.32 - samples/sec: 2391.93 - lr: 0.000016 - momentum: 0.000000
2023-10-24 13:19:28,868 epoch 6 - iter 219/738 - loss 0.02310291 - time (sec): 20.31 - samples/sec: 2372.71 - lr: 0.000016 - momentum: 0.000000
2023-10-24 13:19:36,807 epoch 6 - iter 292/738 - loss 0.02454477 - time (sec): 28.25 - samples/sec: 2397.80 - lr: 0.000015 - momentum: 0.000000
2023-10-24 13:19:43,311 epoch 6 - iter 365/738 - loss 0.02313850 - time (sec): 34.76 - samples/sec: 2387.53 - lr: 0.000015 - momentum: 0.000000
2023-10-24 13:19:49,707 epoch 6 - iter 438/738 - loss 0.02240292 - time (sec): 41.15 - samples/sec: 2378.88 - lr: 0.000015 - momentum: 0.000000
2023-10-24 13:19:55,818 epoch 6 - iter 511/738 - loss 0.02311644 - time (sec): 47.26 - samples/sec: 2369.82 - lr: 0.000014 - momentum: 0.000000
2023-10-24 13:20:02,974 epoch 6 - iter 584/738 - loss 0.02322732 - time (sec): 54.42 - samples/sec: 2367.98 - lr: 0.000014 - momentum: 0.000000
2023-10-24 13:20:10,794 epoch 6 - iter 657/738 - loss 0.02288665 - time (sec): 62.24 - samples/sec: 2368.39 - lr: 0.000014 - momentum: 0.000000
2023-10-24 13:20:18,200 epoch 6 - iter 730/738 - loss 0.02245972 - time (sec): 69.65 - samples/sec: 2364.74 - lr: 0.000013 - momentum: 0.000000
2023-10-24 13:20:18,854 ----------------------------------------------------------------------------------------------------
2023-10-24 13:20:18,855 EPOCH 6 done: loss 0.0223 - lr: 0.000013
2023-10-24 13:20:27,379 DEV : loss 0.1752229779958725 - f1-score (micro avg) 0.8311
2023-10-24 13:20:27,400 ----------------------------------------------------------------------------------------------------
2023-10-24 13:20:34,992 epoch 7 - iter 73/738 - loss 0.01696402 - time (sec): 7.59 - samples/sec: 2506.55 - lr: 0.000013 - momentum: 0.000000
2023-10-24 13:20:42,483 epoch 7 - iter 146/738 - loss 0.01606754 - time (sec): 15.08 - samples/sec: 2405.65 - lr: 0.000013 - momentum: 0.000000
2023-10-24 13:20:49,251 epoch 7 - iter 219/738 - loss 0.01446198 - time (sec): 21.85 - samples/sec: 2366.51 - lr: 0.000012 - momentum: 0.000000
2023-10-24 13:20:56,316 epoch 7 - iter 292/738 - loss 0.01432059 - time (sec): 28.91 - samples/sec: 2354.26 - lr: 0.000012 - momentum: 0.000000
2023-10-24 13:21:02,785 epoch 7 - iter 365/738 - loss 0.01467452 - time (sec): 35.38 - samples/sec: 2363.45 - lr: 0.000012 - momentum: 0.000000
2023-10-24 13:21:09,515 epoch 7 - iter 438/738 - loss 0.01528636 - time (sec): 42.11 - samples/sec: 2356.64 - lr: 0.000011 - momentum: 0.000000
2023-10-24 13:21:16,247 epoch 7 - iter 511/738 - loss 0.01555352 - time (sec): 48.85 - samples/sec: 2347.07 - lr: 0.000011 - momentum: 0.000000
2023-10-24 13:21:22,550 epoch 7 - iter 584/738 - loss 0.01567911 - time (sec): 55.15 - samples/sec: 2345.54 - lr: 0.000011 - momentum: 0.000000
2023-10-24 13:21:30,682 epoch 7 - iter 657/738 - loss 0.01546853 - time (sec): 63.28 - samples/sec: 2347.97 - lr: 0.000010 - momentum: 0.000000
2023-10-24 13:21:37,808 epoch 7 - iter 730/738 - loss 0.01529696 - time (sec): 70.41 - samples/sec: 2337.42 - lr: 0.000010 - momentum: 0.000000
2023-10-24 13:21:38,478 ----------------------------------------------------------------------------------------------------
2023-10-24 13:21:38,478 EPOCH 7 done: loss 0.0154 - lr: 0.000010
2023-10-24 13:21:47,007 DEV : loss 0.18594373762607574 - f1-score (micro avg) 0.8365
2023-10-24 13:21:47,028 saving best model
2023-10-24 13:21:47,720 ----------------------------------------------------------------------------------------------------
2023-10-24 13:21:54,425 epoch 8 - iter 73/738 - loss 0.00750537 - time (sec): 6.70 - samples/sec: 2238.98 - lr: 0.000010 - momentum: 0.000000
2023-10-24 13:22:01,617 epoch 8 - iter 146/738 - loss 0.00788359 - time (sec): 13.90 - samples/sec: 2269.45 - lr: 0.000009 - momentum: 0.000000
2023-10-24 13:22:08,824 epoch 8 - iter 219/738 - loss 0.00934315 - time (sec): 21.10 - samples/sec: 2322.77 - lr: 0.000009 - momentum: 0.000000
2023-10-24 13:22:16,377 epoch 8 - iter 292/738 - loss 0.01466951 - time (sec): 28.66 - samples/sec: 2371.90 - lr: 0.000009 - momentum: 0.000000
2023-10-24 13:22:22,773 epoch 8 - iter 365/738 - loss 0.01332356 - time (sec): 35.05 - samples/sec: 2373.74 - lr: 0.000008 - momentum: 0.000000
2023-10-24 13:22:30,133 epoch 8 - iter 438/738 - loss 0.01270781 - time (sec): 42.41 - samples/sec: 2367.04 - lr: 0.000008 - momentum: 0.000000
2023-10-24 13:22:36,550 epoch 8 - iter 511/738 - loss 0.01165258 - time (sec): 48.83 - samples/sec: 2364.33 - lr: 0.000008 - momentum: 0.000000
2023-10-24 13:22:43,352 epoch 8 - iter 584/738 - loss 0.01144850 - time (sec): 55.63 - samples/sec: 2364.87 - lr: 0.000007 - momentum: 0.000000
2023-10-24 13:22:50,962 epoch 8 - iter 657/738 - loss 0.01115597 - time (sec): 63.24 - samples/sec: 2359.28 - lr: 0.000007 - momentum: 0.000000
2023-10-24 13:22:57,811 epoch 8 - iter 730/738 - loss 0.01096523 - time (sec): 70.09 - samples/sec: 2347.47 - lr: 0.000007 - momentum: 0.000000
2023-10-24 13:22:58,511 ----------------------------------------------------------------------------------------------------
2023-10-24 13:22:58,512 EPOCH 8 done: loss 0.0109 - lr: 0.000007
2023-10-24 13:23:07,041 DEV : loss 0.20139646530151367 - f1-score (micro avg) 0.8427
2023-10-24 13:23:07,063 saving best model
2023-10-24 13:23:07,765 ----------------------------------------------------------------------------------------------------
2023-10-24 13:23:14,724 epoch 9 - iter 73/738 - loss 0.00257277 - time (sec): 6.96 - samples/sec: 2324.26 - lr: 0.000006 - momentum: 0.000000
2023-10-24 13:23:23,010 epoch 9 - iter 146/738 - loss 0.00730412 - time (sec): 15.24 - samples/sec: 2403.85 - lr: 0.000006 - momentum: 0.000000
2023-10-24 13:23:29,423 epoch 9 - iter 219/738 - loss 0.00609698 - time (sec): 21.66 - samples/sec: 2410.15 - lr: 0.000006 - momentum: 0.000000
2023-10-24 13:23:35,741 epoch 9 - iter 292/738 - loss 0.00544285 - time (sec): 27.98 - samples/sec: 2421.83 - lr: 0.000005 - momentum: 0.000000
2023-10-24 13:23:42,329 epoch 9 - iter 365/738 - loss 0.00635157 - time (sec): 34.56 - samples/sec: 2393.52 - lr: 0.000005 - momentum: 0.000000
2023-10-24 13:23:49,427 epoch 9 - iter 438/738 - loss 0.00672352 - time (sec): 41.66 - samples/sec: 2379.81 - lr: 0.000005 - momentum: 0.000000
2023-10-24 13:23:56,025 epoch 9 - iter 511/738 - loss 0.00663039 - time (sec): 48.26 - samples/sec: 2380.14 - lr: 0.000004 - momentum: 0.000000
2023-10-24 13:24:03,205 epoch 9 - iter 584/738 - loss 0.00714230 - time (sec): 55.44 - samples/sec: 2372.22 - lr: 0.000004 - momentum: 0.000000
2023-10-24 13:24:10,544 epoch 9 - iter 657/738 - loss 0.00737085 - time (sec): 62.78 - samples/sec: 2369.24 - lr: 0.000004 - momentum: 0.000000
2023-10-24 13:24:17,784 epoch 9 - iter 730/738 - loss 0.00770982 - time (sec): 70.02 - samples/sec: 2355.80 - lr: 0.000003 - momentum: 0.000000
2023-10-24 13:24:18,512 ----------------------------------------------------------------------------------------------------
2023-10-24 13:24:18,513 EPOCH 9 done: loss 0.0077 - lr: 0.000003
2023-10-24 13:24:27,038 DEV : loss 0.21205534040927887 - f1-score (micro avg) 0.8366
2023-10-24 13:24:27,060 ----------------------------------------------------------------------------------------------------
2023-10-24 13:24:34,370 epoch 10 - iter 73/738 - loss 0.00087160 - time (sec): 7.31 - samples/sec: 2298.01 - lr: 0.000003 - momentum: 0.000000
2023-10-24 13:24:40,795 epoch 10 - iter 146/738 - loss 0.00199352 - time (sec): 13.73 - samples/sec: 2346.39 - lr: 0.000003 - momentum: 0.000000
2023-10-24 13:24:47,446 epoch 10 - iter 219/738 - loss 0.00286730 - time (sec): 20.39 - samples/sec: 2358.58 - lr: 0.000002 - momentum: 0.000000
2023-10-24 13:24:54,218 epoch 10 - iter 292/738 - loss 0.00350713 - time (sec): 27.16 - samples/sec: 2358.90 - lr: 0.000002 - momentum: 0.000000
2023-10-24 13:25:01,059 epoch 10 - iter 365/738 - loss 0.00344795 - time (sec): 34.00 - samples/sec: 2340.08 - lr: 0.000002 - momentum: 0.000000
2023-10-24 13:25:07,969 epoch 10 - iter 438/738 - loss 0.00386173 - time (sec): 40.91 - samples/sec: 2319.16 - lr: 0.000001 - momentum: 0.000000
2023-10-24 13:25:14,691 epoch 10 - iter 511/738 - loss 0.00390650 - time (sec): 47.63 - samples/sec: 2328.71 - lr: 0.000001 - momentum: 0.000000
2023-10-24 13:25:21,256 epoch 10 - iter 584/738 - loss 0.00497431 - time (sec): 54.20 - samples/sec: 2330.43 - lr: 0.000001 - momentum: 0.000000
2023-10-24 13:25:28,442 epoch 10 - iter 657/738 - loss 0.00532116 - time (sec): 61.38 - samples/sec: 2357.16 - lr: 0.000000 - momentum: 0.000000
2023-10-24 13:25:36,894 epoch 10 - iter 730/738 - loss 0.00617810 - time (sec): 69.83 - samples/sec: 2357.54 - lr: 0.000000 - momentum: 0.000000
2023-10-24 13:25:37,571 ----------------------------------------------------------------------------------------------------
2023-10-24 13:25:37,572 EPOCH 10 done: loss 0.0061 - lr: 0.000000
2023-10-24 13:25:46,103 DEV : loss 0.2109983116388321 - f1-score (micro avg) 0.8403
2023-10-24 13:25:46,684 ----------------------------------------------------------------------------------------------------
2023-10-24 13:25:46,685 Loading model from best epoch ...
2023-10-24 13:25:48,551 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
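These 21 tags are the BIOES encoding of the five HIPE entity types (loc, pers, org, time, prod) plus O: S- marks a single-token entity, and B-/I-/E- the beginning, inside, and end of a multi-token one. Loading the saved best model for inference is then a one-liner; a sketch with a made-up example sentence:

    from flair.data import Sentence
    from flair.models import SequenceTagger

    tagger = SequenceTagger.load(
        "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased"
        "-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5/best-model.pt"
    )
    sentence = Sentence("Gustave Eiffel est né à Dijon .")  # hypothetical input
    tagger.predict(sentence)
    for span in sentence.get_spans("ner"):
        print(span.text, span.tag, span.score)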
2023-10-24 13:25:55,248
Results:
- F-score (micro) 0.7894
- F-score (macro) 0.6916
- Accuracy 0.6747
By class:
              precision    recall  f1-score   support

         loc     0.8341    0.8846    0.8586       858
        pers     0.7371    0.7989    0.7668       537
         org     0.5547    0.5758    0.5651       132
        time     0.5077    0.6111    0.5546        54
        prod     0.7593    0.6721    0.7130        61

   micro avg     0.7654    0.8149    0.7894      1642
   macro avg     0.6786    0.7085    0.6916      1642
weighted avg     0.7664    0.8149    0.7896      1642
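The summary rows follow from the table itself: micro-average F1 is the harmonic mean of the micro precision and recall, and macro-average F1 is the unweighted mean of the five per-class F1 scores. A quick check:

    p, r = 0.7654, 0.8149
    print(2 * p * r / (p + r))            # 0.7894 = micro avg f1-score

    class_f1 = [0.8586, 0.7668, 0.5651, 0.5546, 0.7130]
    print(sum(class_f1) / len(class_f1))  # 0.6916 = macro avg f1-score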
2023-10-24 13:25:55,249 ----------------------------------------------------------------------------------------------------