2023-10-24 17:38:12,465 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,466 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (1): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (2): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (3): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (4): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (5): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (6): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (7): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (8): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (9): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (10): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (11): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-24 17:38:12,466 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 MultiCorpus: 7936 train + 992 dev + 992 test sentences
- NER_ICDAR_EUROPEANA Corpus: 7936 train + 992 dev + 992 test sentences - /home/ubuntu/.flair/datasets/ner_icdar_europeana/fr
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 Train: 7936 sentences
2023-10-24 17:38:12,467 (train_with_dev=False, train_with_test=False)
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
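The setup logged above can be reconstructed in Flair roughly as follows. This is a minimal sketch rather than the original training script: the checkpoint name and the pooling/layer/CRF settings are inferred from the model training base path logged further below, and the dataset loader matches the cache path shown above.

    from flair.datasets import NER_ICDAR_EUROPEANA
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger

    # French ICDAR-Europeana NER corpus; Flair caches it under
    # ~/.flair/datasets/ner_icdar_europeana/fr, as in the path above.
    corpus = NER_ICDAR_EUROPEANA(language="fr")

    # 13 labels in BIOES format (O plus S-/B-/E-/I- for PER, LOC, ORG),
    # matching out_features=13 of the tagger's final linear layer.
    label_dict = corpus.make_label_dictionary(label_type="ner")

    # "poolingfirst" / "layers-1" in the base path: first-subtoken pooling
    # over the last transformer layer only.
    embeddings = TransformerWordEmbeddings(
        model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )

    # use_rnn=False / use_crf=False ("crfFalse" in the base path) yields the
    # head printed above: LockedDropout(p=0.5) -> Linear(768, 13) -> CrossEntropyLoss.
    tagger = SequenceTagger(
        hidden_size=256,  # ignored when use_rnn=False
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,
        use_rnn=False,
        reproject_embeddings=False,
    )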
2023-10-24 17:38:12,467 Training Params:
2023-10-24 17:38:12,467 - learning_rate: "3e-05"
2023-10-24 17:38:12,467 - mini_batch_size: "8"
2023-10-24 17:38:12,467 - max_epochs: "10"
2023-10-24 17:38:12,467 - shuffle: "True"
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 Plugins:
2023-10-24 17:38:12,467 - TensorboardLogger
2023-10-24 17:38:12,467 - LinearScheduler | warmup_fraction: '0.1'
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 Final evaluation on model from best epoch (best-model.pt)
2023-10-24 17:38:12,467 - metric: "('micro avg', 'f1-score')"
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
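Together with the parameters above, training corresponds to a fine-tuning call along these lines (a sketch assuming Flair's ModelTrainer; its fine_tune method uses AdamW with a linear warmup schedule, warmup fraction 0.1 by default, matching the plugin listed above; the TensorBoard logging plugin attached to this run is omitted here):

    from flair.trainers import ModelTrainer

    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "hmbench-icdar/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3",
        learning_rate=3e-5,
        mini_batch_size=8,
        max_epochs=10,
        embeddings_storage_mode="none",  # "embedding storage: none" above
        use_final_model_for_eval=False,  # evaluate best-model.pt, as logged
    )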
2023-10-24 17:38:12,467 Computation:
2023-10-24 17:38:12,467 - compute on device: cuda:0
2023-10-24 17:38:12,467 - embedding storage: none
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 Model training base path: "hmbench-icdar/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 ----------------------------------------------------------------------------------------------------
2023-10-24 17:38:12,467 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-24 17:38:20,974 epoch 1 - iter 99/992 - loss 1.74525625 - time (sec): 8.51 - samples/sec: 2051.05 - lr: 0.000003 - momentum: 0.000000
2023-10-24 17:38:29,111 epoch 1 - iter 198/992 - loss 1.08863551 - time (sec): 16.64 - samples/sec: 2023.06 - lr: 0.000006 - momentum: 0.000000
2023-10-24 17:38:37,169 epoch 1 - iter 297/992 - loss 0.81797787 - time (sec): 24.70 - samples/sec: 1987.76 - lr: 0.000009 - momentum: 0.000000
2023-10-24 17:38:45,550 epoch 1 - iter 396/992 - loss 0.65812865 - time (sec): 33.08 - samples/sec: 1983.87 - lr: 0.000012 - momentum: 0.000000
2023-10-24 17:38:53,652 epoch 1 - iter 495/992 - loss 0.56334832 - time (sec): 41.18 - samples/sec: 1974.74 - lr: 0.000015 - momentum: 0.000000
2023-10-24 17:39:01,801 epoch 1 - iter 594/992 - loss 0.49615806 - time (sec): 49.33 - samples/sec: 1968.64 - lr: 0.000018 - momentum: 0.000000
2023-10-24 17:39:10,421 epoch 1 - iter 693/992 - loss 0.43998124 - time (sec): 57.95 - samples/sec: 1965.03 - lr: 0.000021 - momentum: 0.000000
2023-10-24 17:39:18,883 epoch 1 - iter 792/992 - loss 0.39963914 - time (sec): 66.41 - samples/sec: 1961.65 - lr: 0.000024 - momentum: 0.000000
2023-10-24 17:39:27,283 epoch 1 - iter 891/992 - loss 0.37071134 - time (sec): 74.82 - samples/sec: 1968.84 - lr: 0.000027 - momentum: 0.000000
2023-10-24 17:39:35,696 epoch 1 - iter 990/992 - loss 0.34738778 - time (sec): 83.23 - samples/sec: 1965.86 - lr: 0.000030 - momentum: 0.000000
2023-10-24 17:39:35,876 ----------------------------------------------------------------------------------------------------
2023-10-24 17:39:35,876 EPOCH 1 done: loss 0.3469 - lr: 0.000030
2023-10-24 17:39:38,923 DEV : loss 0.09140600264072418 - f1-score (micro avg) 0.7223
2023-10-24 17:39:38,938 saving best model
2023-10-24 17:39:39,407 ----------------------------------------------------------------------------------------------------
2023-10-24 17:39:47,548 epoch 2 - iter 99/992 - loss 0.09680147 - time (sec): 8.14 - samples/sec: 2002.81 - lr: 0.000030 - momentum: 0.000000
2023-10-24 17:39:55,858 epoch 2 - iter 198/992 - loss 0.09397801 - time (sec): 16.45 - samples/sec: 1974.74 - lr: 0.000029 - momentum: 0.000000
2023-10-24 17:40:04,370 epoch 2 - iter 297/992 - loss 0.09651916 - time (sec): 24.96 - samples/sec: 1957.39 - lr: 0.000029 - momentum: 0.000000
2023-10-24 17:40:12,929 epoch 2 - iter 396/992 - loss 0.09944484 - time (sec): 33.52 - samples/sec: 1957.00 - lr: 0.000029 - momentum: 0.000000
2023-10-24 17:40:21,301 epoch 2 - iter 495/992 - loss 0.09736309 - time (sec): 41.89 - samples/sec: 1967.03 - lr: 0.000028 - momentum: 0.000000
2023-10-24 17:40:29,549 epoch 2 - iter 594/992 - loss 0.09763263 - time (sec): 50.14 - samples/sec: 1968.89 - lr: 0.000028 - momentum: 0.000000
2023-10-24 17:40:38,016 epoch 2 - iter 693/992 - loss 0.09689609 - time (sec): 58.61 - samples/sec: 1971.33 - lr: 0.000028 - momentum: 0.000000
2023-10-24 17:40:46,350 epoch 2 - iter 792/992 - loss 0.09551141 - time (sec): 66.94 - samples/sec: 1960.12 - lr: 0.000027 - momentum: 0.000000
2023-10-24 17:40:54,691 epoch 2 - iter 891/992 - loss 0.09603742 - time (sec): 75.28 - samples/sec: 1955.14 - lr: 0.000027 - momentum: 0.000000
2023-10-24 17:41:03,174 epoch 2 - iter 990/992 - loss 0.09714532 - time (sec): 83.77 - samples/sec: 1954.54 - lr: 0.000027 - momentum: 0.000000
2023-10-24 17:41:03,320 ----------------------------------------------------------------------------------------------------
2023-10-24 17:41:03,320 EPOCH 2 done: loss 0.0971 - lr: 0.000027
2023-10-24 17:41:06,418 DEV : loss 0.08006458729505539 - f1-score (micro avg) 0.753
2023-10-24 17:41:06,433 saving best model
2023-10-24 17:41:07,035 ----------------------------------------------------------------------------------------------------
2023-10-24 17:41:15,236 epoch 3 - iter 99/992 - loss 0.06139682 - time (sec): 8.20 - samples/sec: 1973.44 - lr: 0.000026 - momentum: 0.000000
2023-10-24 17:41:23,724 epoch 3 - iter 198/992 - loss 0.06703090 - time (sec): 16.69 - samples/sec: 1974.09 - lr: 0.000026 - momentum: 0.000000
2023-10-24 17:41:31,896 epoch 3 - iter 297/992 - loss 0.07066772 - time (sec): 24.86 - samples/sec: 1965.86 - lr: 0.000026 - momentum: 0.000000
2023-10-24 17:41:39,983 epoch 3 - iter 396/992 - loss 0.06888834 - time (sec): 32.95 - samples/sec: 1966.40 - lr: 0.000025 - momentum: 0.000000
2023-10-24 17:41:48,706 epoch 3 - iter 495/992 - loss 0.06680337 - time (sec): 41.67 - samples/sec: 1977.56 - lr: 0.000025 - momentum: 0.000000
2023-10-24 17:41:57,109 epoch 3 - iter 594/992 - loss 0.06827427 - time (sec): 50.07 - samples/sec: 1972.82 - lr: 0.000025 - momentum: 0.000000
2023-10-24 17:42:05,256 epoch 3 - iter 693/992 - loss 0.06841488 - time (sec): 58.22 - samples/sec: 1970.99 - lr: 0.000024 - momentum: 0.000000
2023-10-24 17:42:13,637 epoch 3 - iter 792/992 - loss 0.06732059 - time (sec): 66.60 - samples/sec: 1972.16 - lr: 0.000024 - momentum: 0.000000
2023-10-24 17:42:22,045 epoch 3 - iter 891/992 - loss 0.06623660 - time (sec): 75.01 - samples/sec: 1970.81 - lr: 0.000024 - momentum: 0.000000
2023-10-24 17:42:30,186 epoch 3 - iter 990/992 - loss 0.06618561 - time (sec): 83.15 - samples/sec: 1969.27 - lr: 0.000023 - momentum: 0.000000
2023-10-24 17:42:30,341 ----------------------------------------------------------------------------------------------------
2023-10-24 17:42:30,341 EPOCH 3 done: loss 0.0661 - lr: 0.000023
2023-10-24 17:42:33,454 DEV : loss 0.09770625084638596 - f1-score (micro avg) 0.7664
2023-10-24 17:42:33,469 saving best model
2023-10-24 17:42:34,044 ----------------------------------------------------------------------------------------------------
2023-10-24 17:42:42,469 epoch 4 - iter 99/992 - loss 0.04066720 - time (sec): 8.42 - samples/sec: 1876.58 - lr: 0.000023 - momentum: 0.000000
2023-10-24 17:42:51,202 epoch 4 - iter 198/992 - loss 0.04839658 - time (sec): 17.16 - samples/sec: 1910.97 - lr: 0.000023 - momentum: 0.000000
2023-10-24 17:42:59,645 epoch 4 - iter 297/992 - loss 0.04694213 - time (sec): 25.60 - samples/sec: 1923.25 - lr: 0.000022 - momentum: 0.000000
2023-10-24 17:43:07,788 epoch 4 - iter 396/992 - loss 0.04727140 - time (sec): 33.74 - samples/sec: 1930.44 - lr: 0.000022 - momentum: 0.000000
2023-10-24 17:43:15,990 epoch 4 - iter 495/992 - loss 0.04760580 - time (sec): 41.95 - samples/sec: 1945.69 - lr: 0.000022 - momentum: 0.000000
2023-10-24 17:43:23,619 epoch 4 - iter 594/992 - loss 0.04584277 - time (sec): 49.57 - samples/sec: 1943.44 - lr: 0.000021 - momentum: 0.000000
2023-10-24 17:43:32,148 epoch 4 - iter 693/992 - loss 0.04678695 - time (sec): 58.10 - samples/sec: 1952.99 - lr: 0.000021 - momentum: 0.000000
2023-10-24 17:43:40,536 epoch 4 - iter 792/992 - loss 0.04681106 - time (sec): 66.49 - samples/sec: 1951.01 - lr: 0.000021 - momentum: 0.000000
2023-10-24 17:43:48,623 epoch 4 - iter 891/992 - loss 0.04740672 - time (sec): 74.58 - samples/sec: 1961.24 - lr: 0.000020 - momentum: 0.000000
2023-10-24 17:43:57,554 epoch 4 - iter 990/992 - loss 0.04725284 - time (sec): 83.51 - samples/sec: 1959.72 - lr: 0.000020 - momentum: 0.000000
2023-10-24 17:43:57,705 ----------------------------------------------------------------------------------------------------
2023-10-24 17:43:57,705 EPOCH 4 done: loss 0.0472 - lr: 0.000020
2023-10-24 17:44:00,820 DEV : loss 0.1370622217655182 - f1-score (micro avg) 0.7684
2023-10-24 17:44:00,835 saving best model
2023-10-24 17:44:01,508 ----------------------------------------------------------------------------------------------------
2023-10-24 17:44:10,012 epoch 5 - iter 99/992 - loss 0.02951998 - time (sec): 8.50 - samples/sec: 1998.04 - lr: 0.000020 - momentum: 0.000000
2023-10-24 17:44:18,292 epoch 5 - iter 198/992 - loss 0.03452251 - time (sec): 16.78 - samples/sec: 1967.20 - lr: 0.000019 - momentum: 0.000000
2023-10-24 17:44:26,827 epoch 5 - iter 297/992 - loss 0.03477441 - time (sec): 25.32 - samples/sec: 1958.56 - lr: 0.000019 - momentum: 0.000000
2023-10-24 17:44:35,053 epoch 5 - iter 396/992 - loss 0.03442328 - time (sec): 33.54 - samples/sec: 1945.10 - lr: 0.000019 - momentum: 0.000000
2023-10-24 17:44:43,298 epoch 5 - iter 495/992 - loss 0.03659471 - time (sec): 41.79 - samples/sec: 1960.49 - lr: 0.000018 - momentum: 0.000000
2023-10-24 17:44:51,314 epoch 5 - iter 594/992 - loss 0.03569550 - time (sec): 49.80 - samples/sec: 1964.23 - lr: 0.000018 - momentum: 0.000000
2023-10-24 17:45:00,024 epoch 5 - iter 693/992 - loss 0.03585171 - time (sec): 58.51 - samples/sec: 1961.05 - lr: 0.000018 - momentum: 0.000000
2023-10-24 17:45:08,378 epoch 5 - iter 792/992 - loss 0.03741357 - time (sec): 66.87 - samples/sec: 1960.60 - lr: 0.000017 - momentum: 0.000000
2023-10-24 17:45:16,464 epoch 5 - iter 891/992 - loss 0.03829687 - time (sec): 74.96 - samples/sec: 1960.70 - lr: 0.000017 - momentum: 0.000000
2023-10-24 17:45:24,955 epoch 5 - iter 990/992 - loss 0.03723549 - time (sec): 83.45 - samples/sec: 1961.01 - lr: 0.000017 - momentum: 0.000000
2023-10-24 17:45:25,122 ----------------------------------------------------------------------------------------------------
2023-10-24 17:45:25,122 EPOCH 5 done: loss 0.0372 - lr: 0.000017
2023-10-24 17:45:28,550 DEV : loss 0.16291803121566772 - f1-score (micro avg) 0.7765
2023-10-24 17:45:28,566 saving best model
2023-10-24 17:45:29,156 ----------------------------------------------------------------------------------------------------
2023-10-24 17:45:37,741 epoch 6 - iter 99/992 - loss 0.02033797 - time (sec): 8.58 - samples/sec: 1891.50 - lr: 0.000016 - momentum: 0.000000
2023-10-24 17:45:46,162 epoch 6 - iter 198/992 - loss 0.02093798 - time (sec): 17.01 - samples/sec: 1941.89 - lr: 0.000016 - momentum: 0.000000
2023-10-24 17:45:54,439 epoch 6 - iter 297/992 - loss 0.02177729 - time (sec): 25.28 - samples/sec: 1960.95 - lr: 0.000016 - momentum: 0.000000
2023-10-24 17:46:02,552 epoch 6 - iter 396/992 - loss 0.02426201 - time (sec): 33.40 - samples/sec: 1971.88 - lr: 0.000015 - momentum: 0.000000
2023-10-24 17:46:11,069 epoch 6 - iter 495/992 - loss 0.02614459 - time (sec): 41.91 - samples/sec: 1972.42 - lr: 0.000015 - momentum: 0.000000
2023-10-24 17:46:19,360 epoch 6 - iter 594/992 - loss 0.02643793 - time (sec): 50.20 - samples/sec: 1964.61 - lr: 0.000015 - momentum: 0.000000
2023-10-24 17:46:27,558 epoch 6 - iter 693/992 - loss 0.02736029 - time (sec): 58.40 - samples/sec: 1959.09 - lr: 0.000014 - momentum: 0.000000
2023-10-24 17:46:35,923 epoch 6 - iter 792/992 - loss 0.02667042 - time (sec): 66.77 - samples/sec: 1958.07 - lr: 0.000014 - momentum: 0.000000
2023-10-24 17:46:44,180 epoch 6 - iter 891/992 - loss 0.02785199 - time (sec): 75.02 - samples/sec: 1949.51 - lr: 0.000014 - momentum: 0.000000
2023-10-24 17:46:52,411 epoch 6 - iter 990/992 - loss 0.02840540 - time (sec): 83.25 - samples/sec: 1966.07 - lr: 0.000013 - momentum: 0.000000
2023-10-24 17:46:52,574 ----------------------------------------------------------------------------------------------------
2023-10-24 17:46:52,574 EPOCH 6 done: loss 0.0284 - lr: 0.000013
2023-10-24 17:46:55,693 DEV : loss 0.1790854036808014 - f1-score (micro avg) 0.7681
2023-10-24 17:46:55,708 ----------------------------------------------------------------------------------------------------
2023-10-24 17:47:04,452 epoch 7 - iter 99/992 - loss 0.02418540 - time (sec): 8.74 - samples/sec: 1919.47 - lr: 0.000013 - momentum: 0.000000
2023-10-24 17:47:12,542 epoch 7 - iter 198/992 - loss 0.02557997 - time (sec): 16.83 - samples/sec: 1928.44 - lr: 0.000013 - momentum: 0.000000
2023-10-24 17:47:20,873 epoch 7 - iter 297/992 - loss 0.02329917 - time (sec): 25.16 - samples/sec: 1936.67 - lr: 0.000012 - momentum: 0.000000
2023-10-24 17:47:29,301 epoch 7 - iter 396/992 - loss 0.02067675 - time (sec): 33.59 - samples/sec: 1918.15 - lr: 0.000012 - momentum: 0.000000
2023-10-24 17:47:37,478 epoch 7 - iter 495/992 - loss 0.02044117 - time (sec): 41.77 - samples/sec: 1924.13 - lr: 0.000012 - momentum: 0.000000
2023-10-24 17:47:46,176 epoch 7 - iter 594/992 - loss 0.02024929 - time (sec): 50.47 - samples/sec: 1938.10 - lr: 0.000011 - momentum: 0.000000
2023-10-24 17:47:54,710 epoch 7 - iter 693/992 - loss 0.01948114 - time (sec): 59.00 - samples/sec: 1944.74 - lr: 0.000011 - momentum: 0.000000
2023-10-24 17:48:02,922 epoch 7 - iter 792/992 - loss 0.01956172 - time (sec): 67.21 - samples/sec: 1949.30 - lr: 0.000011 - momentum: 0.000000
2023-10-24 17:48:11,022 epoch 7 - iter 891/992 - loss 0.01970571 - time (sec): 75.31 - samples/sec: 1956.65 - lr: 0.000010 - momentum: 0.000000
2023-10-24 17:48:19,161 epoch 7 - iter 990/992 - loss 0.02041426 - time (sec): 83.45 - samples/sec: 1959.32 - lr: 0.000010 - momentum: 0.000000
2023-10-24 17:48:19,336 ----------------------------------------------------------------------------------------------------
2023-10-24 17:48:19,336 EPOCH 7 done: loss 0.0204 - lr: 0.000010
2023-10-24 17:48:22,769 DEV : loss 0.20467530190944672 - f1-score (micro avg) 0.7616
2023-10-24 17:48:22,785 ----------------------------------------------------------------------------------------------------
2023-10-24 17:48:31,373 epoch 8 - iter 99/992 - loss 0.01338429 - time (sec): 8.59 - samples/sec: 2020.71 - lr: 0.000010 - momentum: 0.000000
2023-10-24 17:48:40,061 epoch 8 - iter 198/992 - loss 0.01222624 - time (sec): 17.27 - samples/sec: 1977.55 - lr: 0.000009 - momentum: 0.000000
2023-10-24 17:48:48,204 epoch 8 - iter 297/992 - loss 0.01238322 - time (sec): 25.42 - samples/sec: 1955.48 - lr: 0.000009 - momentum: 0.000000
2023-10-24 17:48:56,609 epoch 8 - iter 396/992 - loss 0.01378072 - time (sec): 33.82 - samples/sec: 1945.84 - lr: 0.000009 - momentum: 0.000000
2023-10-24 17:49:04,670 epoch 8 - iter 495/992 - loss 0.01464499 - time (sec): 41.88 - samples/sec: 1950.49 - lr: 0.000008 - momentum: 0.000000
2023-10-24 17:49:13,145 epoch 8 - iter 594/992 - loss 0.01529863 - time (sec): 50.36 - samples/sec: 1962.49 - lr: 0.000008 - momentum: 0.000000
2023-10-24 17:49:21,465 epoch 8 - iter 693/992 - loss 0.01426628 - time (sec): 58.68 - samples/sec: 1964.81 - lr: 0.000008 - momentum: 0.000000
2023-10-24 17:49:29,282 epoch 8 - iter 792/992 - loss 0.01434806 - time (sec): 66.50 - samples/sec: 1961.40 - lr: 0.000007 - momentum: 0.000000
2023-10-24 17:49:37,726 epoch 8 - iter 891/992 - loss 0.01444871 - time (sec): 74.94 - samples/sec: 1960.50 - lr: 0.000007 - momentum: 0.000000
2023-10-24 17:49:46,097 epoch 8 - iter 990/992 - loss 0.01488712 - time (sec): 83.31 - samples/sec: 1964.08 - lr: 0.000007 - momentum: 0.000000
2023-10-24 17:49:46,245 ----------------------------------------------------------------------------------------------------
2023-10-24 17:49:46,245 EPOCH 8 done: loss 0.0149 - lr: 0.000007
2023-10-24 17:49:49,369 DEV : loss 0.23477818071842194 - f1-score (micro avg) 0.7571
2023-10-24 17:49:49,384 ----------------------------------------------------------------------------------------------------
2023-10-24 17:49:57,571 epoch 9 - iter 99/992 - loss 0.01371693 - time (sec): 8.19 - samples/sec: 1937.71 - lr: 0.000006 - momentum: 0.000000
2023-10-24 17:50:05,790 epoch 9 - iter 198/992 - loss 0.01100052 - time (sec): 16.40 - samples/sec: 1927.30 - lr: 0.000006 - momentum: 0.000000
2023-10-24 17:50:13,928 epoch 9 - iter 297/992 - loss 0.01149748 - time (sec): 24.54 - samples/sec: 1925.21 - lr: 0.000006 - momentum: 0.000000
2023-10-24 17:50:23,074 epoch 9 - iter 396/992 - loss 0.01212931 - time (sec): 33.69 - samples/sec: 1919.27 - lr: 0.000005 - momentum: 0.000000
2023-10-24 17:50:31,771 epoch 9 - iter 495/992 - loss 0.01093322 - time (sec): 42.39 - samples/sec: 1929.30 - lr: 0.000005 - momentum: 0.000000
2023-10-24 17:50:40,350 epoch 9 - iter 594/992 - loss 0.01065967 - time (sec): 50.97 - samples/sec: 1930.89 - lr: 0.000005 - momentum: 0.000000
2023-10-24 17:50:48,388 epoch 9 - iter 693/992 - loss 0.01078237 - time (sec): 59.00 - samples/sec: 1939.66 - lr: 0.000004 - momentum: 0.000000
2023-10-24 17:50:56,619 epoch 9 - iter 792/992 - loss 0.01024575 - time (sec): 67.23 - samples/sec: 1942.87 - lr: 0.000004 - momentum: 0.000000
2023-10-24 17:51:04,644 epoch 9 - iter 891/992 - loss 0.01058774 - time (sec): 75.26 - samples/sec: 1951.24 - lr: 0.000004 - momentum: 0.000000
2023-10-24 17:51:12,810 epoch 9 - iter 990/992 - loss 0.01091116 - time (sec): 83.43 - samples/sec: 1962.25 - lr: 0.000003 - momentum: 0.000000
2023-10-24 17:51:12,957 ----------------------------------------------------------------------------------------------------
2023-10-24 17:51:12,957 EPOCH 9 done: loss 0.0109 - lr: 0.000003
2023-10-24 17:51:16,412 DEV : loss 0.23431342840194702 - f1-score (micro avg) 0.7708
2023-10-24 17:51:16,427 ----------------------------------------------------------------------------------------------------
2023-10-24 17:51:24,444 epoch 10 - iter 99/992 - loss 0.00607875 - time (sec): 8.02 - samples/sec: 2022.08 - lr: 0.000003 - momentum: 0.000000
2023-10-24 17:51:32,702 epoch 10 - iter 198/992 - loss 0.00533387 - time (sec): 16.27 - samples/sec: 1987.29 - lr: 0.000003 - momentum: 0.000000
2023-10-24 17:51:41,170 epoch 10 - iter 297/992 - loss 0.00635844 - time (sec): 24.74 - samples/sec: 1984.33 - lr: 0.000002 - momentum: 0.000000
2023-10-24 17:51:49,635 epoch 10 - iter 396/992 - loss 0.00826920 - time (sec): 33.21 - samples/sec: 1992.73 - lr: 0.000002 - momentum: 0.000000
2023-10-24 17:51:57,861 epoch 10 - iter 495/992 - loss 0.00834489 - time (sec): 41.43 - samples/sec: 1987.29 - lr: 0.000002 - momentum: 0.000000
2023-10-24 17:52:06,239 epoch 10 - iter 594/992 - loss 0.00777999 - time (sec): 49.81 - samples/sec: 1972.40 - lr: 0.000001 - momentum: 0.000000
2023-10-24 17:52:14,639 epoch 10 - iter 693/992 - loss 0.00812146 - time (sec): 58.21 - samples/sec: 1968.69 - lr: 0.000001 - momentum: 0.000000
2023-10-24 17:52:22,702 epoch 10 - iter 792/992 - loss 0.00755783 - time (sec): 66.27 - samples/sec: 1964.62 - lr: 0.000001 - momentum: 0.000000
2023-10-24 17:52:31,219 epoch 10 - iter 891/992 - loss 0.00784551 - time (sec): 74.79 - samples/sec: 1962.71 - lr: 0.000000 - momentum: 0.000000
2023-10-24 17:52:39,702 epoch 10 - iter 990/992 - loss 0.00776579 - time (sec): 83.27 - samples/sec: 1964.99 - lr: 0.000000 - momentum: 0.000000
2023-10-24 17:52:39,872 ----------------------------------------------------------------------------------------------------
2023-10-24 17:52:39,872 EPOCH 10 done: loss 0.0078 - lr: 0.000000
2023-10-24 17:52:42,994 DEV : loss 0.2393815815448761 - f1-score (micro avg) 0.7619
2023-10-24 17:52:43,482 ----------------------------------------------------------------------------------------------------
2023-10-24 17:52:43,483 Loading model from best epoch ...
2023-10-24 17:52:44,969 SequenceTagger predicts: Dictionary with 13 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG
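Loading the saved checkpoint for inference is then a one-liner; a usage sketch (the example sentence is illustrative):

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # best-model.pt is written into the training base path logged above.
    tagger = SequenceTagger.load(
        "hmbench-icdar/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt"
    )

    sentence = Sentence("Victor Hugo est né à Besançon .")  # French example
    tagger.predict(sentence)
    for span in sentence.get_spans("ner"):
        print(span)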
2023-10-24 17:52:48,050
Results:
- F-score (micro) 0.7854
- F-score (macro) 0.7033
- Accuracy 0.6628
By class:
              precision    recall  f1-score   support

         LOC     0.8130    0.8565    0.8342       655
         PER     0.7377    0.8072    0.7709       223
         ORG     0.6386    0.4173    0.5048       127

   micro avg     0.7807    0.7900    0.7854      1005
   macro avg     0.7298    0.6937    0.7033      1005
weighted avg     0.7743    0.7900    0.7785      1005
2023-10-24 17:52:48,050 ----------------------------------------------------------------------------------------------------
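The test-set report above can be regenerated from the checkpoint; a sketch assuming the tagger and corpus objects from the earlier sketches:

    # Re-run the final evaluation on the held-out test split.
    result = tagger.evaluate(
        corpus.test,
        gold_label_type="ner",
        mini_batch_size=8,
    )
    print(result.detailed_results)  # per-class precision/recall/F1, as above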