Started at: 13:13:42
norbert2, 1e-06, 128
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.4574310235001824 - MAE: 0.5401737275380348
Validation loss : 0.35341861844062805 - MAE: 0.47187966068458054
Epoch: 1
Training loss: 0.3368046994913708 - MAE: 0.45609539357970186
Validation loss : 0.2778044169819033 - MAE: 0.40998613461344674
Epoch: 2
Training loss: 0.27038083604790947 - MAE: 0.4062241308124049
Validation loss : 0.2211051559126055 - MAE: 0.36538234947589154
Epoch: 3
Training loss: 0.22609539072621954 - MAE: 0.3727795396986739
Validation loss : 0.19445732558095777 - MAE: 0.34284239765259944
Epoch: 4
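
The log records a fine-tuning run of the norbert2 checkpoint (the BERT config printed above) with learning rate 1e-06 and batch size 128, reporting per-epoch training/validation loss and MAE. Below is a minimal sketch of the kind of setup that could produce such a log; the checkpoint id "ltg/norbert2", the single-output regression head, the MSE training objective, and the data loaders are assumptions for illustration, only the hyperparameters (lr 1e-06, batch size 128) come from the log itself.

```python
# Sketch only: model id, regression head, and loss choice are assumptions,
# not taken from the log.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

model_name = "ltg/norbert2"  # assumed Hugging Face id for "norbert2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
backbone = AutoModel.from_pretrained(model_name)


class RegressionModel(nn.Module):
    """BERT encoder with a single scalar output head."""

    def __init__(self, encoder, hidden_size=768):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] representation
        return self.head(cls).squeeze(-1)


model = RegressionModel(backbone)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-06)
loss_fn = nn.MSELoss()   # assumed training objective
mae_fn = nn.L1Loss()     # MAE, as reported in the log


def run_epoch(loader, train=True):
    """Average loss and MAE over one pass; batches of size 128 assumed."""
    model.train(train)
    total_loss, total_mae, n = 0.0, 0.0, 0
    with torch.set_grad_enabled(train):
        for batch in loader:
            preds = model(batch["input_ids"], batch["attention_mask"])
            labels = batch["labels"].float()
            loss = loss_fn(preds, labels)
            if train:
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
            total_loss += loss.item()
            total_mae += mae_fn(preds, labels).item()
            n += 1
    return total_loss / n, total_mae / n
```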