Started at: 06:49:33
norbert2, 0.001, 256
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.31302871882915495 - MAE: 0.43550123211776
Validation loss: 0.19499443140294817 - MAE: 0.3375275454882718
Epoch: 1
Training loss: 0.20615242600440978 - MAE: 0.34713061837107256
Validation loss: 0.1856598754723867 - MAE: 0.3336331175439254
Epoch: 2
Training loss: 0.1905288130044937 - MAE: 0.331755109482026
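The MAE figures reported per epoch are the mean absolute error between the model's predictions and the gold labels. A minimal stdlib-only sketch of that metric (function and variable names are illustrative, not taken from the run's actual code):

```python
def mae(predictions, targets):
    """Mean absolute error: average of |prediction - target| over all pairs."""
    if len(predictions) != len(targets):
        raise ValueError("predictions and targets must have the same length")
    return sum(abs(p - t) for p, t in zip(predictions, targets)) / len(predictions)

# Example: two predictions, one off by 1.0 and one exact -> MAE of 0.5
print(mae([0.0, 1.0], [1.0, 1.0]))  # -> 0.5
```

In the log above, both training and validation MAE shrink across the three epochs, which is the expected pattern for a model that is still learning.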