Started at: 11:52:44
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'afb829e3d0b861bd5f8cda6522b32ca0b097d7eb'}, {})
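The configuration dump above can be reconstructed as a plain dictionary to sanity-check the architecture it describes. This is a minimal sketch in standard Python (no `transformers` dependency); the field names and values are taken directly from the log, while the derived per-head dimension is an inference, not something the log states.

```python
# Model configuration as dumped in the log (BertForMaskedLM).
config = {
    "architectures": ["BertForMaskedLM"],
    "attention_probs_dropout_prob": 0.1,
    "hidden_act": "gelu",
    "hidden_dropout_prob": 0.1,
    "hidden_size": 768,
    "initializer_range": 0.02,
    "intermediate_size": 3072,
    "max_position_embeddings": 512,
    "model_type": "bert",
    "num_attention_heads": 12,
    "num_hidden_layers": 12,
    "type_vocab_size": 2,
    "vocab_size": 50104,
}

# Each attention head gets an equal slice of the hidden dimension.
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(head_dim)  # 768 / 12 = 64
```

These values match the standard BERT-base layout (12 layers, 12 heads, hidden size 768), apart from the enlarged 50104-token vocabulary.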
Epoch: 0
Training loss: 0.19169998451283105 - MSE: 0.3224822171590252
Validation loss: 0.1576854281593114 - MSE: 0.3052585763866773
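For reference, the MSE reported next to each loss can be computed as the mean of squared prediction errors. This is a minimal sketch in plain Python; the log does not show the training script's exact normalization, so `mse` here is an assumed implementation, not the code that produced the numbers above.

```python
def mse(preds, targets):
    """Mean squared error over paired predictions and targets."""
    assert len(preds) == len(targets) and preds, "need equal, non-empty inputs"
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

# Toy usage with hypothetical values (not data from this run):
print(mse([0.0, 1.0, 2.0], [0.0, 1.0, 1.0]))  # (0 + 0 + 1) / 3 ≈ 0.3333
```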
Epoch: 1