Started at: 13:54:30
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'afb829e3d0b861bd5f8cda6522b32ca0b097d7eb'}, {})
Epoch: 0
Training loss: 0.19169998451283105 - MSE: 0.3224822171590252
Validation loss : 0.1545075431931764 - MSE: 0.30456596715475825
Epoch: 1
Training loss: 0.16974398517294934 - MSE: 0.3103326546437019
Validation loss : 0.15334105514921248 - MSE: 0.3032539253831601
Epoch: 2