---
tags:
- generated_from_trainer
model-index:
- name: bert-base-greek-uncased-v2-finetuned-polylex
  results: []
---

# bert-base-greek-uncased-v2-finetuned-polylex

This model is a fine-tuned version of [nlpaueb/bert-base-greek-uncased-v1](https://huggingface.co/nlpaueb/bert-base-greek-uncased-v1) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.7748
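
Below is a minimal usage sketch. It assumes the checkpoint is published under the hub ID `snousias/bert-base-greek-uncased-v2-finetuned-polylex` and that it keeps the masked-language-modeling head of the Greek BERT base model; neither detail is stated on this card.

```python
# Minimal usage sketch; the repository ID and the masked-LM head are assumptions.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "snousias/bert-base-greek-uncased-v2-finetuned-polylex"  # assumed hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill in a masked token in a Greek sentence ("Athens is the ___ of Greece.").
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask(f"Η Αθήνα είναι η {tokenizer.mask_token} της Ελλάδας."))
```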

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
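
Since the card is tagged `generated_from_trainer`, these values map onto a `transformers` `TrainingArguments` configuration roughly as sketched below. The output directory name and all dataset handling are assumptions not stated on the card, the per-epoch evaluation strategy is inferred from the results table below, and the Adam settings listed above are the library defaults.

```python
# Rough reconstruction of the training configuration; output_dir is
# illustrative and dataset preparation is omitted.
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-greek-uncased-v2-finetuned-polylex",  # illustrative
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # consistent with the per-epoch validation losses below
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08 are the defaults.
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```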

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.9392 | 1.0 | 12 | 2.6722 |
| 0.8066 | 2.0 | 24 | 2.5322 |
| 0.6438 | 3.0 | 36 | 2.2449 |
| 0.5654 | 4.0 | 48 | 2.2614 |
| 0.6796 | 5.0 | 60 | 2.7160 |
| 0.6361 | 6.0 | 72 | 3.0292 |
| 0.6515 | 7.0 | 84 | 3.2517 |
| 0.6073 | 8.0 | 96 | 2.7854 |
| 0.5889 | 9.0 | 108 | 2.4421 |
| 0.7134 | 10.0 | 120 | 2.6351 |
| 0.3772 | 11.0 | 132 | 2.6377 |
| 0.5095 | 12.0 | 144 | 2.5834 |
| 0.3965 | 13.0 | 156 | 2.8454 |
| 0.4555 | 14.0 | 168 | 2.2274 |
| 0.4788 | 15.0 | 180 | 2.2452 |
| 0.543 | 16.0 | 192 | 2.4528 |
| 0.4249 | 17.0 | 204 | 3.1464 |
| 0.5451 | 18.0 | 216 | 2.9913 |
| 0.4661 | 19.0 | 228 | 2.6519 |
| 0.3383 | 20.0 | 240 | 2.9366 |
| 0.3598 | 21.0 | 252 | 3.2501 |
| 0.5232 | 22.0 | 264 | 2.3395 |
| 0.3792 | 23.0 | 276 | 2.8389 |
| 0.4436 | 24.0 | 288 | 2.7843 |
| 0.3975 | 25.0 | 300 | 2.3773 |
| 0.3268 | 26.0 | 312 | 4.0139 |
| 0.4764 | 27.0 | 324 | 3.7974 |
| 0.3743 | 28.0 | 336 | 2.0727 |
| 0.4574 | 29.0 | 348 | 2.5576 |
| 0.5219 | 30.0 | 360 | 3.3557 |
| 0.3854 | 31.0 | 372 | 2.4598 |
| 0.4107 | 32.0 | 384 | 2.8564 |
| 0.3899 | 33.0 | 396 | 2.4589 |
| 0.3655 | 34.0 | 408 | 2.6613 |
| 0.4607 | 35.0 | 420 | 2.7836 |
| 0.3703 | 36.0 | 432 | 2.9499 |
| 0.455 | 37.0 | 444 | 2.9653 |
| 0.4234 | 38.0 | 456 | 1.7769 |
| 0.4161 | 39.0 | 468 | 2.9451 |
| 0.3983 | 40.0 | 480 | 2.6283 |
| 0.5074 | 41.0 | 492 | 2.8233 |
| 0.4793 | 42.0 | 504 | 2.4598 |
| 0.5614 | 43.0 | 516 | 3.3149 |
| 0.4965 | 44.0 | 528 | 2.0932 |
| 0.5946 | 45.0 | 540 | 2.6992 |
| 0.5001 | 46.0 | 552 | 2.7653 |
| 0.6891 | 47.0 | 564 | 2.6126 |
| 0.8634 | 48.0 | 576 | 1.9130 |
| 0.639 | 49.0 | 588 | 2.7710 |
| 0.572 | 50.0 | 600 | 3.3767 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3