ASAP_FineTuningBERT_AugV8_k7_task1_organization_k7_fold1

This model is a fine-tuned version of bert-base-uncased on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set (Qwk is the quadratic weighted kappa; a short metric sketch follows the list):

  • Loss: 0.7989
  • Qwk: 0.4914
  • Mse: 0.7989
  • Rmse: 0.8938
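
Qwk above is the quadratic weighted kappa, a standard agreement metric for ordinal scoring tasks such as ASAP essay scoring. Below is a minimal sketch of how these three metrics can be computed with scikit-learn and NumPy; the arrays are illustrative, not values from this run:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative arrays; in practice these come from the held-out fold.
y_true = np.array([2, 3, 1, 4, 2])  # gold scores
y_pred = np.array([2, 2, 1, 4, 3])  # rounded model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```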

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
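
The settings above map directly onto transformers.TrainingArguments. A hedged configuration sketch follows; the output path, the dataset objects, and the regression head (num_labels=1, inferred from the MSE/RMSE metrics) are assumptions, not details confirmed by the card:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=1 assumes a regression head, consistent with the MSE/RMSE metrics.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",             # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="epoch",           # the log below shows one evaluation per epoch
)

# train_dataset / eval_dataset are hypothetical; the card does not name the data.
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```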

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk     | Mse    | Rmse   |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log        | 1.0   | 6    | 5.2573          | 0.0377  | 5.2552 | 2.2924 |
| No log        | 2.0   | 12   | 2.5087          | -0.0010 | 2.5070 | 1.5833 |
| No log        | 3.0   | 18   | 1.3849          | 0.0     | 1.3834 | 1.1762 |
| No log        | 4.0   | 24   | 1.7612          | -0.0038 | 1.7595 | 1.3265 |
| No log        | 5.0   | 30   | 1.1366          | 0.0     | 1.1351 | 1.0654 |
| No log        | 6.0   | 36   | 1.1223          | 0.0442  | 1.1215 | 1.0590 |
| No log        | 7.0   | 42   | 1.2324          | 0.1867  | 1.2315 | 1.1097 |
| No log        | 8.0   | 48   | 1.1346          | 0.2782  | 1.1336 | 1.0647 |
| No log        | 9.0   | 54   | 1.3104          | 0.2176  | 1.3096 | 1.1444 |
| No log        | 10.0  | 60   | 1.1616          | 0.2879  | 1.1610 | 1.0775 |
| No log        | 11.0  | 66   | 1.1562          | 0.3156  | 1.1560 | 1.0752 |
| No log        | 12.0  | 72   | 0.8712          | 0.4681  | 0.8713 | 0.9334 |
| No log        | 13.0  | 78   | 0.8865          | 0.4659  | 0.8865 | 0.9415 |
| No log        | 14.0  | 84   | 0.8601          | 0.4374  | 0.8600 | 0.9274 |
| No log        | 15.0  | 90   | 0.8758          | 0.4471  | 0.8758 | 0.9358 |
| No log        | 16.0  | 96   | 0.8737          | 0.4367  | 0.8738 | 0.9348 |
| No log        | 17.0  | 102  | 0.8386          | 0.4546  | 0.8387 | 0.9158 |
| No log        | 18.0  | 108  | 1.0156          | 0.3793  | 1.0155 | 1.0077 |
| No log        | 19.0  | 114  | 1.3565          | 0.3169  | 1.3562 | 1.1646 |
| No log        | 20.0  | 120  | 0.8249          | 0.4701  | 0.8249 | 0.9082 |
| No log        | 21.0  | 126  | 0.9193          | 0.4403  | 0.9193 | 0.9588 |
| No log        | 22.0  | 132  | 0.8230          | 0.4363  | 0.8229 | 0.9072 |
| No log        | 23.0  | 138  | 0.7801          | 0.4878  | 0.7800 | 0.8832 |
| No log        | 24.0  | 144  | 0.7798          | 0.5001  | 0.7797 | 0.8830 |
| No log        | 25.0  | 150  | 0.8299          | 0.4664  | 0.8297 | 0.9109 |
| No log        | 26.0  | 156  | 1.0127          | 0.3930  | 1.0125 | 1.0062 |
| No log        | 27.0  | 162  | 0.8689          | 0.4114  | 0.8687 | 0.9321 |
| No log        | 28.0  | 168  | 0.8844          | 0.4035  | 0.8845 | 0.9405 |
| No log        | 29.0  | 174  | 0.8088          | 0.4435  | 0.8088 | 0.8994 |
| No log        | 30.0  | 180  | 0.9039          | 0.4212  | 0.9039 | 0.9508 |
| No log        | 31.0  | 186  | 0.7896          | 0.4756  | 0.7897 | 0.8886 |
| No log        | 32.0  | 192  | 0.7748          | 0.4766  | 0.7748 | 0.8802 |
| No log        | 33.0  | 198  | 0.8010          | 0.4585  | 0.8009 | 0.8949 |
| No log        | 34.0  | 204  | 0.7989          | 0.4914  | 0.7989 | 0.8938 |

Framework versions

  • Transformers 4.47.0
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
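
For completeness, a minimal inference sketch against this checkpoint; the regression head and the example input are assumptions based on the metrics reported above, not details confirmed by the card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "genki10/ASAP_FineTuningBERT_AugV8_k7_task1_organization_k7_fold1"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("An example essay to score.", return_tensors="pt",
                   truncation=True, padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# For a single-output regression head, the logit is the predicted score.
print(outputs.logits.squeeze().item())
```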