UIT-NO-PREUIT-xlnet-large-cased-finetuned-finetuned

This model is a fine-tuned version of sercetexam9/UIT-xlnet-large-cased-finetuned on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0463
  • F1: 0.7122
  • Roc Auc: 0.7806
  • Accuracy: 0.4711
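
The combination of F1, ROC AUC, and a comparatively low exact-match accuracy suggests a multi-label classification head, but the card does not state this explicitly. The sketch below shows one way to load the model and decode predictions under that assumption (sigmoid per label with a 0.5 threshold); adapt the decoding if the task is single-label.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "sercetexam9/UIT-NO-PREUIT-xlnet-large-cased-finetuned-finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example input sentence.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decoding (assumption): independent sigmoid per label, 0.5 threshold
probs = torch.sigmoid(logits)[0]
predicted_label_ids = (probs > 0.5).nonzero(as_tuple=True)[0].tolist()
print(probs, predicted_label_ids)
```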

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
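
For reference, a minimal TrainingArguments sketch that mirrors the values above. The output_dir, per-epoch evaluation/saving strategy, and best-model selection are assumptions not stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="UIT-NO-PREUIT-xlnet-large-cased-finetuned-finetuned",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    num_train_epochs=30,
    eval_strategy="epoch",        # assumption: per-epoch evaluation, matching the table below
    save_strategy="epoch",        # assumption
    load_best_model_at_end=True,  # assumption: reported results correspond to epoch 25
    metric_for_best_model="f1",
)
```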

Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | Roc Auc | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
| 0.0528        | 1.0   | 139  | 0.8260          | 0.6956 | 0.7780  | 0.4260   |
| 0.0697        | 2.0   | 278  | 0.7774          | 0.6760 | 0.7605  | 0.4097   |
| 0.0662        | 3.0   | 417  | 0.7810          | 0.6889 | 0.7644  | 0.4043   |
| 0.0539        | 4.0   | 556  | 0.8407          | 0.7021 | 0.7861  | 0.4224   |
| 0.0355        | 5.0   | 695  | 0.8692          | 0.6723 | 0.7466  | 0.3845   |
| 0.0633        | 6.0   | 834  | 0.8634          | 0.6629 | 0.7517  | 0.4007   |
| 0.0493        | 7.0   | 973  | 0.8450          | 0.6750 | 0.7581  | 0.4350   |
| 0.0397        | 8.0   | 1112 | 0.9122          | 0.6912 | 0.7759  | 0.4224   |
| 0.0321        | 9.0   | 1251 | 1.0101          | 0.6669 | 0.7618  | 0.4061   |
| 0.0208        | 10.0  | 1390 | 0.9714          | 0.6922 | 0.7682  | 0.4296   |
| 0.0131        | 11.0  | 1529 | 0.9408          | 0.6907 | 0.7711  | 0.4097   |
| 0.0121        | 12.0  | 1668 | 0.9753          | 0.6966 | 0.7695  | 0.4332   |
| 0.0057        | 13.0  | 1807 | 0.9916          | 0.6816 | 0.7642  | 0.4188   |
| 0.0055        | 14.0  | 1946 | 0.9780          | 0.6983 | 0.7734  | 0.4495   |
| 0.0121        | 15.0  | 2085 | 0.9855          | 0.7038 | 0.7826  | 0.4440   |
| 0.0097        | 16.0  | 2224 | 1.0266          | 0.6979 | 0.7733  | 0.4386   |
| 0.0073        | 17.0  | 2363 | 1.0192          | 0.6991 | 0.7749  | 0.4422   |
| 0.0021        | 18.0  | 2502 | 1.0567          | 0.6971 | 0.7740  | 0.4332   |
| 0.0053        | 19.0  | 2641 | 1.0234          | 0.7062 | 0.7778  | 0.4549   |
| 0.0007        | 20.0  | 2780 | 1.0558          | 0.6921 | 0.7672  | 0.4368   |
| 0.0006        | 21.0  | 2919 | 1.0601          | 0.7005 | 0.7759  | 0.4422   |
| 0.0021        | 22.0  | 3058 | 1.0578          | 0.7076 | 0.7817  | 0.4422   |
| 0.0005        | 23.0  | 3197 | 1.0613          | 0.7031 | 0.7749  | 0.4495   |
| 0.0013        | 24.0  | 3336 | 1.0467          | 0.7111 | 0.7812  | 0.4657   |
| 0.0006        | 25.0  | 3475 | 1.0463          | 0.7122 | 0.7806  | 0.4711   |
| 0.0006        | 26.0  | 3614 | 1.0443          | 0.7076 | 0.7791  | 0.4639   |
| 0.0005        | 27.0  | 3753 | 1.0476          | 0.7077 | 0.7792  | 0.4657   |
| 0.0005        | 28.0  | 3892 | 1.0490          | 0.7080 | 0.7791  | 0.4639   |
| 0.0005        | 29.0  | 4031 | 1.0487          | 0.7073 | 0.7784  | 0.4621   |
| 0.0005        | 30.0  | 4170 | 1.0485          | 0.7070 | 0.7781  | 0.4621   |
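
The card does not state how F1, Roc Auc, and Accuracy were computed. A common pattern for multi-label evaluation with the Trainer looks like the sketch below; the micro averaging, the sigmoid decoding, and the 0.5 threshold are assumptions, and exact-match (subset) accuracy is consistent with the comparatively low accuracy values reported above.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score


def compute_metrics(eval_pred):
    """Multi-label metrics sketch: micro F1, micro ROC AUC, subset accuracy."""
    logits, labels = eval_pred
    probs = 1 / (1 + np.exp(-logits))      # sigmoid over logits
    preds = (probs >= 0.5).astype(int)     # 0.5 threshold is an assumption
    return {
        "f1": f1_score(labels, preds, average="micro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match across all labels
    }
```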

Framework versions

  • Transformers 4.48.1
  • Pytorch 2.4.0
  • Datasets 3.0.1
  • Tokenizers 0.21.0