roberta-large-ner-qlorafinetune-runs-colab

This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the biobert_json dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0688
  • Precision: 0.9390
  • Recall: 0.9598
  • F1: 0.9493
  • Accuracy: 0.9821
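
Since this repository is a PEFT adapter, inference requires loading the base model first and applying the adapter on top. Below is a minimal sketch; `NUM_LABELS` and the example sentence are hypothetical and must be adapted to the biobert_json label set the adapter was trained with.

```python
# Minimal inference sketch: load the base model, then apply this adapter via PEFT.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

BASE_ID = "FacebookAI/xlm-roberta-large"
ADAPTER_ID = "raulgdp/roberta-large-ner-qlorafinetune-runs-colab"
NUM_LABELS = 9  # hypothetical; must match the fine-tuned classification head

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base = AutoModelForTokenClassification.from_pretrained(BASE_ID, num_labels=NUM_LABELS)
# The trained classifier head is restored from the adapter repo if it was
# saved alongside the LoRA weights (an assumption; the card does not say).
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

inputs = tokenizer("Patient presents with fever and headache.", return_tensors="pt")
with torch.no_grad():
    predictions = model(**inputs).logits.argmax(dim=-1)[0]

for token, label_id in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]),
                           predictions.tolist()):
    print(token, model.config.id2label[label_id])
```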

Model description

This model is a QLoRA fine-tune of FacebookAI/xlm-roberta-large for token classification (named-entity recognition): the base model is loaded in 4-bit precision and trained with LoRA adapters via PEFT on the biobert_json dataset, in a Google Colab environment.

Intended uses & limitations

More information needed

Training and evaluation data

The model was fine-tuned and evaluated on the biobert_json dataset. Details of the splits and preprocessing are not documented here.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0004
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: paged_adamw_8bit (betas=(0.9, 0.999), epsilon=1e-08; no additional optimizer arguments)
  • lr_scheduler_type: linear
  • training_steps: 1300
  • mixed_precision_training: Native AMP
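
A hedged sketch of how this configuration maps onto the transformers/PEFT APIs. The values listed above are used verbatim; the LoRA settings (r, alpha, dropout, target modules), the 4-bit quantization details, and `NUM_LABELS` are assumptions that the card does not record.

```python
# Configuration sketch reproducing the hyperparameters listed above.
import torch
from transformers import (AutoModelForTokenClassification, BitsAndBytesConfig,
                          TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

NUM_LABELS = 9  # hypothetical; must match the biobert_json label set

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # QLoRA: 4-bit base weights (assumption)
    bnb_4bit_quant_type="nf4",              # assumption
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForTokenClassification.from_pretrained(
    "FacebookAI/xlm-roberta-large",
    num_labels=NUM_LABELS,
    quantization_config=bnb_config,
)
model = prepare_model_for_kbit_training(model)
model = get_peft_model(model, LoraConfig(
    task_type="TOKEN_CLS",
    r=16, lora_alpha=32, lora_dropout=0.1,  # assumptions
    target_modules=["query", "value"],      # assumption
))

args = TrainingArguments(
    output_dir="roberta-large-ner-qlorafinetune-runs-colab",
    learning_rate=4e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="paged_adamw_8bit",
    lr_scheduler_type="linear",
    max_steps=1300,
    fp16=True,                              # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=20,                          # matches the 20-step eval cadence below
)
# args would then be passed to a Trainer together with the tokenized
# biobert_json train/validation splits.
```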

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 2.4626 | 0.0654 | 20 | 0.9421 | 0.4829 | 0.1165 | 0.1877 | 0.7544 |
| 0.7547 | 0.1307 | 40 | 0.3993 | 0.7557 | 0.6814 | 0.7166 | 0.8940 |
| 0.4022 | 0.1961 | 60 | 0.2119 | 0.8276 | 0.8158 | 0.8217 | 0.9396 |
| 0.2732 | 0.2614 | 80 | 0.1631 | 0.8250 | 0.8746 | 0.8491 | 0.9512 |
| 0.2083 | 0.3268 | 100 | 0.1423 | 0.8591 | 0.9037 | 0.8808 | 0.9576 |
| 0.2216 | 0.3922 | 120 | 0.1392 | 0.8562 | 0.9147 | 0.8845 | 0.9572 |
| 0.1787 | 0.4575 | 140 | 0.1114 | 0.8940 | 0.9173 | 0.9055 | 0.9664 |
| 0.1642 | 0.5229 | 160 | 0.1191 | 0.8840 | 0.9270 | 0.9050 | 0.9657 |
| 0.1557 | 0.5882 | 180 | 0.1089 | 0.8825 | 0.9284 | 0.9049 | 0.9665 |
| 0.1406 | 0.6536 | 200 | 0.0982 | 0.8967 | 0.9279 | 0.9121 | 0.9700 |
| 0.1359 | 0.7190 | 220 | 0.0879 | 0.9182 | 0.9269 | 0.9225 | 0.9733 |
| 0.1272 | 0.7843 | 240 | 0.1047 | 0.8940 | 0.9506 | 0.9214 | 0.9697 |
| 0.1157 | 0.8497 | 260 | 0.0985 | 0.9198 | 0.9266 | 0.9232 | 0.9719 |
| 0.1191 | 0.9150 | 280 | 0.1166 | 0.8827 | 0.9427 | 0.9117 | 0.9656 |
| 0.1298 | 0.9804 | 300 | 0.0878 | 0.9211 | 0.9315 | 0.9263 | 0.9736 |
| 0.1107 | 1.0458 | 320 | 0.0834 | 0.9205 | 0.9512 | 0.9356 | 0.9762 |
| 0.0942 | 1.1111 | 340 | 0.0874 | 0.9097 | 0.9574 | 0.9329 | 0.9745 |
| 0.0979 | 1.1765 | 360 | 0.0771 | 0.9259 | 0.9518 | 0.9387 | 0.9779 |
| 0.0971 | 1.2418 | 380 | 0.0814 | 0.9280 | 0.9478 | 0.9378 | 0.9781 |
| 0.1053 | 1.3072 | 400 | 0.0804 | 0.9214 | 0.9399 | 0.9306 | 0.9761 |
| 0.1075 | 1.3725 | 420 | 0.0835 | 0.9083 | 0.9369 | 0.9224 | 0.9738 |
| 0.0893 | 1.4379 | 440 | 0.0773 | 0.9329 | 0.9469 | 0.9398 | 0.9784 |
| 0.09 | 1.5033 | 460 | 0.0737 | 0.9316 | 0.9522 | 0.9418 | 0.9787 |
| 0.0947 | 1.5686 | 480 | 0.0787 | 0.9141 | 0.9549 | 0.9340 | 0.9763 |
| 0.0907 | 1.6340 | 500 | 0.0813 | 0.9179 | 0.9522 | 0.9347 | 0.9770 |
| 0.0752 | 1.6993 | 520 | 0.0802 | 0.9130 | 0.9575 | 0.9347 | 0.9772 |
| 0.0801 | 1.7647 | 540 | 0.0703 | 0.9302 | 0.9530 | 0.9415 | 0.9797 |
| 0.092 | 1.8301 | 560 | 0.0739 | 0.9301 | 0.9513 | 0.9406 | 0.9785 |
| 0.0862 | 1.8954 | 580 | 0.0899 | 0.9034 | 0.9526 | 0.9274 | 0.9735 |
| 0.0869 | 1.9608 | 600 | 0.0782 | 0.9164 | 0.9510 | 0.9334 | 0.9765 |
| 0.0713 | 2.0261 | 620 | 0.0771 | 0.9225 | 0.9579 | 0.9399 | 0.9785 |
| 0.0635 | 2.0915 | 640 | 0.0729 | 0.9356 | 0.9524 | 0.9439 | 0.9797 |
| 0.0527 | 2.1569 | 660 | 0.0764 | 0.9088 | 0.9475 | 0.9277 | 0.9765 |
| 0.0738 | 2.2222 | 680 | 0.0747 | 0.9233 | 0.9576 | 0.9401 | 0.9783 |
| 0.0628 | 2.2876 | 700 | 0.0751 | 0.9334 | 0.9589 | 0.9460 | 0.9801 |
| 0.0574 | 2.3529 | 720 | 0.0713 | 0.9354 | 0.9580 | 0.9465 | 0.9807 |
| 0.0628 | 2.4183 | 740 | 0.0700 | 0.9347 | 0.9540 | 0.9443 | 0.9809 |
| 0.0771 | 2.4837 | 760 | 0.0707 | 0.9326 | 0.9607 | 0.9465 | 0.9811 |
| 0.068 | 2.5490 | 780 | 0.0753 | 0.9318 | 0.9648 | 0.9480 | 0.9807 |
| 0.0653 | 2.6144 | 800 | 0.0680 | 0.9400 | 0.9583 | 0.9491 | 0.9820 |
| 0.0567 | 2.6797 | 820 | 0.0762 | 0.9327 | 0.9540 | 0.9433 | 0.9791 |
| 0.066 | 2.7451 | 840 | 0.0719 | 0.9297 | 0.9570 | 0.9431 | 0.9805 |
| 0.0576 | 2.8105 | 860 | 0.0723 | 0.9360 | 0.9597 | 0.9477 | 0.9808 |
| 0.0608 | 2.8758 | 880 | 0.0744 | 0.9309 | 0.9566 | 0.9436 | 0.9791 |
| 0.0521 | 2.9412 | 900 | 0.0679 | 0.9355 | 0.9599 | 0.9475 | 0.9814 |
| 0.051 | 3.0065 | 920 | 0.0688 | 0.9373 | 0.9594 | 0.9482 | 0.9818 |
| 0.0444 | 3.0719 | 940 | 0.0723 | 0.9335 | 0.9607 | 0.9469 | 0.9814 |
| 0.0468 | 3.1373 | 960 | 0.0767 | 0.9246 | 0.9554 | 0.9397 | 0.9787 |
| 0.0433 | 3.2026 | 980 | 0.0681 | 0.9376 | 0.9591 | 0.9482 | 0.9819 |
| 0.0468 | 3.2680 | 1000 | 0.0722 | 0.9318 | 0.9589 | 0.9452 | 0.9808 |
| 0.0496 | 3.3333 | 1020 | 0.0708 | 0.9341 | 0.9496 | 0.9418 | 0.9803 |
| 0.0473 | 3.3987 | 1040 | 0.0699 | 0.9315 | 0.9666 | 0.9487 | 0.9819 |
| 0.0534 | 3.4641 | 1060 | 0.0675 | 0.9368 | 0.9569 | 0.9468 | 0.9819 |
| 0.0421 | 3.5294 | 1080 | 0.0698 | 0.9322 | 0.9564 | 0.9442 | 0.9809 |
| 0.0444 | 3.5948 | 1100 | 0.0715 | 0.9303 | 0.9539 | 0.9420 | 0.9799 |
| 0.0366 | 3.6601 | 1120 | 0.0671 | 0.9382 | 0.9615 | 0.9497 | 0.9823 |
| 0.0505 | 3.7255 | 1140 | 0.0687 | 0.9376 | 0.9554 | 0.9464 | 0.9814 |
| 0.0431 | 3.7908 | 1160 | 0.0698 | 0.9338 | 0.9594 | 0.9465 | 0.9813 |
| 0.0519 | 3.8562 | 1180 | 0.0696 | 0.9378 | 0.9604 | 0.9490 | 0.9820 |
| 0.0471 | 3.9216 | 1200 | 0.0712 | 0.9380 | 0.9599 | 0.9488 | 0.9817 |
| 0.0544 | 3.9869 | 1220 | 0.0688 | 0.9407 | 0.9588 | 0.9497 | 0.9819 |
| 0.0392 | 4.0523 | 1240 | 0.0688 | 0.9389 | 0.9599 | 0.9493 | 0.9822 |
| 0.0303 | 4.1176 | 1260 | 0.0698 | 0.9376 | 0.9601 | 0.9487 | 0.9817 |
| 0.0383 | 4.1830 | 1280 | 0.0689 | 0.9393 | 0.9605 | 0.9498 | 0.9821 |
| 0.0389 | 4.2484 | 1300 | 0.0688 | 0.9390 | 0.9598 | 0.9493 | 0.9821 |
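
The card does not record how precision, recall, and F1 were computed; model cards produced by the transformers token-classification workflow typically report entity-level scores via seqeval. A minimal sketch with hypothetical label sequences:

```python
# Entity-level metrics in the style reported above, computed with seqeval
# (an assumption: the card does not name the metric library).
from seqeval.metrics import f1_score, precision_score, recall_score

y_true = [["B-DISEASE", "I-DISEASE", "O", "B-DRUG"]]  # hypothetical labels
y_pred = [["B-DISEASE", "I-DISEASE", "O", "O"]]

print("precision:", precision_score(y_true, y_pred))  # 1.0 (1 of 1 predicted entity correct)
print("recall:   ", recall_score(y_true, y_pred))     # 0.5 (1 of 2 gold entities found)
print("f1:       ", f1_score(y_true, y_pred))         # ~0.667
```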

Framework versions

  • PEFT 0.14.0
  • Transformers 4.47.1
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
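
To reproduce the training environment, the versions above can be pinned and verified, for example:

```python
# Check installed package versions against those this adapter was trained
# with (pin mismatches via pip, e.g. `pip install peft==0.14.0 transformers==4.47.1`).
import datasets, peft, tokenizers, torch, transformers

expected = {
    "PEFT": (peft.__version__, "0.14.0"),
    "Transformers": (transformers.__version__, "4.47.1"),
    "PyTorch": (torch.__version__, "2.5.1+cu121"),
    "Datasets": (datasets.__version__, "3.2.0"),
    "Tokenizers": (tokenizers.__version__, "0.21.0"),
}
for name, (installed, trained_with) in expected.items():
    flag = "" if installed == trained_with else "  <-- differs"
    print(f"{name}: installed {installed}, trained with {trained_with}{flag}")
```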