Arabic_FineTuningAraBERT_AugV4_k2_task1_organization_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a metric-computation sketch follows the list):

  • Loss: 0.8035
  • Qwk (Quadratic Weighted Kappa): 0.7614
  • Mse (Mean Squared Error): 0.8035
  • Rmse (Root Mean Squared Error): 0.8964
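
The evaluation code is not published with this card; the following is a minimal sketch of how these metrics are conventionally computed, assuming integer score labels and scikit-learn (the arrays below are placeholders):

```python
# Hypothetical metric computation; the card does not include its evaluation code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 4, 2, 5, 3])  # placeholder gold scores
y_pred = np.array([3, 4, 3, 4, 3])  # placeholder predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)                      # Mean Squared Error
rmse = float(np.sqrt(mse))                                    # Root Mean Squared Error
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```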

Model description

More information needed

Intended uses & limitations

More information needed
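
Although usage details are not documented, the checkpoint can presumably be loaded with the standard Transformers API. A minimal sketch, assuming a sequence-classification head (the head configuration is not stated in this card):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV4_k2_task1_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Placeholder Arabic input; the expected input format is not documented.
inputs = tokenizer("مثال على نص عربي", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits)
```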

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
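
As a rough guide, these settings map onto Hugging Face TrainingArguments as sketched below; the original training script is not included, so output_dir and any unlisted options are assumptions:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction from the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="./results",          # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```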

Training results

The training loss is logged every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0299 2 4.6141 0.0181 4.6141 2.1480
No log 0.0597 4 3.2424 0.1257 3.2424 1.8007
No log 0.0896 6 1.8539 0.1168 1.8539 1.3616
No log 0.1194 8 1.5982 0.0486 1.5982 1.2642
No log 0.1493 10 1.6161 0.1209 1.6161 1.2713
No log 0.1791 12 1.5381 0.3082 1.5381 1.2402
No log 0.2090 14 1.4671 0.0399 1.4671 1.2112
No log 0.2388 16 1.7983 0.1873 1.7983 1.3410
No log 0.2687 18 2.2581 0.1075 2.2581 1.5027
No log 0.2985 20 2.1516 0.1075 2.1516 1.4668
No log 0.3284 22 1.8827 0.1873 1.8827 1.3721
No log 0.3582 24 1.7051 0.1873 1.7051 1.3058
No log 0.3881 26 1.5644 0.1873 1.5644 1.2507
No log 0.4179 28 1.4150 0.3762 1.4150 1.1895
No log 0.4478 30 1.3132 0.3527 1.3132 1.1460
No log 0.4776 32 1.1787 0.4576 1.1787 1.0857
No log 0.5075 34 1.1258 0.5093 1.1258 1.0610
No log 0.5373 36 1.1608 0.4842 1.1608 1.0774
No log 0.5672 38 1.2421 0.3494 1.2421 1.1145
No log 0.5970 40 1.3612 0.3478 1.3612 1.1667
No log 0.6269 42 1.4007 0.2687 1.4007 1.1835
No log 0.6567 44 1.3796 0.2687 1.3796 1.1746
No log 0.6866 46 1.3129 0.4535 1.3129 1.1458
No log 0.7164 48 1.2129 0.4044 1.2129 1.1013
No log 0.7463 50 1.1490 0.4590 1.1490 1.0719
No log 0.7761 52 1.1361 0.4535 1.1361 1.0659
No log 0.8060 54 1.2837 0.5542 1.2837 1.1330
No log 0.8358 56 1.2594 0.5542 1.2594 1.1222
No log 0.8657 58 1.1565 0.5984 1.1565 1.0754
No log 0.8955 60 1.1047 0.5542 1.1047 1.0510
No log 0.9254 62 1.0446 0.6624 1.0446 1.0221
No log 0.9552 64 1.0819 0.6624 1.0819 1.0401
No log 0.9851 66 1.1414 0.6845 1.1414 1.0683
No log 1.0149 68 1.2805 0.6763 1.2805 1.1316
No log 1.0448 70 1.1686 0.7515 1.1686 1.0810
No log 1.0746 72 0.9552 0.7080 0.9552 0.9773
No log 1.1045 74 0.8551 0.6410 0.8551 0.9247
No log 1.1343 76 0.8212 0.6510 0.8212 0.9062
No log 1.1642 78 0.8110 0.6510 0.8110 0.9006
No log 1.1940 80 0.8016 0.6396 0.8016 0.8953
No log 1.2239 82 0.8406 0.6882 0.8406 0.9169
No log 1.2537 84 0.9411 0.7063 0.9411 0.9701
No log 1.2836 86 1.0094 0.6757 1.0094 1.0047
No log 1.3134 88 0.9332 0.6757 0.9332 0.9660
No log 1.3433 90 0.9107 0.6757 0.9107 0.9543
No log 1.3731 92 0.8306 0.5882 0.8306 0.9114
No log 1.4030 94 0.7900 0.6921 0.7900 0.8888
No log 1.4328 96 0.8322 0.6712 0.8322 0.9123
No log 1.4627 98 1.0427 0.6757 1.0427 1.0211
No log 1.4925 100 1.2795 0.6866 1.2795 1.1312
No log 1.5224 102 1.3320 0.6441 1.3320 1.1541
No log 1.5522 104 1.2073 0.6982 1.2073 1.0988
No log 1.5821 106 0.9262 0.7063 0.9262 0.9624
No log 1.6119 108 0.7248 0.6703 0.7248 0.8514
No log 1.6418 110 0.6855 0.7277 0.6855 0.8280
No log 1.6716 112 0.7485 0.6759 0.7485 0.8652
No log 1.7015 114 0.8947 0.6992 0.8947 0.9459
No log 1.7313 116 1.0139 0.6992 1.0139 1.0069
No log 1.7612 118 0.9966 0.7612 0.9966 0.9983
No log 1.7910 120 0.9147 0.7063 0.9147 0.9564
No log 1.8209 122 0.7714 0.6992 0.7714 0.8783
No log 1.8507 124 0.6875 0.7073 0.6875 0.8292
No log 1.8806 126 0.7071 0.7149 0.7071 0.8409
No log 1.9104 128 0.8390 0.7612 0.8390 0.9160
No log 1.9403 130 0.9619 0.6818 0.9619 0.9808
No log 1.9701 132 1.0240 0.6982 1.0240 1.0119
No log 2.0 134 1.0367 0.7601 1.0367 1.0182
No log 2.0299 136 0.9862 0.7601 0.9862 0.9931
No log 2.0597 138 0.9492 0.7123 0.9492 0.9743
No log 2.0896 140 0.9293 0.7523 0.9293 0.9640
No log 2.1194 142 0.8585 0.7985 0.8585 0.9265
No log 2.1493 144 0.8739 0.7894 0.8739 0.9348
No log 2.1791 146 0.9510 0.7601 0.9510 0.9752
No log 2.2090 148 0.9507 0.7354 0.9507 0.9751
No log 2.2388 150 0.8913 0.7433 0.8913 0.9441
No log 2.2687 152 0.9035 0.7517 0.9035 0.9505
No log 2.2985 154 0.8831 0.7517 0.8831 0.9397
No log 2.3284 156 0.8678 0.7433 0.8678 0.9316
No log 2.3582 158 0.8162 0.7983 0.8162 0.9035
No log 2.3881 160 0.7834 0.7983 0.7834 0.8851
No log 2.4179 162 0.7562 0.7779 0.7562 0.8696
No log 2.4478 164 0.7177 0.7779 0.7177 0.8471
No log 2.4776 166 0.7559 0.7779 0.7559 0.8694
No log 2.5075 168 0.9339 0.7607 0.9339 0.9664
No log 2.5373 170 1.1579 0.7250 1.1579 1.0760
No log 2.5672 172 1.1443 0.7178 1.1443 1.0697
No log 2.5970 174 0.9985 0.7204 0.9985 0.9993
No log 2.6269 176 0.8501 0.7060 0.8501 0.9220
No log 2.6567 178 0.7345 0.6970 0.7345 0.8570
No log 2.6866 180 0.7597 0.6970 0.7597 0.8716
No log 2.7164 182 0.8065 0.7196 0.8065 0.8980
No log 2.7463 184 0.8272 0.7521 0.8272 0.9095
No log 2.7761 186 0.8631 0.7612 0.8631 0.9291
No log 2.8060 188 0.8523 0.8094 0.8523 0.9232
No log 2.8358 190 0.8575 0.8094 0.8575 0.9260
No log 2.8657 192 0.8525 0.7983 0.8525 0.9233
No log 2.8955 194 0.8410 0.6927 0.8410 0.9171
No log 2.9254 196 0.8023 0.7447 0.8023 0.8957
No log 2.9552 198 0.7679 0.7373 0.7679 0.8763
No log 2.9851 200 0.8055 0.7373 0.8055 0.8975
No log 3.0149 202 0.9306 0.7375 0.9306 0.9647
No log 3.0448 204 1.1061 0.7758 1.1061 1.0517
No log 3.0746 206 1.0472 0.7635 1.0472 1.0233
No log 3.1045 208 0.9185 0.7375 0.9185 0.9584
No log 3.1343 210 0.8040 0.7523 0.8040 0.8966
No log 3.1642 212 0.7543 0.7373 0.7543 0.8685
No log 3.1940 214 0.7613 0.7835 0.7613 0.8725
No log 3.2239 216 0.7851 0.7689 0.7851 0.8860
No log 3.2537 218 0.7374 0.7689 0.7374 0.8587
No log 3.2836 220 0.6983 0.7196 0.6983 0.8356
No log 3.3134 222 0.7391 0.7196 0.7391 0.8597
No log 3.3433 224 0.8816 0.7433 0.8816 0.9389
No log 3.3731 226 0.9701 0.7354 0.9701 0.9849
No log 3.4030 228 1.0310 0.7431 1.0310 1.0154
No log 3.4328 230 0.9998 0.7514 0.9998 0.9999
No log 3.4627 232 0.9059 0.7354 0.9059 0.9518
No log 3.4925 234 0.8493 0.7346 0.8493 0.9216
No log 3.5224 236 0.7104 0.7529 0.7104 0.8429
No log 3.5522 238 0.6200 0.7204 0.6200 0.7874
No log 3.5821 240 0.5954 0.7455 0.5954 0.7716
No log 3.6119 242 0.6442 0.7686 0.6442 0.8026
No log 3.6418 244 0.8006 0.7128 0.8006 0.8947
No log 3.6716 246 1.0466 0.7358 1.0466 1.0230
No log 3.7015 248 1.0666 0.7358 1.0666 1.0327
No log 3.7313 250 0.9426 0.7811 0.9426 0.9709
No log 3.7612 252 0.8149 0.7593 0.8149 0.9027
No log 3.7910 254 0.7625 0.7373 0.7625 0.8732
No log 3.8209 256 0.7331 0.7373 0.7331 0.8562
No log 3.8507 258 0.7155 0.7373 0.7155 0.8458
No log 3.8806 260 0.7488 0.7835 0.7488 0.8653
No log 3.9104 262 0.8623 0.7897 0.8623 0.9286
No log 3.9403 264 0.8829 0.7675 0.8829 0.9396
No log 3.9701 266 0.8559 0.7605 0.8559 0.9252
No log 4.0 268 0.8250 0.7605 0.8250 0.9083
No log 4.0299 270 0.7715 0.7689 0.7715 0.8784
No log 4.0597 272 0.7576 0.7128 0.7576 0.8704
No log 4.0896 274 0.8065 0.7605 0.8065 0.8980
No log 4.1194 276 0.8368 0.7605 0.8368 0.9148
No log 4.1493 278 0.8294 0.7605 0.8294 0.9107
No log 4.1791 280 0.7993 0.7373 0.7993 0.8940
No log 4.2090 282 0.7621 0.7373 0.7621 0.8730
No log 4.2388 284 0.7985 0.7373 0.7985 0.8936
No log 4.2687 286 0.8208 0.7373 0.8208 0.9060
No log 4.2985 288 0.8737 0.7300 0.8737 0.9347
No log 4.3284 290 1.0149 0.7223 1.0149 1.0074
No log 4.3582 292 1.1228 0.6983 1.1228 1.0596
No log 4.3881 294 1.0583 0.6583 1.0583 1.0288
No log 4.4179 296 0.8796 0.7601 0.8796 0.9379
No log 4.4478 298 0.7737 0.7689 0.7737 0.8796
No log 4.4776 300 0.6973 0.7686 0.6973 0.8350
No log 4.5075 302 0.7109 0.7304 0.7109 0.8431
No log 4.5373 304 0.8146 0.7752 0.8146 0.9025
No log 4.5672 306 1.0021 0.7377 1.0021 1.0010
No log 4.5970 308 1.0048 0.7808 1.0048 1.0024
No log 4.6269 310 0.8434 0.7451 0.8434 0.9184
No log 4.6567 312 0.6969 0.7451 0.6969 0.8348
No log 4.6866 314 0.6632 0.7451 0.6632 0.8144
No log 4.7164 316 0.6694 0.7451 0.6694 0.8182
No log 4.7463 318 0.6944 0.7451 0.6944 0.8333
No log 4.7761 320 0.8018 0.7273 0.8018 0.8954
No log 4.8060 322 0.8995 0.7351 0.8995 0.9484
No log 4.8358 324 0.9285 0.6922 0.9285 0.9636
No log 4.8657 326 0.8397 0.7351 0.8397 0.9164
No log 4.8955 328 0.7497 0.7351 0.7497 0.8658
No log 4.9254 330 0.7659 0.7351 0.7659 0.8752
No log 4.9552 332 0.8826 0.7437 0.8826 0.9395
No log 4.9851 334 0.9476 0.7602 0.9476 0.9734
No log 5.0149 336 0.9545 0.7602 0.9545 0.9770
No log 5.0448 338 0.8556 0.7437 0.8556 0.9250
No log 5.0746 340 0.6860 0.7752 0.6860 0.8283
No log 5.1045 342 0.5993 0.7686 0.5993 0.7742
No log 5.1343 344 0.5986 0.7686 0.5986 0.7737
No log 5.1642 346 0.6613 0.8147 0.6613 0.8132
No log 5.1940 348 0.8159 0.7437 0.8159 0.9033
No log 5.2239 350 0.9212 0.6871 0.9212 0.9598
No log 5.2537 352 0.9345 0.6871 0.9345 0.9667
No log 5.2836 354 0.8600 0.6871 0.8600 0.9274
No log 5.3134 356 0.7214 0.8352 0.7214 0.8494
No log 5.3433 358 0.6429 0.8352 0.6429 0.8018
No log 5.3731 360 0.6128 0.8246 0.6128 0.7828
No log 5.4030 362 0.6446 0.8352 0.6446 0.8029
No log 5.4328 364 0.7029 0.8352 0.7029 0.8384
No log 5.4627 366 0.7551 0.8352 0.7551 0.8689
No log 5.4925 368 0.7559 0.8352 0.7559 0.8694
No log 5.5224 370 0.7940 0.7924 0.7940 0.8911
No log 5.5522 372 0.7699 0.8352 0.7699 0.8774
No log 5.5821 374 0.7199 0.8246 0.7199 0.8485
No log 5.6119 376 0.6982 0.8246 0.6982 0.8356
No log 5.6418 378 0.7158 0.8352 0.7158 0.8460
No log 5.6716 380 0.6793 0.7859 0.6793 0.8242
No log 5.7015 382 0.6710 0.7614 0.6710 0.8192
No log 5.7313 384 0.6920 0.7614 0.6920 0.8319
No log 5.7612 386 0.7525 0.7614 0.7525 0.8675
No log 5.7910 388 0.8459 0.7689 0.8459 0.9197
No log 5.8209 390 0.9068 0.7689 0.9068 0.9523
No log 5.8507 392 0.8668 0.7521 0.8668 0.9310
No log 5.8806 394 0.8394 0.7521 0.8394 0.9162
No log 5.9104 396 0.7792 0.7779 0.7791 0.8827
No log 5.9403 398 0.7683 0.7689 0.7683 0.8765
No log 5.9701 400 0.7469 0.7689 0.7469 0.8642
No log 6.0 402 0.7912 0.7924 0.7912 0.8895
No log 6.0299 404 0.8182 0.8440 0.8182 0.9045
No log 6.0597 406 0.8008 0.8105 0.8008 0.8949
No log 6.0896 408 0.7396 0.7596 0.7396 0.8600
No log 6.1194 410 0.6967 0.7686 0.6967 0.8347
No log 6.1493 412 0.6696 0.7686 0.6696 0.8183
No log 6.1791 414 0.6779 0.7686 0.6779 0.8234
No log 6.2090 416 0.7549 0.7614 0.7549 0.8689
No log 6.2388 418 0.8394 0.7879 0.8394 0.9162
No log 6.2687 420 0.8967 0.7713 0.8967 0.9470
No log 6.2985 422 0.8500 0.7713 0.8500 0.9220
No log 6.3284 424 0.7618 0.7351 0.7618 0.8728
No log 6.3582 426 0.6943 0.7614 0.6943 0.8333
No log 6.3881 428 0.6992 0.7614 0.6992 0.8362
No log 6.4179 430 0.7594 0.7351 0.7594 0.8714
No log 6.4478 432 0.8103 0.7879 0.8103 0.9002
No log 6.4776 434 0.9089 0.7521 0.9089 0.9534
No log 6.5075 436 1.0095 0.7157 1.0095 1.0048
No log 6.5373 438 1.0230 0.6866 1.0230 1.0114
No log 6.5672 440 0.9536 0.7239 0.9536 0.9765
No log 6.5970 442 0.8524 0.7713 0.8524 0.9232
No log 6.6269 444 0.7599 0.7983 0.7599 0.8717
No log 6.6567 446 0.6987 0.7435 0.6987 0.8359
No log 6.6866 448 0.7005 0.7614 0.7005 0.8370
No log 6.7164 450 0.7502 0.7614 0.7502 0.8662
No log 6.7463 452 0.7772 0.7614 0.7772 0.8816
No log 6.7761 454 0.7919 0.7614 0.7919 0.8899
No log 6.8060 456 0.7717 0.7614 0.7717 0.8785
No log 6.8358 458 0.7092 0.7614 0.7092 0.8422
No log 6.8657 460 0.6389 0.7614 0.6389 0.7993
No log 6.8955 462 0.5938 0.7686 0.5938 0.7706
No log 6.9254 464 0.5799 0.7686 0.5799 0.7615
No log 6.9552 466 0.5890 0.7686 0.5890 0.7674
No log 6.9851 468 0.6271 0.7614 0.6271 0.7919
No log 7.0149 470 0.6996 0.7614 0.6996 0.8364
No log 7.0448 472 0.7756 0.7614 0.7756 0.8807
No log 7.0746 474 0.8431 0.7704 0.8431 0.9182
No log 7.1045 476 0.8905 0.7948 0.8905 0.9437
No log 7.1343 478 0.9122 0.7696 0.9122 0.9551
No log 7.1642 480 0.8894 0.7948 0.8894 0.9431
No log 7.1940 482 0.8255 0.8123 0.8255 0.9086
No log 7.2239 484 0.7485 0.8123 0.7485 0.8652
No log 7.2537 486 0.7321 0.8123 0.7321 0.8556
No log 7.2836 488 0.7326 0.8022 0.7326 0.8559
No log 7.3134 490 0.7622 0.8123 0.7622 0.8731
No log 7.3433 492 0.7654 0.7614 0.7654 0.8749
No log 7.3731 494 0.7709 0.7196 0.7709 0.8780
No log 7.4030 496 0.8246 0.7689 0.8246 0.9081
No log 7.4328 498 0.8931 0.7689 0.8931 0.9450
0.4451 7.4627 500 0.9022 0.7779 0.9022 0.9498
0.4451 7.4925 502 0.8542 0.7196 0.8542 0.9242
0.4451 7.5224 504 0.7896 0.7196 0.7896 0.8886
0.4451 7.5522 506 0.7246 0.7614 0.7246 0.8512
0.4451 7.5821 508 0.6883 0.7614 0.6883 0.8296
0.4451 7.6119 510 0.6925 0.7614 0.6925 0.8322
0.4451 7.6418 512 0.7290 0.7614 0.7290 0.8538
0.4451 7.6716 514 0.7928 0.7196 0.7928 0.8904
0.4451 7.7015 516 0.8300 0.7196 0.8300 0.9110
0.4451 7.7313 518 0.8187 0.7196 0.8187 0.9048
0.4451 7.7612 520 0.8095 0.7196 0.8095 0.8997
0.4451 7.7910 522 0.8012 0.7196 0.8012 0.8951
0.4451 7.8209 524 0.7985 0.7196 0.7985 0.8936
0.4451 7.8507 526 0.7823 0.7196 0.7823 0.8845
0.4451 7.8806 528 0.7694 0.7196 0.7694 0.8772
0.4451 7.9104 530 0.7661 0.7614 0.7661 0.8753
0.4451 7.9403 532 0.7803 0.7704 0.7803 0.8834
0.4451 7.9701 534 0.8191 0.7704 0.8191 0.9050
0.4451 8.0 536 0.8722 0.7167 0.8722 0.9339
0.4451 8.0299 538 0.9025 0.7696 0.9025 0.9500
0.4451 8.0597 540 0.9049 0.7696 0.9049 0.9513
0.4451 8.0896 542 0.8687 0.7167 0.8687 0.9321
0.4451 8.1194 544 0.8372 0.6992 0.8372 0.9150
0.4451 8.1493 546 0.7863 0.7435 0.7863 0.8868
0.4451 8.1791 548 0.7469 0.7704 0.7469 0.8642
0.4451 8.2090 550 0.7342 0.7614 0.7342 0.8569
0.4451 8.2388 552 0.7212 0.7529 0.7212 0.8492
0.4451 8.2687 554 0.7234 0.7529 0.7234 0.8506
0.4451 8.2985 556 0.7277 0.7614 0.7277 0.8530
0.4451 8.3284 558 0.7437 0.7614 0.7437 0.8624
0.4451 8.3582 560 0.7735 0.7614 0.7735 0.8795
0.4451 8.3881 562 0.8150 0.7196 0.8150 0.9028
0.4451 8.4179 564 0.8727 0.7196 0.8727 0.9342
0.4451 8.4478 566 0.9049 0.7601 0.9049 0.9513
0.4451 8.4776 568 0.9001 0.7601 0.9001 0.9487
0.4451 8.5075 570 0.8673 0.7601 0.8673 0.9313
0.4451 8.5373 572 0.8229 0.7601 0.8229 0.9071
0.4451 8.5672 574 0.7746 0.7835 0.7746 0.8801
0.4451 8.5970 576 0.7276 0.8246 0.7276 0.8530
0.4451 8.6269 578 0.6905 0.7769 0.6905 0.8309
0.4451 8.6567 580 0.6798 0.7769 0.6798 0.8245
0.4451 8.6866 582 0.6929 0.7769 0.6929 0.8324
0.4451 8.7164 584 0.7135 0.7769 0.7135 0.8447
0.4451 8.7463 586 0.7303 0.7769 0.7303 0.8546
0.4451 8.7761 588 0.7544 0.7769 0.7544 0.8686
0.4451 8.8060 590 0.7655 0.7614 0.7655 0.8749
0.4451 8.8358 592 0.7658 0.7614 0.7658 0.8751
0.4451 8.8657 594 0.7747 0.7614 0.7747 0.8802
0.4451 8.8955 596 0.7865 0.7614 0.7865 0.8869
0.4451 8.9254 598 0.7975 0.7614 0.7975 0.8930
0.4451 8.9552 600 0.7919 0.7614 0.7919 0.8899
0.4451 8.9851 602 0.7854 0.7614 0.7854 0.8862
0.4451 9.0149 604 0.7845 0.7614 0.7845 0.8857
0.4451 9.0448 606 0.7839 0.7614 0.7839 0.8854
0.4451 9.0746 608 0.7767 0.7614 0.7767 0.8813
0.4451 9.1045 610 0.7840 0.7614 0.7840 0.8855
0.4451 9.1343 612 0.7919 0.7614 0.7919 0.8899
0.4451 9.1642 614 0.8055 0.8123 0.8055 0.8975
0.4451 9.1940 616 0.8227 0.8123 0.8227 0.9070
0.4451 9.2239 618 0.8279 0.7689 0.8279 0.9099
0.4451 9.2537 620 0.8382 0.7689 0.8382 0.9155
0.4451 9.2836 622 0.8500 0.7689 0.8500 0.9220
0.4451 9.3134 624 0.8560 0.7689 0.8560 0.9252
0.4451 9.3433 626 0.8560 0.7689 0.8560 0.9252
0.4451 9.3731 628 0.8587 0.7196 0.8587 0.9267
0.4451 9.4030 630 0.8644 0.7196 0.8644 0.9297
0.4451 9.4328 632 0.8582 0.7196 0.8582 0.9264
0.4451 9.4627 634 0.8486 0.7196 0.8486 0.9212
0.4451 9.4925 636 0.8412 0.7196 0.8412 0.9172
0.4451 9.5224 638 0.8304 0.7196 0.8304 0.9113
0.4451 9.5522 640 0.8219 0.7614 0.8219 0.9066
0.4451 9.5821 642 0.8196 0.7614 0.8196 0.9053
0.4451 9.6119 644 0.8221 0.7614 0.8221 0.9067
0.4451 9.6418 646 0.8239 0.7614 0.8239 0.9077
0.4451 9.6716 648 0.8236 0.7614 0.8236 0.9075
0.4451 9.7015 650 0.8219 0.7614 0.8219 0.9066
0.4451 9.7313 652 0.8222 0.7614 0.8222 0.9067
0.4451 9.7612 654 0.8193 0.7614 0.8193 0.9052
0.4451 9.7910 656 0.8165 0.7614 0.8165 0.9036
0.4451 9.8209 658 0.8130 0.7614 0.8130 0.9017
0.4451 9.8507 660 0.8098 0.7614 0.8098 0.8999
0.4451 9.8806 662 0.8076 0.7614 0.8076 0.8987
0.4451 9.9104 664 0.8059 0.7614 0.8059 0.8977
0.4451 9.9403 666 0.8049 0.7614 0.8049 0.8971
0.4451 9.9701 668 0.8041 0.7614 0.8041 0.8967
0.4451 10.0 670 0.8035 0.7614 0.8035 0.8964

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1