mit-b0_corm

This model is a fine-tuned version of nvidia/mit-b0 on an unspecified dataset, for three-class semantic segmentation (background, corm, damage). It achieves the following results on the evaluation set (an inference sketch follows the metrics):

  • Loss: 0.0433
  • Mean Iou: 0.9210
  • Mean Accuracy: 0.9571
  • Overall Accuracy: 0.9853
  • Accuracy Background: 0.9977
  • Accuracy Corm: 0.9360
  • Accuracy Damage: 0.9377
  • Iou Background: 0.9944
  • Iou Corm: 0.8762
  • Iou Damage: 0.8923
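
As a usage sketch (not part of the original card): the snippet below loads the checkpoint with the transformers SegFormer classes and produces a per-pixel class mask. The repo id mujerry/mit-b0_corm and the input file name are assumptions; check the checkpoint's config.id2label for the actual label mapping.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id and label ids ({0: background, 1: corm, 2: damage});
# adjust to the actual checkpoint and its config.id2label.
repo_id = "mujerry/mit-b0_corm"
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("corm.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample to the input size before argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```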

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 40
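
A minimal sketch of how this configuration maps onto transformers' TrainingArguments; the output directory is hypothetical, and the 20-step evaluation cadence is inferred from the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mit-b0_corm",      # hypothetical output directory
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=40,
    eval_strategy="steps",         # assumed: the results table evaluates every 20 steps
    eval_steps=20,
)
```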

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.933 | 0.6061 | 20 | 1.0299 | 0.3591 | 0.6054 | 0.6910 | 0.7236 | 0.1098 | 0.9827 | 0.7236 | 0.0867 | 0.2671 |
| 0.6505 | 1.2121 | 40 | 0.6909 | 0.6522 | 0.8240 | 0.9013 | 0.9328 | 0.5651 | 0.9740 | 0.9328 | 0.4509 | 0.5728 |
| 0.4133 | 1.8182 | 60 | 0.4184 | 0.7567 | 0.8872 | 0.9394 | 0.9609 | 0.7307 | 0.9701 | 0.9607 | 0.6218 | 0.6875 |
| 0.3299 | 2.4242 | 80 | 0.3451 | 0.8351 | 0.9306 | 0.9617 | 0.9751 | 0.8924 | 0.9243 | 0.9748 | 0.7569 | 0.7735 |
| 0.2594 | 3.0303 | 100 | 0.2506 | 0.8703 | 0.9412 | 0.9727 | 0.9862 | 0.8989 | 0.9384 | 0.9852 | 0.8019 | 0.8237 |
| 0.2253 | 3.6364 | 120 | 0.2006 | 0.8851 | 0.9403 | 0.9779 | 0.9939 | 0.8672 | 0.9599 | 0.9915 | 0.8207 | 0.8430 |
| 0.2222 | 4.2424 | 140 | 0.1654 | 0.8990 | 0.9490 | 0.9805 | 0.9946 | 0.9446 | 0.9079 | 0.9920 | 0.8438 | 0.8612 |
| 0.1347 | 4.8485 | 160 | 0.1413 | 0.9048 | 0.9508 | 0.9819 | 0.9956 | 0.9334 | 0.9234 | 0.9928 | 0.8526 | 0.8689 |
| 0.1366 | 5.4545 | 180 | 0.1155 | 0.9094 | 0.9516 | 0.9829 | 0.9966 | 0.9258 | 0.9325 | 0.9933 | 0.8583 | 0.8765 |
| 0.1121 | 6.0606 | 200 | 0.1086 | 0.8938 | 0.9447 | 0.9801 | 0.9961 | 0.9628 | 0.8753 | 0.9933 | 0.8392 | 0.8487 |
| 0.0982 | 6.6667 | 220 | 0.0963 | 0.9115 | 0.9524 | 0.9835 | 0.9972 | 0.9374 | 0.9227 | 0.9938 | 0.8626 | 0.8780 |
| 0.0993 | 7.2727 | 240 | 0.0892 | 0.9094 | 0.9513 | 0.9832 | 0.9968 | 0.9001 | 0.9571 | 0.9940 | 0.8570 | 0.8773 |
| 0.0813 | 7.8788 | 260 | 0.0842 | 0.9127 | 0.9543 | 0.9837 | 0.9966 | 0.9380 | 0.9281 | 0.9939 | 0.8643 | 0.8798 |
| 0.1059 | 8.4848 | 280 | 0.0774 | 0.9152 | 0.9541 | 0.9842 | 0.9973 | 0.9258 | 0.9391 | 0.9940 | 0.8673 | 0.8843 |
| 0.082 | 9.0909 | 300 | 0.0729 | 0.9159 | 0.9541 | 0.9843 | 0.9975 | 0.9294 | 0.9355 | 0.9940 | 0.8681 | 0.8854 |
| 0.0725 | 9.6970 | 320 | 0.0692 | 0.9162 | 0.9544 | 0.9844 | 0.9975 | 0.9247 | 0.9411 | 0.9941 | 0.8686 | 0.8861 |
| 0.0814 | 10.3030 | 340 | 0.0687 | 0.9161 | 0.9541 | 0.9844 | 0.9975 | 0.9155 | 0.9492 | 0.9942 | 0.8675 | 0.8865 |
| 0.076 | 10.9091 | 360 | 0.0640 | 0.9157 | 0.9555 | 0.9843 | 0.9968 | 0.9219 | 0.9479 | 0.9941 | 0.8680 | 0.8849 |
| 0.07 | 11.5152 | 380 | 0.0633 | 0.9166 | 0.9553 | 0.9845 | 0.9973 | 0.9375 | 0.9310 | 0.9941 | 0.8698 | 0.8859 |
| 0.0674 | 12.1212 | 400 | 0.0611 | 0.9176 | 0.9549 | 0.9847 | 0.9977 | 0.9217 | 0.9453 | 0.9943 | 0.8704 | 0.8881 |
| 0.0638 | 12.7273 | 420 | 0.0601 | 0.9116 | 0.9522 | 0.9836 | 0.9977 | 0.9529 | 0.9059 | 0.9941 | 0.8641 | 0.8768 |
| 0.0566 | 13.3333 | 440 | 0.0582 | 0.9176 | 0.9561 | 0.9847 | 0.9972 | 0.9322 | 0.9387 | 0.9943 | 0.8714 | 0.8872 |
| 0.0582 | 13.9394 | 460 | 0.0614 | 0.9077 | 0.9502 | 0.9829 | 0.9976 | 0.9583 | 0.8948 | 0.9941 | 0.8588 | 0.8700 |
| 0.0555 | 14.5455 | 480 | 0.0561 | 0.9146 | 0.9534 | 0.9841 | 0.9978 | 0.9481 | 0.9142 | 0.9941 | 0.8679 | 0.8817 |
| 0.053 | 15.1515 | 500 | 0.0540 | 0.9182 | 0.9551 | 0.9848 | 0.9977 | 0.9185 | 0.9492 | 0.9943 | 0.8707 | 0.8895 |
| 0.059 | 15.7576 | 520 | 0.0549 | 0.9180 | 0.9565 | 0.9848 | 0.9970 | 0.9248 | 0.9478 | 0.9943 | 0.8711 | 0.8887 |
| 0.0484 | 16.3636 | 540 | 0.0529 | 0.9177 | 0.9563 | 0.9847 | 0.9973 | 0.9405 | 0.9311 | 0.9943 | 0.8721 | 0.8866 |
| 0.0559 | 16.9697 | 560 | 0.0510 | 0.9192 | 0.9565 | 0.9850 | 0.9974 | 0.9268 | 0.9453 | 0.9943 | 0.8729 | 0.8904 |
| 0.0542 | 17.5758 | 580 | 0.0512 | 0.9190 | 0.9569 | 0.9850 | 0.9973 | 0.9351 | 0.9382 | 0.9944 | 0.8733 | 0.8894 |
| 0.0451 | 18.1818 | 600 | 0.0505 | 0.9184 | 0.9557 | 0.9848 | 0.9977 | 0.9428 | 0.9265 | 0.9943 | 0.8729 | 0.8880 |
| 0.05 | 18.7879 | 620 | 0.0499 | 0.9178 | 0.9542 | 0.9848 | 0.9979 | 0.9098 | 0.9549 | 0.9943 | 0.8691 | 0.8899 |
| 0.063 | 19.3939 | 640 | 0.0491 | 0.9190 | 0.9560 | 0.9850 | 0.9975 | 0.9221 | 0.9483 | 0.9943 | 0.8723 | 0.8904 |
| 0.0484 | 20.0 | 660 | 0.0501 | 0.9185 | 0.9569 | 0.9849 | 0.9972 | 0.9427 | 0.9308 | 0.9944 | 0.8732 | 0.8880 |
| 0.0527 | 20.6061 | 680 | 0.0492 | 0.9186 | 0.9561 | 0.9849 | 0.9976 | 0.9430 | 0.9276 | 0.9943 | 0.8732 | 0.8884 |
| 0.0583 | 21.2121 | 700 | 0.0476 | 0.9195 | 0.9563 | 0.9851 | 0.9976 | 0.9208 | 0.9506 | 0.9944 | 0.8730 | 0.8911 |
| 0.0557 | 21.8182 | 720 | 0.0488 | 0.9188 | 0.9565 | 0.9850 | 0.9973 | 0.9191 | 0.9531 | 0.9945 | 0.8723 | 0.8896 |
| 0.0458 | 22.4242 | 740 | 0.0481 | 0.9194 | 0.9568 | 0.9851 | 0.9973 | 0.9242 | 0.9489 | 0.9944 | 0.8729 | 0.8909 |
| 0.042 | 23.0303 | 760 | 0.0472 | 0.9202 | 0.9570 | 0.9852 | 0.9975 | 0.9326 | 0.9409 | 0.9944 | 0.8749 | 0.8911 |
| 0.0459 | 23.6364 | 780 | 0.0468 | 0.9191 | 0.9565 | 0.9850 | 0.9976 | 0.9423 | 0.9295 | 0.9944 | 0.8740 | 0.8889 |
| 0.0491 | 24.2424 | 800 | 0.0464 | 0.9204 | 0.9568 | 0.9852 | 0.9977 | 0.9361 | 0.9366 | 0.9944 | 0.8753 | 0.8914 |
| 0.0548 | 24.8485 | 820 | 0.0454 | 0.9201 | 0.9565 | 0.9852 | 0.9976 | 0.9244 | 0.9475 | 0.9944 | 0.8740 | 0.8917 |
| 0.0447 | 25.4545 | 840 | 0.0473 | 0.9176 | 0.9558 | 0.9847 | 0.9976 | 0.9477 | 0.9222 | 0.9944 | 0.8723 | 0.8863 |
| 0.0457 | 26.0606 | 860 | 0.0468 | 0.9203 | 0.9567 | 0.9852 | 0.9976 | 0.9270 | 0.9456 | 0.9944 | 0.8745 | 0.8922 |
| 0.0468 | 26.6667 | 880 | 0.0454 | 0.9201 | 0.9572 | 0.9852 | 0.9974 | 0.9403 | 0.9341 | 0.9944 | 0.8753 | 0.8905 |
| 0.0433 | 27.2727 | 900 | 0.0452 | 0.9208 | 0.9563 | 0.9853 | 0.9980 | 0.9339 | 0.9371 | 0.9943 | 0.8759 | 0.8923 |
| 0.0438 | 27.8788 | 920 | 0.0452 | 0.9208 | 0.9574 | 0.9853 | 0.9975 | 0.9352 | 0.9396 | 0.9944 | 0.8760 | 0.8920 |
| 0.0446 | 28.4848 | 940 | 0.0447 | 0.9210 | 0.9568 | 0.9853 | 0.9978 | 0.9349 | 0.9377 | 0.9943 | 0.8760 | 0.8926 |
| 0.0492 | 29.0909 | 960 | 0.0452 | 0.9211 | 0.9568 | 0.9853 | 0.9978 | 0.9352 | 0.9374 | 0.9943 | 0.8762 | 0.8928 |
| 0.0481 | 29.6970 | 980 | 0.0456 | 0.9195 | 0.9567 | 0.9851 | 0.9976 | 0.9443 | 0.9283 | 0.9944 | 0.8747 | 0.8893 |
| 0.0405 | 30.3030 | 1000 | 0.0447 | 0.9206 | 0.9574 | 0.9853 | 0.9975 | 0.9391 | 0.9355 | 0.9944 | 0.8758 | 0.8916 |
| 0.0505 | 30.9091 | 1020 | 0.0443 | 0.9210 | 0.9570 | 0.9853 | 0.9978 | 0.9370 | 0.9364 | 0.9944 | 0.8763 | 0.8923 |
| 0.047 | 31.5152 | 1040 | 0.0450 | 0.9204 | 0.9568 | 0.9853 | 0.9976 | 0.9223 | 0.9505 | 0.9945 | 0.8744 | 0.8923 |
| 0.0548 | 32.1212 | 1060 | 0.0452 | 0.9192 | 0.9561 | 0.9850 | 0.9978 | 0.9442 | 0.9261 | 0.9944 | 0.8744 | 0.8889 |
| 0.0445 | 32.7273 | 1080 | 0.0442 | 0.9208 | 0.9573 | 0.9853 | 0.9975 | 0.9320 | 0.9426 | 0.9944 | 0.8758 | 0.8921 |
| 0.0539 | 33.3333 | 1100 | 0.0435 | 0.9208 | 0.9571 | 0.9853 | 0.9976 | 0.9359 | 0.9379 | 0.9944 | 0.8758 | 0.8921 |
| 0.0383 | 33.9394 | 1120 | 0.0459 | 0.9171 | 0.9549 | 0.9846 | 0.9979 | 0.9493 | 0.9175 | 0.9943 | 0.8716 | 0.8853 |
| 0.0478 | 34.5455 | 1140 | 0.0443 | 0.9203 | 0.9572 | 0.9852 | 0.9974 | 0.9246 | 0.9496 | 0.9945 | 0.8748 | 0.8916 |
| 0.0432 | 35.1515 | 1160 | 0.0442 | 0.9210 | 0.9571 | 0.9853 | 0.9977 | 0.9349 | 0.9388 | 0.9944 | 0.8762 | 0.8924 |
| 0.0468 | 35.7576 | 1180 | 0.0439 | 0.9208 | 0.9572 | 0.9853 | 0.9976 | 0.9371 | 0.9368 | 0.9944 | 0.8761 | 0.8919 |
| 0.0475 | 36.3636 | 1200 | 0.0443 | 0.9209 | 0.9571 | 0.9853 | 0.9977 | 0.9371 | 0.9364 | 0.9944 | 0.8762 | 0.8921 |
| 0.0388 | 36.9697 | 1220 | 0.0436 | 0.9208 | 0.9573 | 0.9853 | 0.9976 | 0.9371 | 0.9373 | 0.9944 | 0.8761 | 0.8919 |
| 0.0468 | 37.5758 | 1240 | 0.0431 | 0.9208 | 0.9574 | 0.9853 | 0.9975 | 0.9343 | 0.9405 | 0.9944 | 0.8760 | 0.8921 |
| 0.0426 | 38.1818 | 1260 | 0.0445 | 0.9205 | 0.9570 | 0.9852 | 0.9977 | 0.9415 | 0.9318 | 0.9944 | 0.8758 | 0.8912 |
| 0.0549 | 38.7879 | 1280 | 0.0436 | 0.9209 | 0.9571 | 0.9853 | 0.9977 | 0.9373 | 0.9362 | 0.9944 | 0.8761 | 0.8921 |
| 0.045 | 39.3939 | 1300 | 0.0438 | 0.9208 | 0.9573 | 0.9853 | 0.9976 | 0.9381 | 0.9362 | 0.9944 | 0.8760 | 0.8919 |
| 0.0287 | 40.0 | 1320 | 0.0433 | 0.9210 | 0.9571 | 0.9853 | 0.9977 | 0.9360 | 0.9377 | 0.9944 | 0.8762 | 0.8923 |
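
The metric columns above match the output keys of the mean_iou metric from the Hugging Face evaluate library. A minimal sketch of computing them, with toy masks standing in for real predictions and annotations (ignore_index=255 is an assumption about the dataset's convention):

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy placeholders; in practice use the predicted mask (e.g. pred_mask from the
# inference sketch above) and the ground-truth annotation mask.
pred_mask = np.random.randint(0, 3, (8, 8))
label_mask = np.random.randint(0, 3, (8, 8))

results = metric.compute(
    predictions=[pred_mask],
    references=[label_mask],
    num_labels=3,        # background, corm, damage
    ignore_index=255,    # assumed "void" label id
    reduce_labels=False,
)
# results contains mean_iou, mean_accuracy, overall_accuracy,
# plus per_category_iou and per_category_accuracy arrays.
print(results["mean_iou"], results["per_category_iou"])
```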

Framework versions

  • Transformers 4.44.1
  • Pytorch 2.6.0+cpu
  • Datasets 2.21.0
  • Tokenizers 0.19.1