# segformer-b2-finetuned-ade-512-512_corm
This model is a fine-tuned version of nvidia/segformer-b2-finetuned-ade-512-512 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0415
- Mean Iou: 0.9264
- Mean Accuracy: 0.9599
- Overall Accuracy: 0.9860
- Accuracy Background: 0.9978
- Accuracy Corm: 0.9362
- Accuracy Damage: 0.9456
- Iou Background: 0.9942
- Iou Corm: 0.8799
- Iou Damage: 0.9052
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
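The original training script is not included with this card. The sketch below shows one way to express the listed hyperparameters as `transformers.TrainingArguments`; the output directory is a placeholder, and the Adam settings quoted above (betas=(0.9, 0.999), epsilon=1e-08) correspond to the Trainer's default AdamW values.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b2-finetuned-ade-512-512_corm",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=40,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    # Adam betas/epsilon from the list above (Trainer's default AdamW values).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```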
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
---|---|---|---|---|---|---|---|---|---|---|---|---|
0.8746 | 0.9524 | 20 | 0.8170 | 0.4637 | 0.6489 | 0.8404 | 0.9133 | 0.0390 | 0.9944 | 0.9132 | 0.0327 | 0.4451 |
0.61 | 1.9048 | 40 | 0.4500 | 0.7608 | 0.8748 | 0.9451 | 0.9731 | 0.6946 | 0.9566 | 0.9730 | 0.6023 | 0.7071 |
0.3681 | 2.8571 | 60 | 0.2802 | 0.8597 | 0.9314 | 0.9711 | 0.9879 | 0.8621 | 0.9443 | 0.9873 | 0.7716 | 0.8201 |
0.2433 | 3.8095 | 80 | 0.2201 | 0.8866 | 0.9456 | 0.9774 | 0.9916 | 0.9159 | 0.9293 | 0.9904 | 0.8181 | 0.8513 |
0.1669 | 4.7619 | 100 | 0.1607 | 0.8891 | 0.9431 | 0.9783 | 0.9930 | 0.8723 | 0.9640 | 0.9915 | 0.8178 | 0.8580 |
0.1484 | 5.7143 | 120 | 0.1250 | 0.9017 | 0.9489 | 0.9812 | 0.9964 | 0.9423 | 0.9080 | 0.9932 | 0.8432 | 0.8688 |
0.1126 | 6.6667 | 140 | 0.1024 | 0.9092 | 0.9524 | 0.9827 | 0.9962 | 0.9247 | 0.9363 | 0.9934 | 0.8539 | 0.8804 |
0.0909 | 7.6190 | 160 | 0.0932 | 0.9017 | 0.9492 | 0.9813 | 0.9968 | 0.9563 | 0.8944 | 0.9935 | 0.8445 | 0.8670 |
0.0994 | 8.5714 | 180 | 0.0803 | 0.9122 | 0.9527 | 0.9833 | 0.9967 | 0.9118 | 0.9495 | 0.9936 | 0.8568 | 0.8861 |
0.0768 | 9.5238 | 200 | 0.0716 | 0.9147 | 0.9533 | 0.9838 | 0.9975 | 0.9247 | 0.9376 | 0.9937 | 0.8615 | 0.8889 |
0.0749 | 10.4762 | 220 | 0.0671 | 0.9177 | 0.9550 | 0.9844 | 0.9973 | 0.9191 | 0.9487 | 0.9939 | 0.8661 | 0.8932 |
0.0663 | 11.4286 | 240 | 0.0668 | 0.9097 | 0.9528 | 0.9829 | 0.9973 | 0.9528 | 0.9083 | 0.9939 | 0.8558 | 0.8795 |
0.0725 | 12.3810 | 260 | 0.0608 | 0.9189 | 0.9554 | 0.9847 | 0.9974 | 0.9123 | 0.9564 | 0.9940 | 0.8677 | 0.8951 |
0.0594 | 13.3333 | 280 | 0.0588 | 0.9167 | 0.9533 | 0.9843 | 0.9975 | 0.9000 | 0.9625 | 0.9940 | 0.8622 | 0.8940 |
0.062 | 14.2857 | 300 | 0.0552 | 0.9201 | 0.9565 | 0.9849 | 0.9972 | 0.9170 | 0.9553 | 0.9941 | 0.8691 | 0.8970 |
0.0535 | 15.2381 | 320 | 0.0543 | 0.9195 | 0.9559 | 0.9848 | 0.9972 | 0.9078 | 0.9626 | 0.9942 | 0.8683 | 0.8962 |
0.0555 | 16.1905 | 340 | 0.0517 | 0.9212 | 0.9566 | 0.9851 | 0.9973 | 0.9113 | 0.9612 | 0.9942 | 0.8704 | 0.8990 |
0.0553 | 17.1429 | 360 | 0.0513 | 0.9198 | 0.9553 | 0.9849 | 0.9975 | 0.9047 | 0.9638 | 0.9942 | 0.8679 | 0.8974 |
0.0572 | 18.0952 | 380 | 0.0501 | 0.9219 | 0.9563 | 0.9853 | 0.9977 | 0.9108 | 0.9603 | 0.9942 | 0.8713 | 0.9002 |
0.0503 | 19.0476 | 400 | 0.0483 | 0.9245 | 0.9573 | 0.9856 | 0.9981 | 0.9212 | 0.9525 | 0.9940 | 0.8757 | 0.9037 |
0.0539 | 20.0 | 420 | 0.0474 | 0.9245 | 0.9593 | 0.9857 | 0.9974 | 0.9309 | 0.9497 | 0.9942 | 0.8769 | 0.9024 |
0.0542 | 20.9524 | 440 | 0.0484 | 0.9202 | 0.9575 | 0.9849 | 0.9978 | 0.9511 | 0.9235 | 0.9941 | 0.8718 | 0.8949 |
0.033 | 21.9048 | 460 | 0.0478 | 0.9209 | 0.9576 | 0.9850 | 0.9977 | 0.9464 | 0.9287 | 0.9941 | 0.8726 | 0.8961 |
0.0421 | 22.8571 | 480 | 0.0452 | 0.9247 | 0.9591 | 0.9857 | 0.9974 | 0.9244 | 0.9555 | 0.9942 | 0.8766 | 0.9033 |
0.0472 | 23.8095 | 500 | 0.0455 | 0.9243 | 0.9583 | 0.9857 | 0.9976 | 0.9231 | 0.9543 | 0.9942 | 0.8759 | 0.9028 |
0.0381 | 24.7619 | 520 | 0.0456 | 0.9233 | 0.9570 | 0.9855 | 0.9977 | 0.9109 | 0.9625 | 0.9942 | 0.8732 | 0.9026 |
0.0486 | 25.7143 | 540 | 0.0444 | 0.9249 | 0.9593 | 0.9857 | 0.9978 | 0.9408 | 0.9394 | 0.9941 | 0.8780 | 0.9026 |
0.0501 | 26.6667 | 560 | 0.0458 | 0.9208 | 0.9579 | 0.9850 | 0.9977 | 0.9508 | 0.9252 | 0.9942 | 0.8725 | 0.8957 |
0.0343 | 27.6190 | 580 | 0.0436 | 0.9251 | 0.9594 | 0.9857 | 0.9978 | 0.9413 | 0.9391 | 0.9941 | 0.8782 | 0.9031 |
0.0407 | 28.5714 | 600 | 0.0434 | 0.9251 | 0.9597 | 0.9858 | 0.9977 | 0.9416 | 0.9396 | 0.9942 | 0.8784 | 0.9028 |
0.0419 | 29.5238 | 620 | 0.0445 | 0.9221 | 0.9586 | 0.9852 | 0.9977 | 0.9496 | 0.9285 | 0.9942 | 0.8743 | 0.8978 |
0.0506 | 30.4762 | 640 | 0.0425 | 0.9262 | 0.9593 | 0.9860 | 0.9978 | 0.9311 | 0.9491 | 0.9942 | 0.8791 | 0.9053 |
0.0422 | 31.4286 | 660 | 0.0424 | 0.9262 | 0.9595 | 0.9860 | 0.9977 | 0.9267 | 0.9540 | 0.9942 | 0.8790 | 0.9054 |
0.0362 | 32.3810 | 680 | 0.0425 | 0.9258 | 0.9600 | 0.9859 | 0.9977 | 0.9402 | 0.9421 | 0.9942 | 0.8793 | 0.9039 |
0.0437 | 33.3333 | 700 | 0.0424 | 0.9262 | 0.9599 | 0.9860 | 0.9978 | 0.9377 | 0.9441 | 0.9942 | 0.8796 | 0.9047 |
0.0363 | 34.2857 | 720 | 0.0415 | 0.9264 | 0.9602 | 0.9860 | 0.9976 | 0.9367 | 0.9463 | 0.9942 | 0.8800 | 0.9049 |
0.039 | 35.2381 | 740 | 0.0421 | 0.9267 | 0.9596 | 0.9861 | 0.9978 | 0.9290 | 0.9521 | 0.9942 | 0.8798 | 0.9060 |
0.0425 | 36.1905 | 760 | 0.0418 | 0.9259 | 0.9598 | 0.9859 | 0.9978 | 0.9391 | 0.9426 | 0.9942 | 0.8794 | 0.9040 |
0.0462 | 37.1429 | 780 | 0.0417 | 0.9267 | 0.9600 | 0.9861 | 0.9976 | 0.9311 | 0.9513 | 0.9942 | 0.8801 | 0.9057 |
0.0466 | 38.0952 | 800 | 0.0416 | 0.9261 | 0.9599 | 0.9860 | 0.9978 | 0.9392 | 0.9427 | 0.9942 | 0.8795 | 0.9045 |
0.0428 | 39.0476 | 820 | 0.0414 | 0.9266 | 0.9598 | 0.9861 | 0.9978 | 0.9323 | 0.9494 | 0.9942 | 0.8800 | 0.9057 |
0.04 | 40.0 | 840 | 0.0415 | 0.9264 | 0.9599 | 0.9860 | 0.9978 | 0.9362 | 0.9456 | 0.9942 | 0.8799 | 0.9052 |
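The metrics in the table (mean IoU, per-class IoU and accuracy, overall accuracy) are the standard semantic-segmentation metrics. A minimal sketch of computing them with the `evaluate` library's `mean_iou` metric follows; the label ids (0 = background, 1 = corm, 2 = damage) are assumed from the metric names above and are not confirmed by this card.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Placeholder masks; in practice these are the predicted and ground-truth
# (H, W) integer label maps for each evaluation image.
pred_mask = np.zeros((512, 512), dtype=np.int64)
gt_mask = np.zeros((512, 512), dtype=np.int64)

results = metric.compute(
    predictions=[pred_mask],
    references=[gt_mask],
    num_labels=3,          # background, corm, damage (assumed order)
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```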
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1