# segformer-finetuned-4ss1st3r_s3gs3m_24Jan-10k-steps
This model is a fine-tuned version of nvidia/mit-b0 on the blzncz/4ss1st3r_s3gs3m_24Jan dataset. It achieves the following results on the evaluation set:
- Loss: 0.1305
- Mean Iou: 0.6564
- Mean Accuracy: 0.8562
- Overall Accuracy: 0.9780
- Accuracy Bg: nan
- Accuracy Fallo cohesivo: 0.9896
- Accuracy Fallo malla: 0.9270
- Accuracy Fallo adhesivo: 0.9478
- Accuracy Fallo burbuja: 0.5603
- Iou Bg: 0.0
- Iou Fallo cohesivo: 0.9749
- Iou Fallo malla: 0.8458
- Iou Fallo adhesivo: 0.9324
- Iou Fallo burbuja: 0.5290
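As a sanity check, the headline means can be reproduced from the per-class numbers above: Mean IoU averages over all five classes, so the background class with IoU 0.0 drags it well below the other metrics, while Mean Accuracy skips the `nan` background entry. A minimal sketch:

```python
import math

# Per-class scores copied from the evaluation results above.
# Order: Bg, Fallo cohesivo, Fallo malla, Fallo adhesivo, Fallo burbuja
iou = [0.0, 0.9749, 0.8458, 0.9324, 0.5290]
acc = [math.nan, 0.9896, 0.9270, 0.9478, 0.5603]

# Mean IoU averages over every class, including the empty Bg class.
mean_iou = sum(iou) / len(iou)

# Mean accuracy ignores nan entries (classes absent from the ground truth).
valid = [a for a in acc if not math.isnan(a)]
mean_acc = sum(valid) / len(valid)

print(round(mean_iou, 4))  # 0.6564
print(round(mean_acc, 4))  # 0.8562
```

This also explains why Overall Accuracy (0.9780) is so much higher than Mean IoU: it is pixel-weighted, so the rare Fallo burbuja class barely affects it.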
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
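With `lr_scheduler_type: polynomial` and no warmup listed, the learning rate decays from 6e-05 toward a small floor over the 10,000 steps. The sketch below shows the schedule shape in plain Python; `power=1.0` (linear decay) and `lr_end=1e-7` are the `transformers` defaults, assumed here rather than read from the actual training config:

```python
def polynomial_lr(step: int,
                  lr_init: float = 6e-5,
                  lr_end: float = 1e-7,   # assumed transformers default
                  power: float = 1.0,     # assumed default: linear decay
                  total_steps: int = 10_000) -> float:
    """Polynomial decay from lr_init to lr_end over total_steps."""
    if step >= total_steps:
        return lr_end
    remaining = 1 - step / total_steps
    return (lr_init - lr_end) * remaining ** power + lr_end

print(polynomial_lr(0))       # initial learning rate
print(polynomial_lr(5_000))   # halfway through training
print(polynomial_lr(10_000))  # decayed to the lr_end floor
```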
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Bg | Accuracy Fallo cohesivo | Accuracy Fallo malla | Accuracy Fallo adhesivo | Accuracy Fallo burbuja | Iou Bg | Iou Fallo cohesivo | Iou Fallo malla | Iou Fallo adhesivo | Iou Fallo burbuja |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.3639 | 1.0 | 193 | 0.1583 | 0.6076 | 0.8441 | 0.9607 | nan | 0.9660 | 0.9617 | 0.9644 | 0.4844 | 0.0 | 0.9553 | 0.7294 | 0.9301 | 0.4231 |
0.1148 | 2.0 | 386 | 0.0991 | 0.6189 | 0.8025 | 0.9754 | nan | 0.9912 | 0.9045 | 0.9417 | 0.3725 | 0.0 | 0.9723 | 0.8404 | 0.9283 | 0.3534 |
0.0937 | 3.0 | 579 | 0.1414 | 0.5848 | 0.8155 | 0.9554 | nan | 0.9606 | 0.9630 | 0.9707 | 0.3675 | 0.0 | 0.9487 | 0.6791 | 0.9442 | 0.3519 |
0.0827 | 4.0 | 772 | 0.1028 | 0.6390 | 0.8484 | 0.9747 | nan | 0.9831 | 0.9530 | 0.9640 | 0.4936 | 0.0 | 0.9714 | 0.8231 | 0.9388 | 0.4617 |
0.0735 | 5.0 | 965 | 0.0948 | 0.6425 | 0.8423 | 0.9777 | nan | 0.9875 | 0.9487 | 0.9594 | 0.4737 | 0.0 | 0.9745 | 0.8484 | 0.9415 | 0.4479 |
0.0716 | 6.0 | 1158 | 0.0968 | 0.6638 | 0.8622 | 0.9804 | nan | 0.9936 | 0.8987 | 0.9579 | 0.5985 | 0.0 | 0.9777 | 0.8654 | 0.9403 | 0.5355 |
0.0692 | 7.0 | 1351 | 0.1123 | 0.6389 | 0.8535 | 0.9718 | nan | 0.9804 | 0.9425 | 0.9604 | 0.5307 | 0.0 | 0.9678 | 0.7878 | 0.9403 | 0.4984 |
0.0718 | 8.0 | 1544 | 0.1097 | 0.6424 | 0.8668 | 0.9703 | nan | 0.9770 | 0.9520 | 0.9642 | 0.5738 | 0.0 | 0.9663 | 0.7792 | 0.9423 | 0.5243 |
0.0613 | 9.0 | 1737 | 0.1212 | 0.6341 | 0.8625 | 0.9669 | nan | 0.9735 | 0.9412 | 0.9721 | 0.5634 | 0.0 | 0.9621 | 0.7447 | 0.9430 | 0.5208 |
0.06 | 10.0 | 1930 | 0.0983 | 0.6724 | 0.8945 | 0.9793 | nan | 0.9875 | 0.9335 | 0.9682 | 0.6889 | 0.0 | 0.9765 | 0.8490 | 0.9461 | 0.5905 |
0.0593 | 11.0 | 2123 | 0.1104 | 0.6577 | 0.8803 | 0.9743 | nan | 0.9830 | 0.9249 | 0.9670 | 0.6462 | 0.0 | 0.9709 | 0.8028 | 0.9419 | 0.5729 |
0.056 | 12.0 | 2316 | 0.1029 | 0.6589 | 0.8829 | 0.9755 | nan | 0.9833 | 0.9349 | 0.9712 | 0.6420 | 0.0 | 0.9721 | 0.8170 | 0.9399 | 0.5655 |
0.0547 | 13.0 | 2509 | 0.1037 | 0.6613 | 0.8944 | 0.9746 | nan | 0.9815 | 0.9406 | 0.9680 | 0.6877 | 0.0 | 0.9712 | 0.8089 | 0.9434 | 0.5832 |
0.0538 | 14.0 | 2702 | 0.1342 | 0.6338 | 0.8750 | 0.9625 | nan | 0.9677 | 0.9470 | 0.9647 | 0.6204 | 0.0 | 0.9570 | 0.7080 | 0.9412 | 0.5627 |
0.052 | 15.0 | 2895 | 0.0961 | 0.6525 | 0.8507 | 0.9787 | nan | 0.9894 | 0.9292 | 0.9656 | 0.5187 | 0.0 | 0.9758 | 0.8514 | 0.9439 | 0.4915 |
0.0489 | 16.0 | 3088 | 0.1093 | 0.6464 | 0.8626 | 0.9725 | nan | 0.9812 | 0.9345 | 0.9639 | 0.5708 | 0.0 | 0.9688 | 0.7900 | 0.9440 | 0.5290 |
0.0478 | 17.0 | 3281 | 0.1053 | 0.6503 | 0.8574 | 0.9760 | nan | 0.9858 | 0.9300 | 0.9673 | 0.5465 | 0.0 | 0.9726 | 0.8239 | 0.9411 | 0.5138 |
0.048 | 18.0 | 3474 | 0.1314 | 0.6416 | 0.8884 | 0.9644 | nan | 0.9691 | 0.9517 | 0.9642 | 0.6688 | 0.0 | 0.9591 | 0.7232 | 0.9415 | 0.5842 |
0.0474 | 19.0 | 3667 | 0.1197 | 0.6473 | 0.8559 | 0.9743 | nan | 0.9842 | 0.9344 | 0.9557 | 0.5493 | 0.0 | 0.9707 | 0.8067 | 0.9394 | 0.5196 |
0.0456 | 20.0 | 3860 | 0.1149 | 0.6587 | 0.8578 | 0.9788 | nan | 0.9905 | 0.9241 | 0.9503 | 0.5665 | 0.0 | 0.9759 | 0.8513 | 0.9344 | 0.5321 |
0.044 | 21.0 | 4053 | 0.1183 | 0.6574 | 0.8612 | 0.9774 | nan | 0.9885 | 0.9280 | 0.9487 | 0.5794 | 0.0 | 0.9743 | 0.8367 | 0.9345 | 0.5413 |
0.0431 | 22.0 | 4246 | 0.1326 | 0.6425 | 0.8599 | 0.9711 | nan | 0.9795 | 0.9405 | 0.9595 | 0.5601 | 0.0 | 0.9670 | 0.7783 | 0.9384 | 0.5291 |
0.0446 | 23.0 | 4439 | 0.1253 | 0.6535 | 0.8678 | 0.9743 | nan | 0.9833 | 0.9309 | 0.9635 | 0.5933 | 0.0 | 0.9706 | 0.8007 | 0.9427 | 0.5535 |
0.0427 | 24.0 | 4632 | 0.1075 | 0.6568 | 0.8602 | 0.9771 | nan | 0.9882 | 0.9229 | 0.9543 | 0.5755 | 0.0 | 0.9739 | 0.8342 | 0.9379 | 0.5379 |
0.0417 | 25.0 | 4825 | 0.1250 | 0.6443 | 0.8559 | 0.9723 | nan | 0.9820 | 0.9337 | 0.9542 | 0.5539 | 0.0 | 0.9684 | 0.7904 | 0.9375 | 0.5250 |
0.0402 | 26.0 | 5018 | 0.1206 | 0.6518 | 0.8497 | 0.9775 | nan | 0.9892 | 0.9236 | 0.9536 | 0.5324 | 0.0 | 0.9744 | 0.8373 | 0.9383 | 0.5089 |
0.0403 | 27.0 | 5211 | 0.1164 | 0.6565 | 0.8688 | 0.9755 | nan | 0.9848 | 0.9382 | 0.9531 | 0.5991 | 0.0 | 0.9723 | 0.8183 | 0.9378 | 0.5540 |
0.0405 | 28.0 | 5404 | 0.1091 | 0.6586 | 0.8505 | 0.9799 | nan | 0.9926 | 0.9177 | 0.9530 | 0.5389 | 0.0 | 0.9773 | 0.8650 | 0.9381 | 0.5128 |
0.0384 | 29.0 | 5597 | 0.1304 | 0.6504 | 0.8470 | 0.9781 | nan | 0.9893 | 0.9365 | 0.9508 | 0.5112 | 0.0 | 0.9751 | 0.8477 | 0.9365 | 0.4926 |
0.0374 | 30.0 | 5790 | 0.1095 | 0.6585 | 0.8605 | 0.9783 | nan | 0.9891 | 0.9323 | 0.9507 | 0.5698 | 0.0 | 0.9754 | 0.8469 | 0.9358 | 0.5345 |
0.0378 | 31.0 | 5983 | 0.1245 | 0.6558 | 0.8553 | 0.9780 | nan | 0.9896 | 0.9237 | 0.9539 | 0.5540 | 0.0 | 0.9750 | 0.8435 | 0.9353 | 0.5254 |
0.0367 | 32.0 | 6176 | 0.1288 | 0.6504 | 0.8637 | 0.9737 | nan | 0.9828 | 0.9386 | 0.9555 | 0.5778 | 0.0 | 0.9700 | 0.8016 | 0.9362 | 0.5443 |
0.037 | 33.0 | 6369 | 0.1293 | 0.6565 | 0.8656 | 0.9760 | nan | 0.9862 | 0.9381 | 0.9443 | 0.5938 | 0.0 | 0.9726 | 0.8273 | 0.9314 | 0.5512 |
0.0363 | 34.0 | 6562 | 0.1242 | 0.6594 | 0.8528 | 0.9800 | nan | 0.9926 | 0.9171 | 0.9529 | 0.5485 | 0.0 | 0.9773 | 0.8632 | 0.9378 | 0.5188 |
0.0361 | 35.0 | 6755 | 0.1239 | 0.6653 | 0.8739 | 0.9781 | nan | 0.9886 | 0.9247 | 0.9557 | 0.6264 | 0.0 | 0.9752 | 0.8420 | 0.9374 | 0.5718 |
0.0371 | 36.0 | 6948 | 0.1220 | 0.6626 | 0.8691 | 0.9782 | nan | 0.9887 | 0.9297 | 0.9530 | 0.6049 | 0.0 | 0.9751 | 0.8418 | 0.9375 | 0.5585 |
0.034 | 37.0 | 7141 | 0.1694 | 0.6300 | 0.8685 | 0.9609 | nan | 0.9666 | 0.9453 | 0.9602 | 0.6020 | 0.0 | 0.9551 | 0.6981 | 0.9399 | 0.5567 |
0.0358 | 38.0 | 7334 | 0.1251 | 0.6513 | 0.8534 | 0.9764 | nan | 0.9878 | 0.9270 | 0.9492 | 0.5497 | 0.0 | 0.9731 | 0.8290 | 0.9345 | 0.5198 |
0.033 | 39.0 | 7527 | 0.1330 | 0.6542 | 0.8604 | 0.9764 | nan | 0.9868 | 0.9343 | 0.9503 | 0.5700 | 0.0 | 0.9731 | 0.8292 | 0.9351 | 0.5336 |
0.0327 | 40.0 | 7720 | 0.1359 | 0.6490 | 0.8537 | 0.9750 | nan | 0.9862 | 0.9269 | 0.9483 | 0.5535 | 0.0 | 0.9716 | 0.8183 | 0.9330 | 0.5221 |
0.0336 | 41.0 | 7913 | 0.1277 | 0.6588 | 0.8667 | 0.9766 | nan | 0.9874 | 0.9267 | 0.9489 | 0.6037 | 0.0 | 0.9734 | 0.8288 | 0.9341 | 0.5577 |
0.0312 | 42.0 | 8106 | 0.1321 | 0.6568 | 0.8716 | 0.9749 | nan | 0.9844 | 0.9358 | 0.9500 | 0.6163 | 0.0 | 0.9714 | 0.8132 | 0.9344 | 0.5650 |
0.0321 | 43.0 | 8299 | 0.1269 | 0.6533 | 0.8574 | 0.9763 | nan | 0.9874 | 0.9283 | 0.9490 | 0.5649 | 0.0 | 0.9730 | 0.8285 | 0.9335 | 0.5316 |
0.0306 | 44.0 | 8492 | 0.1269 | 0.6583 | 0.8528 | 0.9792 | nan | 0.9918 | 0.9207 | 0.9467 | 0.5520 | 0.0 | 0.9764 | 0.8593 | 0.9324 | 0.5236 |
0.0306 | 45.0 | 8685 | 0.1335 | 0.6503 | 0.8503 | 0.9765 | nan | 0.9883 | 0.9283 | 0.9439 | 0.5407 | 0.0 | 0.9733 | 0.8345 | 0.9295 | 0.5144 |
0.0324 | 46.0 | 8878 | 0.1294 | 0.6538 | 0.8490 | 0.9784 | nan | 0.9908 | 0.9254 | 0.9441 | 0.5358 | 0.0 | 0.9754 | 0.8525 | 0.9303 | 0.5107 |
0.0318 | 47.0 | 9071 | 0.1230 | 0.6564 | 0.8549 | 0.9782 | nan | 0.9900 | 0.9252 | 0.9486 | 0.5559 | 0.0 | 0.9752 | 0.8477 | 0.9335 | 0.5255 |
0.0319 | 48.0 | 9264 | 0.1267 | 0.6524 | 0.8501 | 0.9776 | nan | 0.9895 | 0.9278 | 0.9464 | 0.5368 | 0.0 | 0.9745 | 0.8438 | 0.9322 | 0.5117 |
0.0312 | 49.0 | 9457 | 0.1258 | 0.6568 | 0.8602 | 0.9774 | nan | 0.9884 | 0.9321 | 0.9482 | 0.5720 | 0.0 | 0.9743 | 0.8399 | 0.9327 | 0.5373 |
0.0311 | 50.0 | 9650 | 0.1203 | 0.6589 | 0.8610 | 0.9779 | nan | 0.9894 | 0.9262 | 0.9471 | 0.5814 | 0.0 | 0.9749 | 0.8444 | 0.9319 | 0.5435 |
0.0327 | 51.0 | 9843 | 0.1219 | 0.6575 | 0.8577 | 0.9780 | nan | 0.9897 | 0.9265 | 0.9457 | 0.5688 | 0.0 | 0.9750 | 0.8462 | 0.9314 | 0.5348 |
0.031 | 51.81 | 10000 | 0.1305 | 0.6564 | 0.8562 | 0.9780 | nan | 0.9896 | 0.9270 | 0.9478 | 0.5603 | 0.0 | 0.9749 | 0.8458 | 0.9324 | 0.5290 |
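Training ran for a fixed 10,000 optimizer steps rather than a whole number of epochs, which is why the final row stops at epoch 51.81: each epoch is 193 steps (see the Step column), so 10,000 steps land partway through epoch 52. As a quick check:

```python
steps_per_epoch = 193   # from the Step column: 193, 386, 579, ...
total_steps = 10_000    # the training_steps hyperparameter

final_epoch = total_steps / steps_per_epoch
print(round(final_epoch, 2))  # matches the 51.81 in the last table row
```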
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3