segformer-b0-finetuned-wrinkle

This model is a fine-tuned version of nvidia/mit-b0 on the AmirGenAI/my-wrinkle-seg-dataset dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0170
  • Mean Iou: 0.1948
  • Mean Accuracy: 0.3896
  • Overall Accuracy: 0.3896
  • Accuracy Unlabeled: nan
  • Accuracy Wrinkle: 0.3896
  • Iou Unlabeled: 0.0
  • Iou Wrinkle: 0.3896
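
With two label ids (unlabeled and wrinkle), the mean metrics above follow directly from the class-wise ones: Mean Iou = (0.0 + 0.3896) / 2 = 0.1948, while Mean Accuracy skips the nan unlabeled entry and therefore equals the wrinkle accuracy. A minimal numpy sketch of this style of IoU computation (toy masks, not the actual evaluation code):

```python
import numpy as np

def per_class_iou(pred, gt, num_classes):
    """IoU per class id; nan when a class appears in neither mask."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        ious.append(inter / union if union > 0 else np.nan)
    return np.array(ious)

# Toy masks: class 0 = unlabeled, class 1 = wrinkle.
# Ground truth contains no unlabeled pixels, so class 0 never intersects it
# and its IoU is 0, dragging the mean down -- the same effect seen above.
gt = np.array([1, 1, 1, 1])
pred = np.array([1, 1, 0, 0])
ious = per_class_iou(pred, gt, num_classes=2)   # [0.0, 0.5]
mean_iou = np.nanmean(ious)                     # (0.0 + 0.5) / 2 = 0.25
```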

Model description

This is a SegFormer semantic-segmentation model with the lightweight MiT-b0 encoder, fine-tuned for binary wrinkle segmentation (classes: unlabeled and wrinkle) on the AmirGenAI/my-wrinkle-seg-dataset dataset.

Intended uses & limitations

More information needed
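
The card does not include a usage snippet. Below is a minimal inference sketch using the transformers Segformer API; the checkpoint id comes from this page, while the assumption that label id 1 marks wrinkle pixels follows the metric ordering above (unlabeled, wrinkle). Note that SegFormer emits logits at 1/4 of the input resolution, so a full-size mask requires upsampling.

```python
import numpy as np

def logits_to_mask(logits):
    """Collapse per-class logits of shape (num_classes, H, W) to an (H, W) label map."""
    return logits.argmax(axis=0)

def segment_wrinkles(image_path):
    # Heavy dependencies are imported here so the helper above stands on its own.
    import torch
    from PIL import Image
    from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

    ckpt = "AmirGenAI/segformer-b0-finetuned-wrinkle"
    processor = AutoImageProcessor.from_pretrained(ckpt)
    model = SegformerForSemanticSegmentation.from_pretrained(ckpt)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]  # (num_classes, H/4, W/4)
    return logits_to_mask(logits.numpy())   # assumed: 1 = wrinkle pixels
```

Usage would look like `mask = segment_wrinkles("face.jpg")`, followed by upsampling the mask to the original image size.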

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 10
  • mixed_precision_training: Native AMP
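
The linear scheduler with 100 warmup steps amounts to a simple multiplier on the 6e-05 base rate. A sketch of that schedule shape (3240 total steps, taken from the results table; this mirrors the usual linear warmup/decay curve, not the exact training code):

```python
def linear_lr(step, base_lr=6e-5, warmup_steps=100, total_steps=3240):
    """Linear warmup from 0 to base_lr, then linear decay back to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```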

Training results

| Training Loss | Epoch | Step | Accuracy Unlabeled | Accuracy Wrinkle | Iou Unlabeled | Iou Wrinkle | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0.0207 | 0.0617 | 20 | nan | 0.3445 | 0.0 | 0.3445 | 0.0224 | 0.3445 | 0.1723 | 0.3445 |
| 0.0235 | 0.1235 | 40 | nan | 0.3026 | 0.0 | 0.3026 | 0.0214 | 0.3026 | 0.1513 | 0.3026 |
| 0.0159 | 0.1852 | 60 | nan | 0.3031 | 0.0 | 0.3031 | 0.0206 | 0.3031 | 0.1515 | 0.3031 |
| 0.0214 | 0.2469 | 80 | nan | 0.2827 | 0.0 | 0.2827 | 0.0204 | 0.2827 | 0.1413 | 0.2827 |
| 0.0204 | 0.3086 | 100 | nan | 0.2768 | 0.0 | 0.2768 | 0.0202 | 0.2768 | 0.1384 | 0.2768 |
| 0.0231 | 0.3704 | 120 | nan | 0.2846 | 0.0 | 0.2846 | 0.0200 | 0.2846 | 0.1423 | 0.2846 |
| 0.0151 | 0.4321 | 140 | nan | 0.2569 | 0.0 | 0.2569 | 0.0199 | 0.2569 | 0.1285 | 0.2569 |
| 0.0245 | 0.4938 | 160 | nan | 0.2752 | 0.0 | 0.2752 | 0.0197 | 0.2752 | 0.1376 | 0.2752 |
| 0.023 | 0.5556 | 180 | nan | 0.3322 | 0.0 | 0.3322 | 0.0200 | 0.3322 | 0.1661 | 0.3322 |
| 0.0321 | 0.6173 | 200 | nan | 0.3031 | 0.0 | 0.3031 | 0.0198 | 0.3031 | 0.1516 | 0.3031 |
| 0.0197 | 0.6790 | 220 | nan | 0.3157 | 0.0 | 0.3157 | 0.0196 | 0.3157 | 0.1578 | 0.3157 |
| 0.0192 | 0.7407 | 240 | nan | 0.3430 | 0.0 | 0.3430 | 0.0198 | 0.3430 | 0.1715 | 0.3430 |
| 0.0228 | 0.8025 | 260 | nan | 0.3144 | 0.0 | 0.3144 | 0.0194 | 0.3144 | 0.1572 | 0.3144 |
| 0.0216 | 0.8642 | 280 | nan | 0.3338 | 0.0 | 0.3338 | 0.0194 | 0.3338 | 0.1669 | 0.3338 |
| 0.0186 | 0.9259 | 300 | nan | 0.3411 | 0.0 | 0.3411 | 0.0193 | 0.3411 | 0.1705 | 0.3411 |
| 0.0202 | 0.9877 | 320 | nan | 0.3215 | 0.0 | 0.3215 | 0.0192 | 0.3215 | 0.1608 | 0.3215 |
| 0.0194 | 1.0494 | 340 | nan | 0.3286 | 0.0 | 0.3286 | 0.0194 | 0.3286 | 0.1643 | 0.3286 |
| 0.0166 | 1.1111 | 360 | nan | 0.2658 | 0.0 | 0.2658 | 0.0193 | 0.2658 | 0.1329 | 0.2658 |
| 0.0187 | 1.1728 | 380 | nan | 0.3133 | 0.0 | 0.3133 | 0.0191 | 0.3133 | 0.1567 | 0.3133 |
| 0.0208 | 1.2346 | 400 | nan | 0.3254 | 0.0 | 0.3254 | 0.0190 | 0.3254 | 0.1627 | 0.3254 |
| 0.0237 | 1.2963 | 420 | nan | 0.3093 | 0.0 | 0.3093 | 0.0190 | 0.3093 | 0.1546 | 0.3093 |
| 0.0244 | 1.3580 | 440 | nan | 0.3167 | 0.0 | 0.3167 | 0.0189 | 0.3167 | 0.1583 | 0.3167 |
| 0.0156 | 1.4198 | 460 | nan | 0.3408 | 0.0 | 0.3408 | 0.0191 | 0.3408 | 0.1704 | 0.3408 |
| 0.0231 | 1.4815 | 480 | nan | 0.2716 | 0.0 | 0.2716 | 0.0190 | 0.2716 | 0.1358 | 0.2716 |
| 0.0197 | 1.5432 | 500 | nan | 0.3540 | 0.0 | 0.3540 | 0.0190 | 0.3540 | 0.1770 | 0.3540 |
| 0.013 | 1.6049 | 520 | nan | 0.3358 | 0.0 | 0.3358 | 0.0188 | 0.3358 | 0.1679 | 0.3358 |
| 0.0158 | 1.6667 | 540 | nan | 0.3619 | 0.0 | 0.3619 | 0.0188 | 0.3619 | 0.1810 | 0.3619 |
| 0.0212 | 1.7284 | 560 | nan | 0.3649 | 0.0 | 0.3649 | 0.0188 | 0.3649 | 0.1824 | 0.3649 |
| 0.0231 | 1.7901 | 580 | nan | 0.2906 | 0.0 | 0.2906 | 0.0189 | 0.2906 | 0.1453 | 0.2906 |
| 0.0158 | 1.8519 | 600 | nan | 0.3210 | 0.0 | 0.3210 | 0.0187 | 0.3210 | 0.1605 | 0.3210 |
| 0.024 | 1.9136 | 620 | nan | 0.3117 | 0.0 | 0.3117 | 0.0187 | 0.3117 | 0.1558 | 0.3117 |
| 0.0193 | 1.9753 | 640 | nan | 0.3596 | 0.0 | 0.3596 | 0.0187 | 0.3596 | 0.1798 | 0.3596 |
| 0.0201 | 2.0370 | 660 | nan | 0.3628 | 0.0 | 0.3628 | 0.0187 | 0.3628 | 0.1814 | 0.3628 |
| 0.0162 | 2.0988 | 680 | nan | 0.3238 | 0.0 | 0.3238 | 0.0185 | 0.3238 | 0.1619 | 0.3238 |
| 0.0219 | 2.1605 | 700 | nan | 0.3622 | 0.0 | 0.3622 | 0.0186 | 0.3622 | 0.1811 | 0.3622 |
| 0.0207 | 2.2222 | 720 | nan | 0.3919 | 0.0 | 0.3919 | 0.0189 | 0.3919 | 0.1959 | 0.3919 |
| 0.0176 | 2.2840 | 740 | nan | 0.3116 | 0.0 | 0.3116 | 0.0185 | 0.3116 | 0.1558 | 0.3116 |
| 0.0149 | 2.3457 | 760 | nan | 0.3917 | 0.0 | 0.3917 | 0.0186 | 0.3917 | 0.1959 | 0.3917 |
| 0.0192 | 2.4074 | 780 | nan | 0.3097 | 0.0 | 0.3097 | 0.0184 | 0.3097 | 0.1548 | 0.3097 |
| 0.0202 | 2.4691 | 800 | nan | 0.3745 | 0.0 | 0.3745 | 0.0183 | 0.3745 | 0.1872 | 0.3745 |
| 0.0181 | 2.5309 | 820 | nan | 0.3687 | 0.0 | 0.3687 | 0.0185 | 0.3687 | 0.1844 | 0.3687 |
| 0.0168 | 2.5926 | 840 | nan | 0.3629 | 0.0 | 0.3629 | 0.0183 | 0.3629 | 0.1815 | 0.3629 |
| 0.0171 | 2.6543 | 860 | nan | 0.3148 | 0.0 | 0.3148 | 0.0183 | 0.3148 | 0.1574 | 0.3148 |
| 0.0166 | 2.7160 | 880 | nan | 0.3426 | 0.0 | 0.3426 | 0.0182 | 0.3426 | 0.1713 | 0.3426 |
| 0.0123 | 2.7778 | 900 | nan | 0.3670 | 0.0 | 0.3670 | 0.0183 | 0.3670 | 0.1835 | 0.3670 |
| 0.0149 | 2.8395 | 920 | nan | 0.3535 | 0.0 | 0.3535 | 0.0182 | 0.3535 | 0.1767 | 0.3535 |
| 0.016 | 2.9012 | 940 | nan | 0.3856 | 0.0 | 0.3856 | 0.0183 | 0.3856 | 0.1928 | 0.3856 |
| 0.0192 | 2.9630 | 960 | nan | 0.3647 | 0.0 | 0.3647 | 0.0182 | 0.3647 | 0.1823 | 0.3647 |
| 0.0188 | 3.0247 | 980 | nan | 0.3313 | 0.0 | 0.3313 | 0.0182 | 0.3313 | 0.1657 | 0.3313 |
| 0.0203 | 3.0864 | 1000 | nan | 0.3233 | 0.0 | 0.3233 | 0.0181 | 0.3233 | 0.1617 | 0.3233 |
| 0.0152 | 3.1481 | 1020 | nan | 0.3047 | 0.0 | 0.3047 | 0.0182 | 0.3047 | 0.1523 | 0.3047 |
| 0.0175 | 3.2099 | 1040 | nan | 0.3545 | 0.0 | 0.3545 | 0.0181 | 0.3545 | 0.1772 | 0.3545 |
| 0.0241 | 3.2716 | 1060 | nan | 0.3898 | 0.0 | 0.3898 | 0.0182 | 0.3898 | 0.1949 | 0.3898 |
| 0.0215 | 3.3333 | 1080 | nan | 0.3102 | 0.0 | 0.3102 | 0.0181 | 0.3102 | 0.1551 | 0.3102 |
| 0.0205 | 3.3951 | 1100 | nan | 0.3581 | 0.0 | 0.3581 | 0.0180 | 0.3581 | 0.1791 | 0.3581 |
| 0.0208 | 3.4568 | 1120 | nan | 0.3611 | 0.0 | 0.3611 | 0.0179 | 0.3611 | 0.1805 | 0.3611 |
| 0.0184 | 3.5185 | 1140 | nan | 0.3151 | 0.0 | 0.3151 | 0.0180 | 0.3151 | 0.1576 | 0.3151 |
| 0.0159 | 3.5802 | 1160 | nan | 0.4039 | 0.0 | 0.4039 | 0.0182 | 0.4039 | 0.2019 | 0.4039 |
| 0.0219 | 3.6420 | 1180 | nan | 0.3444 | 0.0 | 0.3444 | 0.0181 | 0.3444 | 0.1722 | 0.3444 |
| 0.023 | 3.7037 | 1200 | nan | 0.3265 | 0.0 | 0.3265 | 0.0179 | 0.3265 | 0.1633 | 0.3265 |
| 0.0148 | 3.7654 | 1220 | nan | 0.3677 | 0.0 | 0.3677 | 0.0179 | 0.3677 | 0.1838 | 0.3677 |
| 0.019 | 3.8272 | 1240 | nan | 0.3422 | 0.0 | 0.3422 | 0.0178 | 0.3422 | 0.1711 | 0.3422 |
| 0.0154 | 3.8889 | 1260 | nan | 0.3611 | 0.0 | 0.3611 | 0.0179 | 0.3611 | 0.1806 | 0.3611 |
| 0.0212 | 3.9506 | 1280 | nan | 0.3274 | 0.0 | 0.3274 | 0.0178 | 0.3274 | 0.1637 | 0.3274 |
| 0.0192 | 4.0123 | 1300 | nan | 0.3829 | 0.0 | 0.3829 | 0.0178 | 0.3829 | 0.1914 | 0.3829 |
| 0.0183 | 4.0741 | 1320 | nan | 0.4120 | 0.0 | 0.4120 | 0.0180 | 0.4120 | 0.2060 | 0.4120 |
| 0.0141 | 4.1358 | 1340 | nan | 0.3613 | 0.0 | 0.3613 | 0.0177 | 0.3613 | 0.1806 | 0.3613 |
| 0.0165 | 4.1975 | 1360 | nan | 0.3552 | 0.0 | 0.3552 | 0.0180 | 0.3552 | 0.1776 | 0.3552 |
| 0.018 | 4.2593 | 1380 | nan | 0.3646 | 0.0 | 0.3646 | 0.0177 | 0.3646 | 0.1823 | 0.3646 |
| 0.0166 | 4.3210 | 1400 | nan | 0.3557 | 0.0 | 0.3557 | 0.0177 | 0.3557 | 0.1778 | 0.3557 |
| 0.0193 | 4.3827 | 1420 | nan | 0.3518 | 0.0 | 0.3518 | 0.0177 | 0.3518 | 0.1759 | 0.3518 |
| 0.0173 | 4.4444 | 1440 | nan | 0.3640 | 0.0 | 0.3640 | 0.0176 | 0.3640 | 0.1820 | 0.3640 |
| 0.0159 | 4.5062 | 1460 | nan | 0.4083 | 0.0 | 0.4083 | 0.0178 | 0.4083 | 0.2042 | 0.4083 |
| 0.0186 | 4.5679 | 1480 | nan | 0.3679 | 0.0 | 0.3679 | 0.0177 | 0.3679 | 0.1839 | 0.3679 |
| 0.0214 | 4.6296 | 1500 | nan | 0.3567 | 0.0 | 0.3567 | 0.0176 | 0.3567 | 0.1783 | 0.3567 |
| 0.0192 | 4.6914 | 1520 | nan | 0.3595 | 0.0 | 0.3595 | 0.0176 | 0.3595 | 0.1797 | 0.3595 |
| 0.0173 | 4.7531 | 1540 | nan | 0.3567 | 0.0 | 0.3567 | 0.0176 | 0.3567 | 0.1784 | 0.3567 |
| 0.0185 | 4.8148 | 1560 | nan | 0.3830 | 0.0 | 0.3830 | 0.0176 | 0.3830 | 0.1915 | 0.3830 |
| 0.0179 | 4.8765 | 1580 | nan | 0.3851 | 0.0 | 0.3851 | 0.0176 | 0.3851 | 0.1926 | 0.3851 |
| 0.0152 | 4.9383 | 1600 | nan | 0.4017 | 0.0 | 0.4017 | 0.0177 | 0.4017 | 0.2009 | 0.4017 |
| 0.0281 | 5.0 | 1620 | nan | 0.3503 | 0.0 | 0.3503 | 0.0176 | 0.3503 | 0.1752 | 0.3503 |
| 0.0185 | 5.0617 | 1640 | nan | 0.4148 | 0.0 | 0.4148 | 0.0178 | 0.4148 | 0.2074 | 0.4148 |
| 0.0209 | 5.1235 | 1660 | nan | 0.3782 | 0.0 | 0.3782 | 0.0176 | 0.3782 | 0.1891 | 0.3782 |
| 0.0191 | 5.1852 | 1680 | nan | 0.3441 | 0.0 | 0.3441 | 0.0175 | 0.3441 | 0.1720 | 0.3441 |
| 0.0134 | 5.2469 | 1700 | nan | 0.3189 | 0.0 | 0.3189 | 0.0176 | 0.3189 | 0.1594 | 0.3189 |
| 0.0178 | 5.3086 | 1720 | nan | 0.3452 | 0.0 | 0.3452 | 0.0176 | 0.3452 | 0.1726 | 0.3452 |
| 0.0214 | 5.3704 | 1740 | nan | 0.4143 | 0.0 | 0.4143 | 0.0177 | 0.4143 | 0.2072 | 0.4143 |
| 0.0179 | 5.4321 | 1760 | nan | 0.3732 | 0.0 | 0.3732 | 0.0175 | 0.3732 | 0.1866 | 0.3732 |
| 0.0187 | 5.4938 | 1780 | nan | 0.3843 | 0.0 | 0.3843 | 0.0176 | 0.3843 | 0.1921 | 0.3843 |
| 0.0218 | 5.5556 | 1800 | nan | 0.3807 | 0.0 | 0.3807 | 0.0175 | 0.3807 | 0.1904 | 0.3807 |
| 0.0154 | 5.6173 | 1820 | nan | 0.3602 | 0.0 | 0.3602 | 0.0175 | 0.3602 | 0.1801 | 0.3602 |
| 0.021 | 5.6790 | 1840 | nan | 0.3445 | 0.0 | 0.3445 | 0.0175 | 0.3445 | 0.1723 | 0.3445 |
| 0.0191 | 5.7407 | 1860 | nan | 0.3290 | 0.0 | 0.3290 | 0.0175 | 0.3290 | 0.1645 | 0.3290 |
| 0.0194 | 5.8025 | 1880 | nan | 0.3564 | 0.0 | 0.3564 | 0.0174 | 0.3564 | 0.1782 | 0.3564 |
| 0.0121 | 5.8642 | 1900 | nan | 0.3860 | 0.0 | 0.3860 | 0.0175 | 0.3860 | 0.1930 | 0.3860 |
| 0.0194 | 5.9259 | 1920 | nan | 0.4023 | 0.0 | 0.4023 | 0.0175 | 0.4023 | 0.2011 | 0.4023 |
| 0.0159 | 5.9877 | 1940 | nan | 0.3953 | 0.0 | 0.3953 | 0.0173 | 0.3953 | 0.1976 | 0.3953 |
| 0.0142 | 6.0494 | 1960 | nan | 0.4036 | 0.0 | 0.4036 | 0.0173 | 0.4036 | 0.2018 | 0.4036 |
| 0.0165 | 6.1111 | 1980 | nan | 0.3593 | 0.0 | 0.3593 | 0.0173 | 0.3593 | 0.1796 | 0.3593 |
| 0.0166 | 6.1728 | 2000 | nan | 0.4104 | 0.0 | 0.4104 | 0.0174 | 0.4104 | 0.2052 | 0.4104 |
| 0.0164 | 6.2346 | 2020 | nan | 0.3653 | 0.0 | 0.3653 | 0.0174 | 0.3653 | 0.1826 | 0.3653 |
| 0.0267 | 6.2963 | 2040 | nan | 0.3582 | 0.0 | 0.3582 | 0.0173 | 0.3582 | 0.1791 | 0.3582 |
| 0.0202 | 6.3580 | 2060 | nan | 0.3613 | 0.0 | 0.3613 | 0.0173 | 0.3613 | 0.1806 | 0.3613 |
| 0.0134 | 6.4198 | 2080 | nan | 0.4004 | 0.0 | 0.4004 | 0.0174 | 0.4004 | 0.2002 | 0.4004 |
| 0.014 | 6.4815 | 2100 | nan | 0.3781 | 0.0 | 0.3781 | 0.0173 | 0.3781 | 0.1890 | 0.3781 |
| 0.0197 | 6.5432 | 2120 | nan | 0.3724 | 0.0 | 0.3724 | 0.0173 | 0.3724 | 0.1862 | 0.3724 |
| 0.0185 | 6.6049 | 2140 | nan | 0.3306 | 0.0 | 0.3306 | 0.0174 | 0.3306 | 0.1653 | 0.3306 |
| 0.0205 | 6.6667 | 2160 | nan | 0.3523 | 0.0 | 0.3523 | 0.0173 | 0.3523 | 0.1761 | 0.3523 |
| 0.0152 | 6.7284 | 2180 | nan | 0.3690 | 0.0 | 0.3690 | 0.0172 | 0.3690 | 0.1845 | 0.3690 |
| 0.0158 | 6.7901 | 2200 | nan | 0.3586 | 0.0 | 0.3586 | 0.0172 | 0.3586 | 0.1793 | 0.3586 |
| 0.0199 | 6.8519 | 2220 | nan | 0.4126 | 0.0 | 0.4126 | 0.0174 | 0.4126 | 0.2063 | 0.4126 |
| 0.0186 | 6.9136 | 2240 | nan | 0.3382 | 0.0 | 0.3382 | 0.0173 | 0.3382 | 0.1691 | 0.3382 |
| 0.0228 | 6.9753 | 2260 | nan | 0.3609 | 0.0 | 0.3609 | 0.0173 | 0.3609 | 0.1804 | 0.3609 |
| 0.0193 | 7.0370 | 2280 | nan | 0.3775 | 0.0 | 0.3775 | 0.0172 | 0.3775 | 0.1887 | 0.3775 |
| 0.0161 | 7.0988 | 2300 | nan | 0.3799 | 0.0 | 0.3799 | 0.0172 | 0.3799 | 0.1900 | 0.3799 |
| 0.0182 | 7.1605 | 2320 | nan | 0.3775 | 0.0 | 0.3775 | 0.0172 | 0.3775 | 0.1888 | 0.3775 |
| 0.0165 | 7.2222 | 2340 | nan | 0.4075 | 0.0 | 0.4075 | 0.0172 | 0.4075 | 0.2037 | 0.4075 |
| 0.0197 | 7.2840 | 2360 | nan | 0.3800 | 0.0 | 0.3800 | 0.0172 | 0.3800 | 0.1900 | 0.3800 |
| 0.0246 | 7.3457 | 2380 | nan | 0.3722 | 0.0 | 0.3722 | 0.0172 | 0.3722 | 0.1861 | 0.3722 |
| 0.0191 | 7.4074 | 2400 | nan | 0.3870 | 0.0 | 0.3870 | 0.0172 | 0.3870 | 0.1935 | 0.3870 |
| 0.0157 | 7.4691 | 2420 | nan | 0.3826 | 0.0 | 0.3826 | 0.0172 | 0.3826 | 0.1913 | 0.3826 |
| 0.0187 | 7.5309 | 2440 | nan | 0.3727 | 0.0 | 0.3727 | 0.0172 | 0.3727 | 0.1863 | 0.3727 |
| 0.0199 | 7.5926 | 2460 | nan | 0.3634 | 0.0 | 0.3634 | 0.0172 | 0.3634 | 0.1817 | 0.3634 |
| 0.0226 | 7.6543 | 2480 | nan | 0.3889 | 0.0 | 0.3889 | 0.0172 | 0.3889 | 0.1944 | 0.3889 |
| 0.0167 | 7.7160 | 2500 | nan | 0.3666 | 0.0 | 0.3666 | 0.0172 | 0.3666 | 0.1833 | 0.3666 |
| 0.017 | 7.7778 | 2520 | nan | 0.3848 | 0.0 | 0.3848 | 0.0171 | 0.3848 | 0.1924 | 0.3848 |
| 0.0219 | 7.8395 | 2540 | nan | 0.3769 | 0.0 | 0.3769 | 0.0171 | 0.3769 | 0.1885 | 0.3769 |
| 0.0149 | 7.9012 | 2560 | nan | 0.4177 | 0.0 | 0.4177 | 0.0172 | 0.4177 | 0.2088 | 0.4177 |
| 0.0137 | 7.9630 | 2580 | nan | 0.3865 | 0.0 | 0.3865 | 0.0171 | 0.3865 | 0.1932 | 0.3865 |
| 0.0235 | 8.0247 | 2600 | nan | 0.3674 | 0.0 | 0.3674 | 0.0171 | 0.3674 | 0.1837 | 0.3674 |
| 0.0169 | 8.0864 | 2620 | nan | 0.4046 | 0.0 | 0.4046 | 0.0171 | 0.4046 | 0.2023 | 0.4046 |
| 0.0183 | 8.1481 | 2640 | nan | 0.3753 | 0.0 | 0.3753 | 0.0171 | 0.3753 | 0.1876 | 0.3753 |
| 0.0171 | 8.2099 | 2660 | nan | 0.3753 | 0.0 | 0.3753 | 0.0171 | 0.3753 | 0.1876 | 0.3753 |
| 0.0202 | 8.2716 | 2680 | nan | 0.4173 | 0.0 | 0.4173 | 0.0172 | 0.4173 | 0.2087 | 0.4173 |
| 0.0229 | 8.3333 | 2700 | nan | 0.3599 | 0.0 | 0.3599 | 0.0171 | 0.3599 | 0.1799 | 0.3599 |
| 0.0138 | 8.3951 | 2720 | nan | 0.3882 | 0.0 | 0.3882 | 0.0171 | 0.3882 | 0.1941 | 0.3882 |
| 0.0155 | 8.4568 | 2740 | nan | 0.4063 | 0.0 | 0.4063 | 0.0171 | 0.4063 | 0.2031 | 0.4063 |
| 0.0171 | 8.5185 | 2760 | nan | 0.4075 | 0.0 | 0.4075 | 0.0171 | 0.4075 | 0.2038 | 0.4075 |
| 0.0211 | 8.5802 | 2780 | nan | 0.3893 | 0.0 | 0.3893 | 0.0170 | 0.3893 | 0.1946 | 0.3893 |
| 0.0161 | 8.6420 | 2800 | nan | 0.3900 | 0.0 | 0.3900 | 0.0170 | 0.3900 | 0.1950 | 0.3900 |
| 0.0175 | 8.7037 | 2820 | nan | 0.3893 | 0.0 | 0.3893 | 0.0171 | 0.3893 | 0.1946 | 0.3893 |
| 0.0199 | 8.7654 | 2840 | nan | 0.3895 | 0.0 | 0.3895 | 0.0171 | 0.3895 | 0.1948 | 0.3895 |
| 0.0169 | 8.8272 | 2860 | nan | 0.4027 | 0.0 | 0.4027 | 0.0171 | 0.4027 | 0.2013 | 0.4027 |
| 0.0213 | 8.8889 | 2880 | nan | 0.3927 | 0.0 | 0.3927 | 0.0170 | 0.3927 | 0.1964 | 0.3927 |
| 0.018 | 8.9506 | 2900 | nan | 0.3700 | 0.0 | 0.3700 | 0.0171 | 0.3700 | 0.1850 | 0.3700 |
| 0.0195 | 9.0123 | 2920 | nan | 0.4081 | 0.0 | 0.4081 | 0.0171 | 0.4081 | 0.2041 | 0.4081 |
| 0.0156 | 9.0741 | 2940 | nan | 0.3844 | 0.0 | 0.3844 | 0.0171 | 0.3844 | 0.1922 | 0.3844 |
| 0.0201 | 9.1358 | 2960 | nan | 0.3971 | 0.0 | 0.3971 | 0.0171 | 0.3971 | 0.1985 | 0.3971 |
| 0.0223 | 9.1975 | 2980 | nan | 0.3940 | 0.0 | 0.3940 | 0.0171 | 0.3940 | 0.1970 | 0.3940 |
| 0.0193 | 9.2593 | 3000 | nan | 0.3836 | 0.0 | 0.3836 | 0.0170 | 0.3836 | 0.1918 | 0.3836 |
| 0.0121 | 9.3210 | 3020 | nan | 0.3956 | 0.0 | 0.3956 | 0.0170 | 0.3956 | 0.1978 | 0.3956 |
| 0.0183 | 9.3827 | 3040 | nan | 0.3845 | 0.0 | 0.3845 | 0.0170 | 0.3845 | 0.1922 | 0.3845 |
| 0.0208 | 9.4444 | 3060 | nan | 0.3932 | 0.0 | 0.3932 | 0.0170 | 0.3932 | 0.1966 | 0.3932 |
| 0.0207 | 9.5062 | 3080 | nan | 0.3933 | 0.0 | 0.3933 | 0.0170 | 0.3933 | 0.1967 | 0.3933 |
| 0.0194 | 9.5679 | 3100 | nan | 0.3843 | 0.0 | 0.3843 | 0.0170 | 0.3843 | 0.1922 | 0.3843 |
| 0.0158 | 9.6296 | 3120 | nan | 0.3929 | 0.0 | 0.3929 | 0.0170 | 0.3929 | 0.1965 | 0.3929 |
| 0.0158 | 9.6914 | 3140 | nan | 0.3796 | 0.0 | 0.3796 | 0.0170 | 0.3796 | 0.1898 | 0.3796 |
| 0.0215 | 9.7531 | 3160 | nan | 0.3845 | 0.0 | 0.3845 | 0.0170 | 0.3845 | 0.1922 | 0.3845 |
| 0.0143 | 9.8148 | 3180 | nan | 0.3979 | 0.0 | 0.3979 | 0.0170 | 0.3979 | 0.1989 | 0.3979 |
| 0.0177 | 9.8765 | 3200 | nan | 0.3870 | 0.0 | 0.3870 | 0.0170 | 0.3870 | 0.1935 | 0.3870 |
| 0.0159 | 9.9383 | 3220 | nan | 0.3822 | 0.0 | 0.3822 | 0.0170 | 0.3822 | 0.1911 | 0.3822 |
| 0.0189 | 10.0 | 3240 | nan | 0.3896 | 0.0 | 0.3896 | 0.0170 | 0.3896 | 0.1948 | 0.3896 |

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.20.3
Model size

  • 3.72M parameters (F32, safetensors)