---
license: other
base_model: nvidia/mit-b1
tags:
  - vision
  - image-segmentation
  - generated_from_trainer
model-index:
  - name: segformer-b1-miic-tl
    results: []
---

# segformer-b1-miic-tl

This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the yijisuk/ic-chip-sample dataset. It achieves the following results on the evaluation set:

- Loss: 0.1915
- Mean Iou: 0.4765
- Mean Accuracy: 0.9531
- Overall Accuracy: 0.9531
- Accuracy Unlabeled: nan
- Accuracy Circuit: 0.9531
- Iou Unlabeled: 0.0
- Iou Circuit: 0.9531
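
As a quick way to try the checkpoint, the sketch below runs single-image inference with the standard SegFormer classes from Hugging Face Transformers. It assumes the model is published on the Hub as `yijisuk/segformer-b1-miic-tl` and uses a placeholder image path (`chip.png`); adjust both to your setup.

```python
# Minimal inference sketch. Assumptions: the Hub id "yijisuk/segformer-b1-miic-tl"
# and the placeholder file "chip.png" — neither is confirmed by this card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

processor = AutoImageProcessor.from_pretrained("yijisuk/segformer-b1-miic-tl")
model = SegformerForSemanticSegmentation.from_pretrained("yijisuk/segformer-b1-miic-tl")

image = Image.open("chip.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer outputs logits at 1/4 resolution; upsample to the original image
# size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```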

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
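
To inspect the data referenced at the top of this card, a minimal loading sketch with the `datasets` library is shown below. The repository id comes from the description above; the split and column names are assumptions and should be verified.

```python
# Sketch for loading the fine-tuning dataset named in this card.
# Assumptions: the dataset is accessible at "yijisuk/ic-chip-sample"; the
# "train" split and the "image"/"label" columns are guesses — check print(ds).
from datasets import load_dataset

ds = load_dataset("yijisuk/ic-chip-sample")
print(ds)  # shows the available splits and features

sample = ds["train"][0]   # assumed "train" split
image = sample["image"]   # assumed image column
mask = sample["label"]    # assumed segmentation-mask column
```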

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
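
These settings map onto `TrainingArguments` roughly as sketched below. This is a reconstruction from the list above, not the original training script; `output_dir` and `evaluation_strategy` are assumptions, and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit arguments.

```python
# Reconstruction of the hyperparameters above as TrainingArguments (a sketch,
# not the original training script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b1-miic-tl",  # assumed output directory
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",        # assumption: eval runs once per epoch
)
```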

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
| 0.7961 | 1.0 | 20 | 0.5776 | 0.3160 | 0.6320 | 0.6320 | nan | 0.6320 | 0.0 | 0.6320 |
| 0.7261 | 2.0 | 40 | 0.4222 | 0.4655 | 0.9310 | 0.9310 | nan | 0.9310 | 0.0 | 0.9310 |
| 0.3132 | 3.0 | 60 | 0.2869 | 0.4478 | 0.8956 | 0.8956 | nan | 0.8956 | 0.0 | 0.8956 |
| 0.2224 | 4.0 | 80 | 0.2898 | 0.4817 | 0.9635 | 0.9635 | nan | 0.9635 | 0.0 | 0.9635 |
| 0.1641 | 5.0 | 100 | 0.2861 | 0.4733 | 0.9466 | 0.9466 | nan | 0.9466 | 0.0 | 0.9466 |
| 0.9802 | 6.0 | 120 | 0.3005 | 0.4790 | 0.9581 | 0.9581 | nan | 0.9581 | 0.0 | 0.9581 |
| 0.1633 | 7.0 | 140 | 0.2953 | 0.4397 | 0.8794 | 0.8794 | nan | 0.8794 | 0.0 | 0.8794 |
| 0.3674 | 8.0 | 160 | 0.2951 | 0.4809 | 0.9619 | 0.9619 | nan | 0.9619 | 0.0 | 0.9619 |
| 0.1632 | 9.0 | 180 | 0.3007 | 0.4740 | 0.9480 | 0.9480 | nan | 0.9480 | 0.0 | 0.9480 |
| 0.3719 | 10.0 | 200 | 0.2633 | 0.4687 | 0.9374 | 0.9374 | nan | 0.9374 | 0.0 | 0.9374 |
| 0.2061 | 11.0 | 220 | 0.2544 | 0.4575 | 0.9150 | 0.9150 | nan | 0.9150 | 0.0 | 0.9150 |
| 0.1756 | 12.0 | 240 | 0.2587 | 0.4856 | 0.9711 | 0.9711 | nan | 0.9711 | 0.0 | 0.9711 |
| 0.366 | 13.0 | 260 | 0.2458 | 0.4883 | 0.9765 | 0.9765 | nan | 0.9765 | 0.0 | 0.9765 |
| 0.2532 | 14.0 | 280 | 0.2742 | 0.4771 | 0.9543 | 0.9543 | nan | 0.9543 | 0.0 | 0.9543 |
| 0.144 | 15.0 | 300 | 0.2424 | 0.4612 | 0.9223 | 0.9223 | nan | 0.9223 | 0.0 | 0.9223 |
| 0.1314 | 16.0 | 320 | 0.2130 | 0.4745 | 0.9489 | 0.9489 | nan | 0.9489 | 0.0 | 0.9489 |
| 1.4391 | 17.0 | 340 | 0.2156 | 0.4813 | 0.9626 | 0.9626 | nan | 0.9626 | 0.0 | 0.9626 |
| 0.211 | 18.0 | 360 | 0.1995 | 0.4767 | 0.9533 | 0.9533 | nan | 0.9533 | 0.0 | 0.9533 |
| 0.0792 | 19.0 | 380 | 0.2052 | 0.4855 | 0.9710 | 0.9710 | nan | 0.9710 | 0.0 | 0.9710 |
| 1.1 | 20.0 | 400 | 0.1972 | 0.4712 | 0.9424 | 0.9424 | nan | 0.9424 | 0.0 | 0.9424 |
| 0.067 | 21.0 | 420 | 0.2015 | 0.4697 | 0.9394 | 0.9394 | nan | 0.9394 | 0.0 | 0.9394 |
| 0.1783 | 22.0 | 440 | 0.2100 | 0.4821 | 0.9642 | 0.9642 | nan | 0.9642 | 0.0 | 0.9642 |
| 0.1594 | 23.0 | 460 | 0.1989 | 0.4746 | 0.9491 | 0.9491 | nan | 0.9491 | 0.0 | 0.9491 |
| 0.2306 | 24.0 | 480 | 0.1957 | 0.4668 | 0.9337 | 0.9337 | nan | 0.9337 | 0.0 | 0.9337 |
| 0.9809 | 25.0 | 500 | 0.1971 | 0.4802 | 0.9603 | 0.9603 | nan | 0.9603 | 0.0 | 0.9603 |
| 0.1154 | 26.0 | 520 | 0.1957 | 0.4792 | 0.9585 | 0.9585 | nan | 0.9585 | 0.0 | 0.9585 |
| 0.2142 | 27.0 | 540 | 0.1945 | 0.4827 | 0.9655 | 0.9655 | nan | 0.9655 | 0.0 | 0.9655 |
| 0.177 | 28.0 | 560 | 0.1930 | 0.4725 | 0.9451 | 0.9451 | nan | 0.9451 | 0.0 | 0.9451 |
| 0.2003 | 29.0 | 580 | 0.1965 | 0.4827 | 0.9654 | 0.9654 | nan | 0.9654 | 0.0 | 0.9654 |
| 0.1977 | 30.0 | 600 | 0.1995 | 0.4861 | 0.9722 | 0.9722 | nan | 0.9722 | 0.0 | 0.9722 |
| 0.1671 | 31.0 | 620 | 0.1946 | 0.4760 | 0.9520 | 0.9520 | nan | 0.9520 | 0.0 | 0.9520 |
| 0.1449 | 32.0 | 640 | 0.1895 | 0.4642 | 0.9285 | 0.9285 | nan | 0.9285 | 0.0 | 0.9285 |
| 0.2587 | 33.0 | 660 | 0.1920 | 0.4810 | 0.9619 | 0.9619 | nan | 0.9619 | 0.0 | 0.9619 |
| 1.2053 | 34.0 | 680 | 0.1931 | 0.4790 | 0.9579 | 0.9579 | nan | 0.9579 | 0.0 | 0.9579 |
| 0.1107 | 35.0 | 700 | 0.1951 | 0.4824 | 0.9647 | 0.9647 | nan | 0.9647 | 0.0 | 0.9647 |
| 0.0821 | 36.0 | 720 | 0.1926 | 0.4788 | 0.9577 | 0.9577 | nan | 0.9577 | 0.0 | 0.9577 |
| 0.5034 | 37.0 | 740 | 0.1903 | 0.4656 | 0.9311 | 0.9311 | nan | 0.9311 | 0.0 | 0.9311 |
| 0.137 | 38.0 | 760 | 0.1892 | 0.4684 | 0.9368 | 0.9368 | nan | 0.9368 | 0.0 | 0.9368 |
| 0.2861 | 39.0 | 780 | 0.1911 | 0.4762 | 0.9524 | 0.9524 | nan | 0.9524 | 0.0 | 0.9524 |
| 0.965 | 40.0 | 800 | 0.1928 | 0.4716 | 0.9432 | 0.9432 | nan | 0.9432 | 0.0 | 0.9432 |
| 0.138 | 41.0 | 820 | 0.1926 | 0.4742 | 0.9483 | 0.9483 | nan | 0.9483 | 0.0 | 0.9483 |
| 0.0291 | 42.0 | 840 | 0.1888 | 0.4689 | 0.9378 | 0.9378 | nan | 0.9378 | 0.0 | 0.9378 |
| 0.0624 | 43.0 | 860 | 0.1895 | 0.4684 | 0.9369 | 0.9369 | nan | 0.9369 | 0.0 | 0.9369 |
| 0.0611 | 44.0 | 880 | 0.1915 | 0.4772 | 0.9545 | 0.9545 | nan | 0.9545 | 0.0 | 0.9545 |
| 0.0322 | 45.0 | 900 | 0.1893 | 0.4670 | 0.9340 | 0.9340 | nan | 0.9340 | 0.0 | 0.9340 |
| 0.0927 | 46.0 | 920 | 0.1901 | 0.4714 | 0.9428 | 0.9428 | nan | 0.9428 | 0.0 | 0.9428 |
| 0.1752 | 47.0 | 940 | 0.1897 | 0.4758 | 0.9516 | 0.9516 | nan | 0.9516 | 0.0 | 0.9516 |
| 0.1343 | 48.0 | 960 | 0.1906 | 0.4779 | 0.9559 | 0.9559 | nan | 0.9559 | 0.0 | 0.9559 |
| 0.0765 | 49.0 | 980 | 0.1903 | 0.4732 | 0.9464 | 0.9464 | nan | 0.9464 | 0.0 | 0.9464 |
| 0.048 | 50.0 | 1000 | 0.1915 | 0.4765 | 0.9531 | 0.9531 | nan | 0.9531 | 0.0 | 0.9531 |
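
The per-class columns in the table ("Accuracy"/"Iou" for Unlabeled and Circuit) follow the output layout of the `mean_iou` metric from the `evaluate` library. The sketch below shows how such a `compute_metrics` function is typically wired up for SegFormer fine-tuning; the two-class `id2label` mapping and the `ignore_index` value are assumptions inferred from the column names, not confirmed by this card.

```python
# Hedged sketch of a compute_metrics function producing the columns above.
# Assumptions: a 2-class id2label mapping ("unlabeled", "circuit") and
# ignore_index=255 — both inferred from the table, not confirmed here.
import evaluate
import torch

metric = evaluate.load("mean_iou")
id2label = {0: "unlabeled", 1: "circuit"}

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # SegFormer logits are at 1/4 resolution; upsample to the label-map size
    # before taking the per-pixel argmax.
    logits = torch.from_numpy(logits)
    upsampled = torch.nn.functional.interpolate(
        logits, size=labels.shape[-2:], mode="bilinear", align_corners=False
    )
    preds = upsampled.argmax(dim=1).numpy()

    results = metric.compute(
        predictions=preds,
        references=labels,
        num_labels=len(id2label),
        ignore_index=255,
        reduce_labels=False,
    )
    # Expand the per-category arrays into the "Accuracy <class>" and
    # "Iou <class>" columns reported in the table.
    for idx, name in id2label.items():
        results[f"accuracy_{name}"] = results["per_category_accuracy"][idx]
        results[f"iou_{name}"] = results["per_category_iou"][idx]
    return results
```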

### Framework versions

- Transformers 4.36.2
- Pytorch 1.11.0+cu115
- Datasets 2.15.0
- Tokenizers 0.15.0