---
library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-sidewalk-outputs
  results: []
---

# segformer-b0-finetuned-segments-sidewalk-outputs

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset.
It achieves the following results on the evaluation set (a note on how these metrics are computed follows the list):
- Loss: 0.0650
- Mean Iou: 0.5841
- Mean Accuracy: 0.8778
- Overall Accuracy: 0.9588
- Accuracy Background: nan
- Accuracy Ground: 0.9811
- Accuracy Pallet: 0.7744
- Iou Background: 0.0
- Iou Ground: 0.9805
- Iou Pallet: 0.7717
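
These figures match the output of the `mean_iou` metric from the `evaluate` library, which SegFormer fine-tuning examples typically wrap in the Trainer's `compute_metrics` hook. The `nan` background accuracy together with the 0.0 background IoU suggests that the background class never occurs in the reference masks (accuracy is 0/0) while the model still predicts it occasionally (IoU is 0 over a non-empty union). Below is a minimal sketch of the computation; the label ids and `ignore_index` are assumptions, and the random arrays only stand in for real prediction/reference maps:

```python
import numpy as np
import evaluate

# Hypothetical id2label mapping, inferred from the per-class metrics above.
id2label = {0: "background", 1: "ground", 2: "pallet"}

metric = evaluate.load("mean_iou")

# predictions/references: lists of (H, W) integer label maps.
predictions = [np.random.randint(0, 3, (64, 64))]
references = [np.random.randint(1, 3, (64, 64))]  # background absent, as in this eval set

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=len(id2label),
    ignore_index=255,      # assumed ignore value for unlabeled pixels
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```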

## Model description

[SegFormer](https://huggingface.co/docs/transformers/model_doc/segformer) pairs a hierarchical Transformer encoder (here the smallest variant, MiT-b0) with a lightweight all-MLP decode head that predicts a label for every pixel. This checkpoint starts from the ImageNet-pretrained encoder of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) and is fine-tuned end to end for three classes, background, ground, and pallet, as reflected in the evaluation metrics above.

## Intended uses & limitations

This checkpoint is intended for semantic segmentation of images into the three classes listed above; a minimal usage sketch follows. Note that the background class scores an IoU of 0.0 (and a `nan` accuracy) on the evaluation set, so its predictions should not be relied on, and performance has only been measured on data drawn from the training distribution.
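
Since the card does not yet document usage, here is a minimal inference sketch with the `transformers` API; the checkpoint id is assumed from this card's name, and `example.jpg` is a placeholder:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Repo id below is an assumption based on this card's name.
checkpoint = "segformer-b0-finetuned-segments-sidewalk-outputs"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # any RGB image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of class ids
```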

## Training and evaluation data

The model was fine-tuned and evaluated on the [segments/sidewalk-semantic](https://huggingface.co/datasets/segments/sidewalk-semantic) dataset named above; the split and preprocessing details were not recorded in this card. The dataset can be loaded as shown below.
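
A minimal loading sketch with the `datasets` library; the column names are assumptions based on the usual layout of Segments.ai semantic-segmentation releases:

```python
from datasets import load_dataset

# Dataset id taken from this card; the "pixel_values"/"label" column
# names are assumed from typical Segments.ai exports.
ds = load_dataset("segments/sidewalk-semantic")
example = ds["train"][0]
image = example["pixel_values"]  # PIL image
mask = example["label"]          # per-pixel class-id map
```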

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
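
A `TrainingArguments` sketch matching the values above, as a rough guide to reproducing the run; the output directory is taken from the card name and the 20-step eval cadence is read off the results table below, so both are assumptions about the original configuration:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-outputs",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="steps",  # the results table logs an eval every 20 steps
    eval_steps=20,
)
```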

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Ground | Accuracy Pallet | Iou Background | Iou Ground | Iou Pallet |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:---------------:|:--------------:|:----------:|:----------:|
| 0.0421        | 0.1471 | 20   | 0.0709          | 0.5844   | 0.8798        | 0.9604           | nan                 | 0.9826          | 0.7771          | 0.0            | 0.9808     | 0.7724     |
| 0.0276        | 0.2941 | 40   | 0.0624          | 0.6015   | 0.9042        | 0.9646           | nan                 | 0.9813          | 0.8271          | 0.0            | 0.9809     | 0.8236     |
| 0.0211        | 0.4412 | 60   | 0.0691          | 0.5664   | 0.8509        | 0.9552           | nan                 | 0.9840          | 0.7179          | 0.0            | 0.9835     | 0.7157     |
| 0.0235        | 0.5882 | 80   | 0.0621          | 0.5617   | 0.8437        | 0.9565           | nan                 | 0.9876          | 0.6998          | 0.0            | 0.9866     | 0.6984     |
| 0.0271        | 0.7353 | 100  | 0.0602          | 0.5877   | 0.8832        | 0.9567           | nan                 | 0.9770          | 0.7893          | 0.0            | 0.9767     | 0.7863     |
| 0.024         | 0.8824 | 120  | 0.0682          | 0.5923   | 0.8911        | 0.9600           | nan                 | 0.9790          | 0.8033          | 0.0            | 0.9785     | 0.7985     |
| 0.0407        | 1.0294 | 140  | 0.0691          | 0.5892   | 0.8859        | 0.9580           | nan                 | 0.9779          | 0.7940          | 0.0            | 0.9773     | 0.7903     |
| 0.0277        | 1.1765 | 160  | 0.0683          | 0.5788   | 0.8697        | 0.9571           | nan                 | 0.9812          | 0.7582          | 0.0            | 0.9804     | 0.7560     |
| 0.017         | 1.3235 | 180  | 0.0679          | 0.5845   | 0.8789        | 0.9566           | nan                 | 0.9780          | 0.7798          | 0.0            | 0.9770     | 0.7764     |
| 0.0275        | 1.4706 | 200  | 0.0634          | 0.5992   | 0.9014        | 0.9639           | nan                 | 0.9812          | 0.8216          | 0.0            | 0.9800     | 0.8175     |
| 0.0122        | 1.6176 | 220  | 0.0602          | 0.5859   | 0.8813        | 0.9590           | nan                 | 0.9804          | 0.7821          | 0.0            | 0.9794     | 0.7782     |
| 0.0149        | 1.7647 | 240  | 0.0662          | 0.5827   | 0.8757        | 0.9591           | nan                 | 0.9820          | 0.7694          | 0.0            | 0.9817     | 0.7664     |
| 0.0169        | 1.9118 | 260  | 0.0628          | 0.5994   | 0.9019        | 0.9614           | nan                 | 0.9778          | 0.8259          | 0.0            | 0.9776     | 0.8205     |
| 0.0324        | 2.0588 | 280  | 0.0677          | 0.5859   | 0.8809        | 0.9584           | nan                 | 0.9798          | 0.7819          | 0.0            | 0.9792     | 0.7785     |
| 0.0229        | 2.2059 | 300  | 0.0693          | 0.5983   | 0.9003        | 0.9619           | nan                 | 0.9789          | 0.8217          | 0.0            | 0.9784     | 0.8166     |
| 0.0204        | 2.3529 | 320  | 0.0729          | 0.5850   | 0.8792        | 0.9586           | nan                 | 0.9805          | 0.7780          | 0.0            | 0.9800     | 0.7749     |
| 0.0102        | 2.5    | 340  | 0.0655          | 0.5899   | 0.8868        | 0.9603           | nan                 | 0.9806          | 0.7931          | 0.0            | 0.9802     | 0.7895     |
| 0.0235        | 2.6471 | 360  | 0.0682          | 0.5781   | 0.8688        | 0.9556           | nan                 | 0.9795          | 0.7580          | 0.0            | 0.9789     | 0.7552     |
| 0.0239        | 2.7941 | 380  | 0.0633          | 0.5961   | 0.8965        | 0.9615           | nan                 | 0.9794          | 0.8136          | 0.0            | 0.9789     | 0.8093     |
| 0.0305        | 2.9412 | 400  | 0.0593          | 0.5832   | 0.8764        | 0.9593           | nan                 | 0.9822          | 0.7706          | 0.0            | 0.9817     | 0.7678     |
| 0.0183        | 3.0882 | 420  | 0.0600          | 0.5867   | 0.8816        | 0.9613           | nan                 | 0.9832          | 0.7799          | 0.0            | 0.9827     | 0.7775     |
| 0.031         | 3.2353 | 440  | 0.0612          | 0.5933   | 0.8917        | 0.9614           | nan                 | 0.9806          | 0.8029          | 0.0            | 0.9799     | 0.8000     |
| 0.0174        | 3.3824 | 460  | 0.0645          | 0.5836   | 0.8769        | 0.9590           | nan                 | 0.9816          | 0.7722          | 0.0            | 0.9811     | 0.7696     |
| 0.0456        | 3.5294 | 480  | 0.0651          | 0.5770   | 0.8669        | 0.9577           | nan                 | 0.9827          | 0.7512          | 0.0            | 0.9821     | 0.7489     |
| 0.0187        | 3.6765 | 500  | 0.0659          | 0.5831   | 0.8765        | 0.9578           | nan                 | 0.9803          | 0.7727          | 0.0            | 0.9798     | 0.7695     |
| 0.0329        | 3.8235 | 520  | 0.0690          | 0.5787   | 0.8697        | 0.9568           | nan                 | 0.9808          | 0.7587          | 0.0            | 0.9801     | 0.7560     |
| 0.0241        | 3.9706 | 540  | 0.0651          | 0.5847   | 0.8789        | 0.9584           | nan                 | 0.9803          | 0.7774          | 0.0            | 0.9798     | 0.7743     |
| 0.0304        | 4.1176 | 560  | 0.0652          | 0.5871   | 0.8823        | 0.9589           | nan                 | 0.9800          | 0.7846          | 0.0            | 0.9795     | 0.7817     |
| 0.0086        | 4.2647 | 580  | 0.0662          | 0.5851   | 0.8793        | 0.9584           | nan                 | 0.9802          | 0.7784          | 0.0            | 0.9797     | 0.7756     |
| 0.0194        | 4.4118 | 600  | 0.0678          | 0.5889   | 0.8853        | 0.9581           | nan                 | 0.9781          | 0.7925          | 0.0            | 0.9777     | 0.7890     |
| 0.0114        | 4.5588 | 620  | 0.0664          | 0.5877   | 0.8834        | 0.9582           | nan                 | 0.9789          | 0.7880          | 0.0            | 0.9784     | 0.7847     |
| 0.0183        | 4.7059 | 640  | 0.0663          | 0.5843   | 0.8782        | 0.9571           | nan                 | 0.9789          | 0.7776          | 0.0            | 0.9784     | 0.7744     |
| 0.0139        | 4.8529 | 660  | 0.0652          | 0.5872   | 0.8826        | 0.9590           | nan                 | 0.9801          | 0.7852          | 0.0            | 0.9796     | 0.7820     |
| 0.0197        | 5.0    | 680  | 0.0650          | 0.5841   | 0.8778        | 0.9588           | nan                 | 0.9811          | 0.7744          | 0.0            | 0.9805     | 0.7717     |


### Framework versions

- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3