---
library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-wrinkle
  results: []
---

# segformer-b0-finetuned-wrinkle

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the face-wrinkles dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0189
- Mean Iou: 0.2163
- Mean Accuracy: 0.4327
- Overall Accuracy: 0.4327
- Accuracy Unlabeled: nan
- Accuracy Wrinkle: 0.4327
- Iou Unlabeled: 0.0
- Iou Wrinkle: 0.4327

## Model description

This is a SegFormer semantic-segmentation model with the lightweight MiT-B0 encoder, fine-tuned to segment wrinkles in face images. It predicts a per-pixel label over two classes, `unlabeled` (background) and `wrinkle`, matching the per-class metrics reported above.
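
A minimal inference sketch follows. The repo id is a placeholder (this card does not state the Hub namespace), and the 0 = unlabeled / 1 = wrinkle label order is an assumption inferred from the metric columns above.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder id: substitute the actual Hub repo id or a local checkpoint path.
model_id = "your-namespace/segformer-b0-finetuned-wrinkle"
processor = SegformerImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

image = Image.open("face.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer outputs logits at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
wrinkle_mask = upsampled.argmax(dim=1)[0]  # assumed: 0 = unlabeled, 1 = wrinkle
```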

## Intended uses & limitations

The model is intended for semantic segmentation of wrinkles in face images. Two limitations are visible in the evaluation numbers themselves: the `unlabeled` class reports an IoU of 0.0 and a `nan` accuracy, so the headline Mean IoU of 0.2163 is simply the average of 0.0 and 0.4327, and Mean Accuracy coincides with Overall Accuracy because only the wrinkle class contributes. The wrinkle-class IoU of 0.4327 is the more informative single number.
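
The per-class columns above follow the output layout of the `mean_iou` metric from the `evaluate` library; whether the training script actually used it is an assumption. A minimal sketch of computing such numbers on toy masks:

```python
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# Toy 4x4 masks with the assumed label order: 0 = unlabeled, 1 = wrinkle.
references = [np.array([[0, 0, 1, 1]] * 4)]
predictions = [np.array([[0, 1, 1, 1]] * 4)]

results = mean_iou.compute(
    predictions=predictions,
    references=references,
    num_labels=2,
    ignore_index=255,  # assumed; the actual ignore index is not stated in this card
)
print(results["per_category_iou"], results["mean_iou"])
```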

## Training and evaluation data

The model was fine-tuned on the face-wrinkles dataset referenced above; this card does not document the dataset's source, size, or train/validation split. The results table does imply its scale: 1120 steps over 10 epochs at batch size 16 is 112 batches per epoch, i.e. on the order of 1,790 training images (up to a partial final batch).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch that mirrors them follows the list):
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
- mixed_precision_training: Native AMP
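
A hedged `TrainingArguments` sketch reproducing the values above; argument names follow Transformers 4.46, and anything not listed in this card (such as `output_dir`) is an assumption:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-wrinkle",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed-precision training
    eval_strategy="steps",
    eval_steps=20,  # matches the every-20-steps cadence in the results table
)
```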

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Wrinkle | Iou Unlabeled | Iou Wrinkle |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
| 0.0122        | 0.1786 | 20   | 0.0186          | 0.1899   | 0.3798        | 0.3798           | nan                | 0.3798           | 0.0           | 0.3798      |
| 0.0114        | 0.3571 | 40   | 0.0188          | 0.2007   | 0.4014        | 0.4014           | nan                | 0.4014           | 0.0           | 0.4014      |
| 0.0104        | 0.5357 | 60   | 0.0189          | 0.2127   | 0.4254        | 0.4254           | nan                | 0.4254           | 0.0           | 0.4254      |
| 0.0116        | 0.7143 | 80   | 0.0187          | 0.2215   | 0.4430        | 0.4430           | nan                | 0.4430           | 0.0           | 0.4430      |
| 0.0104        | 0.8929 | 100  | 0.0189          | 0.1815   | 0.3630        | 0.3630           | nan                | 0.3630           | 0.0           | 0.3630      |
| 0.0151        | 1.0714 | 120  | 0.0187          | 0.1949   | 0.3898        | 0.3898           | nan                | 0.3898           | 0.0           | 0.3898      |
| 0.0155        | 1.25   | 140  | 0.0187          | 0.2073   | 0.4147        | 0.4147           | nan                | 0.4147           | 0.0           | 0.4147      |
| 0.0077        | 1.4286 | 160  | 0.0192          | 0.2406   | 0.4812        | 0.4812           | nan                | 0.4812           | 0.0           | 0.4812      |
| 0.0117        | 1.6071 | 180  | 0.0191          | 0.2391   | 0.4782        | 0.4782           | nan                | 0.4782           | 0.0           | 0.4782      |
| 0.0063        | 1.7857 | 200  | 0.0188          | 0.1787   | 0.3573        | 0.3573           | nan                | 0.3573           | 0.0           | 0.3573      |
| 0.01          | 1.9643 | 220  | 0.0185          | 0.2195   | 0.4389        | 0.4389           | nan                | 0.4389           | 0.0           | 0.4389      |
| 0.0109        | 2.1429 | 240  | 0.0191          | 0.1699   | 0.3398        | 0.3398           | nan                | 0.3398           | 0.0           | 0.3398      |
| 0.0104        | 2.3214 | 260  | 0.0191          | 0.2167   | 0.4335        | 0.4335           | nan                | 0.4335           | 0.0           | 0.4335      |
| 0.0145        | 2.5    | 280  | 0.0198          | 0.2604   | 0.5208        | 0.5208           | nan                | 0.5208           | 0.0           | 0.5208      |
| 0.0093        | 2.6786 | 300  | 0.0185          | 0.1963   | 0.3927        | 0.3927           | nan                | 0.3927           | 0.0           | 0.3927      |
| 0.0106        | 2.8571 | 320  | 0.0185          | 0.2080   | 0.4159        | 0.4159           | nan                | 0.4159           | 0.0           | 0.4159      |
| 0.007         | 3.0357 | 340  | 0.0190          | 0.1894   | 0.3787        | 0.3787           | nan                | 0.3787           | 0.0           | 0.3787      |
| 0.01          | 3.2143 | 360  | 0.0189          | 0.2194   | 0.4389        | 0.4389           | nan                | 0.4389           | 0.0           | 0.4389      |
| 0.0118        | 3.3929 | 380  | 0.0186          | 0.2312   | 0.4625        | 0.4625           | nan                | 0.4625           | 0.0           | 0.4625      |
| 0.008         | 3.5714 | 400  | 0.0189          | 0.1746   | 0.3492        | 0.3492           | nan                | 0.3492           | 0.0           | 0.3492      |
| 0.0101        | 3.75   | 420  | 0.0185          | 0.1822   | 0.3644        | 0.3644           | nan                | 0.3644           | 0.0           | 0.3644      |
| 0.0093        | 3.9286 | 440  | 0.0187          | 0.2126   | 0.4252        | 0.4252           | nan                | 0.4252           | 0.0           | 0.4252      |
| 0.008         | 4.1071 | 460  | 0.0186          | 0.2058   | 0.4116        | 0.4116           | nan                | 0.4116           | 0.0           | 0.4116      |
| 0.0134        | 4.2857 | 480  | 0.0187          | 0.2335   | 0.4669        | 0.4669           | nan                | 0.4669           | 0.0           | 0.4669      |
| 0.0119        | 4.4643 | 500  | 0.0191          | 0.1850   | 0.3700        | 0.3700           | nan                | 0.3700           | 0.0           | 0.3700      |
| 0.0064        | 4.6429 | 520  | 0.0187          | 0.1892   | 0.3785        | 0.3785           | nan                | 0.3785           | 0.0           | 0.3785      |
| 0.0087        | 4.8214 | 540  | 0.0190          | 0.2253   | 0.4506        | 0.4506           | nan                | 0.4506           | 0.0           | 0.4506      |
| 0.0122        | 5.0    | 560  | 0.0196          | 0.2598   | 0.5196        | 0.5196           | nan                | 0.5196           | 0.0           | 0.5196      |
| 0.0071        | 5.1786 | 580  | 0.0188          | 0.2224   | 0.4448        | 0.4448           | nan                | 0.4448           | 0.0           | 0.4448      |
| 0.0125        | 5.3571 | 600  | 0.0188          | 0.2051   | 0.4103        | 0.4103           | nan                | 0.4103           | 0.0           | 0.4103      |
| 0.0093        | 5.5357 | 620  | 0.0192          | 0.2410   | 0.4821        | 0.4821           | nan                | 0.4821           | 0.0           | 0.4821      |
| 0.0082        | 5.7143 | 640  | 0.0191          | 0.2291   | 0.4582        | 0.4582           | nan                | 0.4582           | 0.0           | 0.4582      |
| 0.0089        | 5.8929 | 660  | 0.0187          | 0.1993   | 0.3985        | 0.3985           | nan                | 0.3985           | 0.0           | 0.3985      |
| 0.0104        | 6.0714 | 680  | 0.0191          | 0.2049   | 0.4098        | 0.4098           | nan                | 0.4098           | 0.0           | 0.4098      |
| 0.0111        | 6.25   | 700  | 0.0187          | 0.2216   | 0.4431        | 0.4431           | nan                | 0.4431           | 0.0           | 0.4431      |
| 0.0113        | 6.4286 | 720  | 0.0196          | 0.2525   | 0.5050        | 0.5050           | nan                | 0.5050           | 0.0           | 0.5050      |
| 0.0099        | 6.6071 | 740  | 0.0189          | 0.2219   | 0.4439        | 0.4439           | nan                | 0.4439           | 0.0           | 0.4439      |
| 0.0062        | 6.7857 | 760  | 0.0187          | 0.2349   | 0.4699        | 0.4699           | nan                | 0.4699           | 0.0           | 0.4699      |
| 0.0132        | 6.9643 | 780  | 0.0188          | 0.2108   | 0.4217        | 0.4217           | nan                | 0.4217           | 0.0           | 0.4217      |
| 0.0132        | 7.1429 | 800  | 0.0190          | 0.2097   | 0.4194        | 0.4194           | nan                | 0.4194           | 0.0           | 0.4194      |
| 0.0141        | 7.3214 | 820  | 0.0187          | 0.2125   | 0.4251        | 0.4251           | nan                | 0.4251           | 0.0           | 0.4251      |
| 0.0121        | 7.5    | 840  | 0.0189          | 0.2176   | 0.4351        | 0.4351           | nan                | 0.4351           | 0.0           | 0.4351      |
| 0.0099        | 7.6786 | 860  | 0.0187          | 0.2002   | 0.4004        | 0.4004           | nan                | 0.4004           | 0.0           | 0.4004      |
| 0.0168        | 7.8571 | 880  | 0.0188          | 0.2159   | 0.4319        | 0.4319           | nan                | 0.4319           | 0.0           | 0.4319      |
| 0.0064        | 8.0357 | 900  | 0.0188          | 0.2194   | 0.4387        | 0.4387           | nan                | 0.4387           | 0.0           | 0.4387      |
| 0.0121        | 8.2143 | 920  | 0.0191          | 0.2309   | 0.4618        | 0.4618           | nan                | 0.4618           | 0.0           | 0.4618      |
| 0.0133        | 8.3929 | 940  | 0.0189          | 0.2101   | 0.4202        | 0.4202           | nan                | 0.4202           | 0.0           | 0.4202      |
| 0.0105        | 8.5714 | 960  | 0.0190          | 0.2287   | 0.4573        | 0.4573           | nan                | 0.4573           | 0.0           | 0.4573      |
| 0.0092        | 8.75   | 980  | 0.0188          | 0.2178   | 0.4356        | 0.4356           | nan                | 0.4356           | 0.0           | 0.4356      |
| 0.0124        | 8.9286 | 1000 | 0.0191          | 0.2277   | 0.4553        | 0.4553           | nan                | 0.4553           | 0.0           | 0.4553      |
| 0.0108        | 9.1071 | 1020 | 0.0189          | 0.2017   | 0.4033        | 0.4033           | nan                | 0.4033           | 0.0           | 0.4033      |
| 0.0098        | 9.2857 | 1040 | 0.0190          | 0.2271   | 0.4542        | 0.4542           | nan                | 0.4542           | 0.0           | 0.4542      |
| 0.0087        | 9.4643 | 1060 | 0.0189          | 0.2168   | 0.4335        | 0.4335           | nan                | 0.4335           | 0.0           | 0.4335      |
| 0.008         | 9.6429 | 1080 | 0.0189          | 0.2219   | 0.4438        | 0.4438           | nan                | 0.4438           | 0.0           | 0.4438      |
| 0.0071        | 9.8214 | 1100 | 0.0189          | 0.2204   | 0.4407        | 0.4407           | nan                | 0.4407           | 0.0           | 0.4407      |
| 0.0072        | 10.0   | 1120 | 0.0189          | 0.2163   | 0.4327        | 0.4327           | nan                | 0.4327           | 0.0           | 0.4327      |


### Framework versions

- Transformers 4.46.3
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3