---
license: other
base_model: nvidia/mit-b5
tags:
- image-segmentation
- vision
- generated_from_trainer
model-index:
- name: new_ecc_segformer
  results: []
---


# new_ecc_segformer

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the rishitunu/ECC_crackdataset_withsplit dataset.
It achieves the following results on the evaluation set (a short metric-computation sketch follows the list):
- Loss: 0.0663
- Mean Iou: 0.1943
- Mean Accuracy: 0.3915
- Overall Accuracy: 0.3915
- Accuracy Background: nan
- Accuracy Crack: 0.3915
- Iou Background: 0.0
- Iou Crack: 0.3887
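
The `nan` background accuracy and `0.0` background IoU are consistent with background pixels being excluded from the accuracy computation (for example via an ignore index or `reduce_labels`), although the card itself does not say so. As a minimal sketch, per-class IoU and accuracy of this kind can be computed with the `evaluate` library's `mean_iou` metric; the toy masks, the label ids (0 = background, 1 = crack) and the `reduce_labels` setting below are assumptions for illustration only.

```python
# Minimal metric sketch, assuming label ids 0 = background and 1 = crack.
# The toy masks below are illustrative; they are not from the evaluation set.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

prediction = np.array([[0, 1, 1, 0],
                       [0, 1, 1, 0],
                       [0, 0, 1, 0],
                       [0, 0, 0, 0]])
reference = np.array([[0, 1, 1, 0],
                      [0, 1, 0, 0],
                      [0, 0, 1, 0],
                      [0, 0, 0, 0]])

results = metric.compute(
    predictions=[prediction],
    references=[reference],
    num_labels=2,
    ignore_index=255,      # pixels labelled 255 are excluded from the metrics
    reduce_labels=False,   # assumption: whether the original run shifted labels is unknown
)
print(results["mean_iou"])
print(results["per_category_iou"])       # [IoU background, IoU crack]
print(results["per_category_accuracy"])  # [accuracy background, accuracy crack]
```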

## Model description

SegFormer semantic-segmentation model with an MiT-B5 encoder ([nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5)), fine-tuned for binary crack segmentation with two labels: background and crack.

## Intended uses & limitations

The model is intended for pixel-level segmentation of cracks in images similar to the training data; its limitations and out-of-domain behaviour have not been documented.
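
As a usage sketch (not taken from the original card), inference with this checkpoint would typically look like the following; the repository id `rishitunu/new_ecc_segformer` and the image path are assumptions.

```python
# Inference sketch; repo id and image path are placeholders/assumptions.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

repo_id = "rishitunu/new_ecc_segformer"  # assumed repository id
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("surface.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
predicted_mask = upsampled.argmax(dim=1)[0]  # 2D tensor of predicted label ids
```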

## Training and evaluation data

The model was fine-tuned and evaluated on the rishitunu/ECC_crackdataset_withsplit dataset referenced above; no further details about the data (image sources, resolution, split sizes) are provided in this card.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
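
For reference, a hedged sketch of how these values map onto `transformers.TrainingArguments`; only the values listed above come from the card, while `output_dir` and `evaluation_strategy` are assumptions (the results table below reports one evaluation per epoch).

```python
# Hyperparameter sketch; output_dir and evaluation_strategy are assumed,
# all other values are taken from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="new_ecc_segformer",  # assumed
    learning_rate=6e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="polynomial",
    max_steps=10_000,
    evaluation_strategy="epoch",     # assumed from the per-epoch rows below
)
```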

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------:|:--------------:|:---------:|
| 0.0489        | 1.0   | 438   | 0.0634          | 0.1464   | 0.2933        | 0.2933           | nan                 | 0.2933         | 0.0            | 0.2929    |
| 0.0542        | 2.0   | 876   | 0.0439          | 0.1956   | 0.3917        | 0.3917           | nan                 | 0.3917         | 0.0            | 0.3912    |
| 0.0484        | 3.0   | 1314  | 0.0434          | 0.1719   | 0.3551        | 0.3551           | nan                 | 0.3551         | 0.0            | 0.3439    |
| 0.0539        | 4.0   | 1752  | 0.0447          | 0.1871   | 0.3820        | 0.3820           | nan                 | 0.3820         | 0.0            | 0.3741    |
| 0.0565        | 5.0   | 2190  | 0.0435          | 0.1888   | 0.3937        | 0.3937           | nan                 | 0.3937         | 0.0            | 0.3777    |
| 0.0544        | 6.0   | 2628  | 0.0442          | 0.1904   | 0.3930        | 0.3930           | nan                 | 0.3930         | 0.0            | 0.3808    |
| 0.0421        | 7.0   | 3066  | 0.0449          | 0.2256   | 0.4651        | 0.4651           | nan                 | 0.4651         | 0.0            | 0.4513    |
| 0.0352        | 8.0   | 3504  | 0.0587          | 0.1569   | 0.3165        | 0.3165           | nan                 | 0.3165         | 0.0            | 0.3138    |
| 0.0394        | 9.0   | 3942  | 0.0442          | 0.1842   | 0.3710        | 0.3710           | nan                 | 0.3710         | 0.0            | 0.3684    |
| 0.0445        | 10.0  | 4380  | 0.0609          | 0.1167   | 0.4173        | 0.4173           | nan                 | 0.4173         | 0.0            | 0.2334    |
| 0.0503        | 11.0  | 4818  | 0.0504          | 0.1702   | 0.3714        | 0.3714           | nan                 | 0.3714         | 0.0            | 0.3403    |
| 0.0379        | 12.0  | 5256  | 0.0460          | 0.1903   | 0.3869        | 0.3869           | nan                 | 0.3869         | 0.0            | 0.3807    |
| 0.0405        | 13.0  | 5694  | 0.0452          | 0.2017   | 0.4084        | 0.4084           | nan                 | 0.4084         | 0.0            | 0.4034    |
| 0.0367        | 14.0  | 6132  | 0.0477          | 0.1995   | 0.4060        | 0.4060           | nan                 | 0.4060         | 0.0            | 0.3990    |
| 0.0315        | 15.0  | 6570  | 0.0498          | 0.2073   | 0.4208        | 0.4208           | nan                 | 0.4208         | 0.0            | 0.4147    |
| 0.0244        | 16.0  | 7008  | 0.0486          | 0.1963   | 0.4029        | 0.4029           | nan                 | 0.4029         | 0.0            | 0.3926    |
| 0.031         | 17.0  | 7446  | 0.0568          | 0.1927   | 0.3892        | 0.3892           | nan                 | 0.3892         | 0.0            | 0.3855    |
| 0.0288        | 18.0  | 7884  | 0.0560          | 0.2033   | 0.4092        | 0.4092           | nan                 | 0.4092         | 0.0            | 0.4067    |
| 0.0354        | 19.0  | 8322  | 0.0613          | 0.2007   | 0.4056        | 0.4056           | nan                 | 0.4056         | 0.0            | 0.4013    |
| 0.0315        | 20.0  | 8760  | 0.0605          | 0.1865   | 0.3752        | 0.3752           | nan                 | 0.3752         | 0.0            | 0.3731    |
| 0.0343        | 21.0  | 9198  | 0.0653          | 0.1991   | 0.4019        | 0.4019           | nan                 | 0.4019         | 0.0            | 0.3981    |
| 0.0327        | 22.0  | 9636  | 0.0660          | 0.1945   | 0.3924        | 0.3924           | nan                 | 0.3924         | 0.0            | 0.3891    |
| 0.0252        | 22.83 | 10000 | 0.0663          | 0.1943   | 0.3915        | 0.3915           | nan                 | 0.3915         | 0.0            | 0.3887    |


### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cpu
- Datasets 2.14.6
- Tokenizers 0.14.1