Alex-VisTas committed
Commit 9c4a148
1 Parent(s): 57f3b43

update model card README.md

Files changed (1): README.md added (+206 lines)
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_130epochs
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.894374282433984
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_130epochs

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4197
- Accuracy: 0.8944

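The checkpoint can be exercised end to end with the Transformers `pipeline` API. The snippet below is a minimal sketch rather than part of the original training code: the Hub repository id is assumed to match the model name above (substitute a local checkpoint directory if the model is not on the Hub), and `example.jpg` is a placeholder image path.

```python
from transformers import pipeline

# Assumed Hub repo id for this fine-tuned checkpoint (or use a local directory).
model_id = "Alex-VisTas/swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_130epochs"

# Build an image-classification pipeline around the fine-tuned Swin model.
classifier = pipeline("image-classification", model=model_id)

# "example.jpg" is a placeholder; any local image path or PIL image works.
predictions = classifier("example.jpg")
print(predictions)  # e.g. [{"label": ..., "score": ...}, ...]
```
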
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

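The card does not document the underlying images beyond the generic `imagefolder` dataset type used for training and evaluation. As an illustration only, such a dataset is typically loaded with the Datasets library from a directory that contains one sub-folder per class; `path/to/woody_images` below is a hypothetical path, not the author's actual data location.

```python
from datasets import load_dataset

# "path/to/woody_images" is a placeholder; the real data directory is not
# documented in this card. Each class label lives in its own sub-folder.
dataset = load_dataset("imagefolder", data_dir="path/to/woody_images")

print(dataset)                    # DatasetDict, typically with a "train" split
print(dataset["train"].features)  # an "image" column plus a class "label" column
```
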
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 130

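For orientation, the hyperparameters above map onto the Transformers `TrainingArguments`/`Trainer` API roughly as follows. This is a hedged sketch, not the author's original script: the output directory, the assumed two-class label set, and the `train_ds`/`eval_ds` dataset objects are placeholders, and per-epoch evaluation is inferred from the results table below.

```python
from transformers import (
    AutoFeatureExtractor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

base_checkpoint = "microsoft/swin-tiny-patch4-window7-224"

feature_extractor = AutoFeatureExtractor.from_pretrained(base_checkpoint)
model = AutoModelForImageClassification.from_pretrained(
    base_checkpoint,
    num_labels=2,                  # assumption: two classes; not confirmed by the card
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

# Mirrors the hyperparameters listed above; 32 per device x 4 accumulation
# steps gives the total train batch size of 128.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_130epochs",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    num_train_epochs=130,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",
)

# train_ds and eval_ds are the prepared imagefolder splits (placeholders here).
# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_ds,
#     eval_dataset=eval_ds,
#     tokenizer=feature_extractor,
# )
# trainer.train()
```
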
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6614        | 1.0   | 61   | 0.6404          | 0.6521   |
| 0.5982        | 2.0   | 122  | 0.5548          | 0.7107   |
| 0.579         | 3.0   | 183  | 0.5390          | 0.7141   |
| 0.5621        | 4.0   | 244  | 0.4920          | 0.7623   |
| 0.5567        | 5.0   | 305  | 0.5375          | 0.7313   |
| 0.5271        | 6.0   | 366  | 0.5542          | 0.7405   |
| 0.5312        | 7.0   | 427  | 0.4573          | 0.7876   |
| 0.5477        | 8.0   | 488  | 0.4540          | 0.7784   |
| 0.5554        | 9.0   | 549  | 0.4932          | 0.7635   |
| 0.5247        | 10.0  | 610  | 0.4407          | 0.7968   |
| 0.5239        | 11.0  | 671  | 0.4479          | 0.7842   |
| 0.5294        | 12.0  | 732  | 0.4509          | 0.7910   |
| 0.531         | 13.0  | 793  | 0.4419          | 0.7933   |
| 0.5493        | 14.0  | 854  | 0.4646          | 0.7784   |
| 0.4934        | 15.0  | 915  | 0.4310          | 0.7968   |
| 0.4965        | 16.0  | 976  | 0.4449          | 0.7876   |
| 0.4946        | 17.0  | 1037 | 0.4342          | 0.8129   |
| 0.4716        | 18.0  | 1098 | 0.4129          | 0.8140   |
| 0.4679        | 19.0  | 1159 | 0.4290          | 0.8002   |
| 0.4799        | 20.0  | 1220 | 0.4356          | 0.7842   |
| 0.4744        | 21.0  | 1281 | 0.4042          | 0.8094   |
| 0.4512        | 22.0  | 1342 | 0.3953          | 0.8117   |
| 0.4633        | 23.0  | 1403 | 0.4157          | 0.7956   |
| 0.4528        | 24.0  | 1464 | 0.3920          | 0.8094   |
| 0.4427        | 25.0  | 1525 | 0.3930          | 0.8220   |
| 0.4238        | 26.0  | 1586 | 0.3891          | 0.8140   |
| 0.4257        | 27.0  | 1647 | 0.3700          | 0.8255   |
| 0.4102        | 28.0  | 1708 | 0.4122          | 0.7968   |
| 0.4505        | 29.0  | 1769 | 0.4210          | 0.7945   |
| 0.3973        | 30.0  | 1830 | 0.3923          | 0.8197   |
| 0.3824        | 31.0  | 1891 | 0.3908          | 0.8473   |
| 0.3887        | 32.0  | 1952 | 0.3897          | 0.8312   |
| 0.3723        | 33.0  | 2013 | 0.3747          | 0.8381   |
| 0.3608        | 34.0  | 2074 | 0.3706          | 0.8301   |
| 0.3718        | 35.0  | 2135 | 0.3937          | 0.8255   |
| 0.3692        | 36.0  | 2196 | 0.3984          | 0.8037   |
| 0.3533        | 37.0  | 2257 | 0.3792          | 0.8335   |
| 0.3625        | 38.0  | 2318 | 0.4070          | 0.8163   |
| 0.3633        | 39.0  | 2379 | 0.4130          | 0.8232   |
| 0.3602        | 40.0  | 2440 | 0.3996          | 0.8186   |
| 0.3557        | 41.0  | 2501 | 0.3756          | 0.8335   |
| 0.3373        | 42.0  | 2562 | 0.3914          | 0.8220   |
| 0.3102        | 43.0  | 2623 | 0.4165          | 0.8507   |
| 0.3135        | 44.0  | 2684 | 0.3852          | 0.8278   |
| 0.3286        | 45.0  | 2745 | 0.4164          | 0.8450   |
| 0.316         | 46.0  | 2806 | 0.3498          | 0.8496   |
| 0.2802        | 47.0  | 2867 | 0.3887          | 0.8462   |
| 0.3184        | 48.0  | 2928 | 0.3829          | 0.8576   |
| 0.2785        | 49.0  | 2989 | 0.3627          | 0.8485   |
| 0.2988        | 50.0  | 3050 | 0.3679          | 0.8370   |
| 0.267         | 51.0  | 3111 | 0.3528          | 0.8645   |
| 0.2907        | 52.0  | 3172 | 0.3538          | 0.8519   |
| 0.2857        | 53.0  | 3233 | 0.3593          | 0.8530   |
| 0.2651        | 54.0  | 3294 | 0.3732          | 0.8439   |
| 0.2447        | 55.0  | 3355 | 0.3441          | 0.8542   |
| 0.2542        | 56.0  | 3416 | 0.3897          | 0.8576   |
| 0.2634        | 57.0  | 3477 | 0.4082          | 0.8657   |
| 0.2505        | 58.0  | 3538 | 0.3416          | 0.8657   |
| 0.2555        | 59.0  | 3599 | 0.3725          | 0.8576   |
| 0.2466        | 60.0  | 3660 | 0.3496          | 0.8680   |
| 0.2585        | 61.0  | 3721 | 0.3214          | 0.8783   |
| 0.235         | 62.0  | 3782 | 0.3584          | 0.8737   |
| 0.215         | 63.0  | 3843 | 0.3467          | 0.8657   |
| 0.236         | 64.0  | 3904 | 0.3471          | 0.8829   |
| 0.2211        | 65.0  | 3965 | 0.3318          | 0.8863   |
| 0.1989        | 66.0  | 4026 | 0.3645          | 0.8852   |
| 0.2133        | 67.0  | 4087 | 0.3456          | 0.8898   |
| 0.2169        | 68.0  | 4148 | 0.3287          | 0.8852   |
| 0.223         | 69.0  | 4209 | 0.3182          | 0.8921   |
| 0.2379        | 70.0  | 4270 | 0.3260          | 0.8840   |
| 0.2149        | 71.0  | 4331 | 0.3230          | 0.8886   |
| 0.2007        | 72.0  | 4392 | 0.3926          | 0.8760   |
| 0.2091        | 73.0  | 4453 | 0.4133          | 0.8783   |
| 0.2229        | 74.0  | 4514 | 0.3867          | 0.8772   |
| 0.1903        | 75.0  | 4575 | 0.3594          | 0.8840   |
| 0.2124        | 76.0  | 4636 | 0.3388          | 0.8875   |
| 0.1999        | 77.0  | 4697 | 0.3305          | 0.8875   |
| 0.2053        | 78.0  | 4758 | 0.4670          | 0.8840   |
| 0.1958        | 79.0  | 4819 | 0.3468          | 0.8909   |
| 0.1839        | 80.0  | 4880 | 0.3902          | 0.8886   |
| 0.1715        | 81.0  | 4941 | 0.3830          | 0.8875   |
| 0.1803        | 82.0  | 5002 | 0.3134          | 0.8967   |
| 0.1803        | 83.0  | 5063 | 0.3935          | 0.8909   |
| 0.1865        | 84.0  | 5124 | 0.3882          | 0.8863   |
| 0.1884        | 85.0  | 5185 | 0.3485          | 0.8990   |
| 0.1663        | 86.0  | 5246 | 0.3667          | 0.8944   |
| 0.1665        | 87.0  | 5307 | 0.3545          | 0.8932   |
| 0.1556        | 88.0  | 5368 | 0.3882          | 0.8944   |
| 0.18          | 89.0  | 5429 | 0.3751          | 0.8898   |
| 0.1974        | 90.0  | 5490 | 0.3979          | 0.8863   |
| 0.1622        | 91.0  | 5551 | 0.3623          | 0.8967   |
| 0.1657        | 92.0  | 5612 | 0.3855          | 0.8978   |
| 0.1672        | 93.0  | 5673 | 0.3722          | 0.8944   |
| 0.1807        | 94.0  | 5734 | 0.3994          | 0.8932   |
| 0.1419        | 95.0  | 5795 | 0.4017          | 0.8863   |
| 0.178         | 96.0  | 5856 | 0.4168          | 0.8886   |
| 0.1402        | 97.0  | 5917 | 0.3727          | 0.8944   |
| 0.1427        | 98.0  | 5978 | 0.3919          | 0.8967   |
| 0.1318        | 99.0  | 6039 | 0.3843          | 0.8955   |
| 0.1417        | 100.0 | 6100 | 0.4017          | 0.8898   |
| 0.1536        | 101.0 | 6161 | 0.3613          | 0.8955   |
| 0.1631        | 102.0 | 6222 | 0.3377          | 0.9047   |
| 0.1459        | 103.0 | 6283 | 0.3724          | 0.8967   |
| 0.1499        | 104.0 | 6344 | 0.3934          | 0.8955   |
| 0.1572        | 105.0 | 6405 | 0.3368          | 0.8967   |
| 0.1308        | 106.0 | 6466 | 0.3782          | 0.8990   |
| 0.1535        | 107.0 | 6527 | 0.3306          | 0.9024   |
| 0.125         | 108.0 | 6588 | 0.4076          | 0.8898   |
| 0.1339        | 109.0 | 6649 | 0.3628          | 0.8990   |
| 0.148         | 110.0 | 6710 | 0.3672          | 0.9013   |
| 0.1725        | 111.0 | 6771 | 0.4006          | 0.8909   |
| 0.1326        | 112.0 | 6832 | 0.4117          | 0.8921   |
| 0.1438        | 113.0 | 6893 | 0.3927          | 0.8978   |
| 0.1205        | 114.0 | 6954 | 0.3612          | 0.8990   |
| 0.1531        | 115.0 | 7015 | 0.3594          | 0.8932   |
| 0.1473        | 116.0 | 7076 | 0.4490          | 0.8875   |
| 0.1388        | 117.0 | 7137 | 0.3952          | 0.8921   |
| 0.136         | 118.0 | 7198 | 0.4098          | 0.8921   |
| 0.1579        | 119.0 | 7259 | 0.3595          | 0.9013   |
| 0.1359        | 120.0 | 7320 | 0.3970          | 0.8944   |
| 0.1314        | 121.0 | 7381 | 0.4092          | 0.8932   |
| 0.1337        | 122.0 | 7442 | 0.4192          | 0.8909   |
| 0.1538        | 123.0 | 7503 | 0.4154          | 0.8898   |
| 0.119         | 124.0 | 7564 | 0.4120          | 0.8909   |
| 0.1353        | 125.0 | 7625 | 0.4060          | 0.8921   |
| 0.1489        | 126.0 | 7686 | 0.4162          | 0.8909   |
| 0.1554        | 127.0 | 7747 | 0.4148          | 0.8944   |
| 0.1558        | 128.0 | 7808 | 0.4169          | 0.8944   |
| 0.1268        | 129.0 | 7869 | 0.4110          | 0.8955   |
| 0.1236        | 130.0 | 7930 | 0.4197          | 0.8944   |


### Framework versions

- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1