sumeyya committed
Commit e6b9bcb
1 Parent(s): 846779d

End of training
README.md CHANGED
@@ -1,199 +1,275 @@
  ---
  library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
  ---
  library_name: transformers
+ license: other
+ base_model: nvidia/mit-b0
+ tags:
+ - vision
+ - image-segmentation
+ - generated_from_trainer
+ model-index:
+ - name: segformer-b0-finetuned-Eduardo-food103-GOOGLE100
+   results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b0-finetuned-Eduardo-food103-GOOGLE100
+
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the EduardoPacheco/FoodSeg103 dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 2.1878
+ - Mean Iou: 0.0726
+ - Mean Accuracy: 0.1651
+ - Overall Accuracy: 0.2308
+ - Accuracy Background: nan
+ - Accuracy Candy: nan
+ - Accuracy Egg tart: nan
+ - Accuracy French fries: 0.0
+ - Accuracy Chocolate: 0.0
+ - Accuracy Biscuit: 0.0
+ - Accuracy Popcorn: nan
+ - Accuracy Pudding: nan
+ - Accuracy Ice cream: 0.0
+ - Accuracy Cheese butter: 0.0
+ - Accuracy Cake: 0.0
+ - Accuracy Wine: nan
+ - Accuracy Milkshake: nan
+ - Accuracy Coffee: nan
+ - Accuracy Juice: nan
+ - Accuracy Milk: nan
+ - Accuracy Tea: nan
+ - Accuracy Almond: nan
+ - Accuracy Red beans: nan
+ - Accuracy Cashew: nan
+ - Accuracy Dried cranberries: nan
+ - Accuracy Soy: nan
+ - Accuracy Walnut: nan
+ - Accuracy Peanut: nan
+ - Accuracy Egg: 0.0
+ - Accuracy Apple: nan
+ - Accuracy Date: nan
+ - Accuracy Apricot: nan
+ - Accuracy Avocado: 0.0
+ - Accuracy Banana: 0.0
+ - Accuracy Strawberry: 0.0028
+ - Accuracy Cherry: nan
+ - Accuracy Blueberry: nan
+ - Accuracy Raspberry: nan
+ - Accuracy Mango: nan
+ - Accuracy Olives: nan
+ - Accuracy Peach: nan
+ - Accuracy Lemon: 0.0
+ - Accuracy Pear: nan
+ - Accuracy Fig: nan
+ - Accuracy Pineapple: 0.0
+ - Accuracy Grape: nan
+ - Accuracy Kiwi: nan
+ - Accuracy Melon: nan
+ - Accuracy Orange: 0.0129
+ - Accuracy Watermelon: nan
+ - Accuracy Steak: 0.2722
+ - Accuracy Pork: 0.0
+ - Accuracy Chicken duck: 0.4405
+ - Accuracy Sausage: nan
+ - Accuracy Fried meat: 0.0
+ - Accuracy Lamb: nan
+ - Accuracy Sauce: 0.0336
+ - Accuracy Crab: nan
+ - Accuracy Fish: 0.0
+ - Accuracy Shellfish: nan
+ - Accuracy Shrimp: nan
+ - Accuracy Soup: 0.0
+ - Accuracy Bread: 0.9251
+ - Accuracy Corn: 0.8612
+ - Accuracy Hamburg: nan
+ - Accuracy Pizza: nan
+ - Accuracy hanamaki baozi: nan
+ - Accuracy Wonton dumplings: nan
+ - Accuracy Pasta: nan
+ - Accuracy Noodles: nan
+ - Accuracy Rice: 0.9597
+ - Accuracy Pie: 0.0803
+ - Accuracy Tofu: nan
+ - Accuracy Eggplant: nan
+ - Accuracy Potato: 0.0025
+ - Accuracy Garlic: nan
+ - Accuracy Cauliflower: 0.0
+ - Accuracy Tomato: 0.9197
+ - Accuracy Kelp: nan
+ - Accuracy Seaweed: nan
+ - Accuracy Spring onion: nan
+ - Accuracy Rape: nan
+ - Accuracy Ginger: nan
+ - Accuracy Okra: nan
+ - Accuracy Lettuce: 0.0
+ - Accuracy Pumpkin: nan
+ - Accuracy Cucumber: 0.0
+ - Accuracy White radish: nan
+ - Accuracy Carrot: 0.7713
+ - Accuracy Asparagus: 0.0
+ - Accuracy Bamboo shoots: nan
+ - Accuracy Broccoli: 0.8238
+ - Accuracy Celery stick: 0.0
+ - Accuracy Cilantro mint: 0.0042
+ - Accuracy Snow peas: nan
+ - Accuracy cabbage: nan
+ - Accuracy Bean sprouts: nan
+ - Accuracy Onion: 0.0
+ - Accuracy Pepper: nan
+ - Accuracy Green beans: nan
+ - Accuracy French beans: 0.0
+ - Accuracy King oyster mushroom: nan
+ - Accuracy Shiitake: nan
+ - Accuracy Enoki mushroom: nan
+ - Accuracy Oyster mushroom: nan
+ - Accuracy White button mushroom: 0.0
+ - Accuracy Salad: nan
+ - Accuracy Other ingredients: nan
+ - Iou Background: 0.0
+ - Iou Candy: nan
+ - Iou Egg tart: nan
+ - Iou French fries: 0.0
+ - Iou Chocolate: 0.0
+ - Iou Biscuit: 0.0
+ - Iou Popcorn: nan
+ - Iou Pudding: nan
+ - Iou Ice cream: 0.0
+ - Iou Cheese butter: 0.0
+ - Iou Cake: 0.0
+ - Iou Wine: nan
+ - Iou Milkshake: nan
+ - Iou Coffee: nan
+ - Iou Juice: 0.0
+ - Iou Milk: nan
+ - Iou Tea: nan
+ - Iou Almond: nan
+ - Iou Red beans: nan
+ - Iou Cashew: nan
+ - Iou Dried cranberries: nan
+ - Iou Soy: nan
+ - Iou Walnut: nan
+ - Iou Peanut: nan
+ - Iou Egg: 0.0
+ - Iou Apple: nan
+ - Iou Date: nan
+ - Iou Apricot: nan
+ - Iou Avocado: 0.0
+ - Iou Banana: 0.0
+ - Iou Strawberry: 0.0028
+ - Iou Cherry: nan
+ - Iou Blueberry: nan
+ - Iou Raspberry: nan
+ - Iou Mango: nan
+ - Iou Olives: nan
+ - Iou Peach: nan
+ - Iou Lemon: 0.0
+ - Iou Pear: nan
+ - Iou Fig: nan
+ - Iou Pineapple: 0.0
+ - Iou Grape: nan
+ - Iou Kiwi: nan
+ - Iou Melon: nan
+ - Iou Orange: 0.0124
+ - Iou Watermelon: nan
+ - Iou Steak: 0.1972
+ - Iou Pork: 0.0
+ - Iou Chicken duck: 0.1551
+ - Iou Sausage: nan
+ - Iou Fried meat: 0.0
+ - Iou Lamb: nan
+ - Iou Sauce: 0.0217
+ - Iou Crab: nan
+ - Iou Fish: 0.0
+ - Iou Shellfish: nan
+ - Iou Shrimp: nan
+ - Iou Soup: 0.0
+ - Iou Bread: 0.3244
+ - Iou Corn: 0.8612
+ - Iou Hamburg: nan
+ - Iou Pizza: nan
+ - Iou hanamaki baozi: nan
+ - Iou Wonton dumplings: nan
+ - Iou Pasta: nan
+ - Iou Noodles: nan
+ - Iou Rice: 0.1760
+ - Iou Pie: 0.0180
+ - Iou Tofu: nan
+ - Iou Eggplant: nan
+ - Iou Potato: 0.0011
+ - Iou Garlic: nan
+ - Iou Cauliflower: 0.0
+ - Iou Tomato: 0.1271
+ - Iou Kelp: nan
+ - Iou Seaweed: nan
+ - Iou Spring onion: nan
+ - Iou Rape: nan
+ - Iou Ginger: nan
+ - Iou Okra: nan
+ - Iou Lettuce: 0.0
+ - Iou Pumpkin: nan
+ - Iou Cucumber: 0.0
+ - Iou White radish: nan
+ - Iou Carrot: 0.6337
+ - Iou Asparagus: 0.0
+ - Iou Bamboo shoots: nan
+ - Iou Broccoli: 0.3690
+ - Iou Celery stick: 0.0
+ - Iou Cilantro mint: 0.0042
+ - Iou Snow peas: nan
+ - Iou cabbage: nan
+ - Iou Bean sprouts: nan
+ - Iou Onion: 0.0
+ - Iou Pepper: 0.0
+ - Iou Green beans: nan
+ - Iou French beans: 0.0
+ - Iou King oyster mushroom: nan
+ - Iou Shiitake: nan
+ - Iou Enoki mushroom: nan
+ - Iou Oyster mushroom: nan
+ - Iou White button mushroom: 0.0
+ - Iou Salad: nan
+ - Iou Other ingredients: nan
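The per-class Accuracy and Iou entries above are the standard per-category breakdown reported alongside mean IoU for semantic segmentation (nan marks classes absent from the evaluation split). A minimal sketch of how such numbers are typically computed with the 🤗 `evaluate` library, using illustrative toy masks rather than the actual evaluation code:

```python
import numpy as np
import evaluate

# Toy 2x2 label maps standing in for full-resolution prediction and
# ground-truth masks over the 104 FoodSeg103 classes.
predictions = [np.array([[58, 58], [66, 0]])]  # bread, bread, rice, background
references = [np.array([[58, 73], [66, 0]])]   # ground truth (73 = tomato)

mean_iou = evaluate.load("mean_iou")
results = mean_iou.compute(
    predictions=predictions,
    references=references,
    num_labels=104,
    ignore_index=255,  # matches semantic_loss_ignore_index in config.json
)
print(results["mean_iou"], results["overall_accuracy"])
```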
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a `TrainingArguments` sketch is given after the list):
+ - learning_rate: 6e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
+ - lr_scheduler_type: linear
+ - num_epochs: 50
+
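A hedged reconstruction of these settings as 🤗 `TrainingArguments`; the `output_dir` and any evaluation, logging, or checkpointing options are assumptions, not taken from the commit:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-Eduardo-food103-GOOGLE100",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```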
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Candy | Accuracy Egg tart | Accuracy French fries | Accuracy Chocolate | Accuracy Biscuit | Accuracy Popcorn | Accuracy Pudding | Accuracy Ice cream | Accuracy Cheese butter | Accuracy Cake | Accuracy Wine | Accuracy Milkshake | Accuracy Coffee | Accuracy Juice | Accuracy Milk | Accuracy Tea | Accuracy Almond | Accuracy Red beans | Accuracy Cashew | Accuracy Dried cranberries | Accuracy Soy | Accuracy Walnut | Accuracy Peanut | Accuracy Egg | Accuracy Apple | Accuracy Date | Accuracy Apricot | Accuracy Avocado | Accuracy Banana | Accuracy Strawberry | Accuracy Cherry | Accuracy Blueberry | Accuracy Raspberry | Accuracy Mango | Accuracy Olives | Accuracy Peach | Accuracy Lemon | Accuracy Pear | Accuracy Fig | Accuracy Pineapple | Accuracy Grape | Accuracy Kiwi | Accuracy Melon | Accuracy Orange | Accuracy Watermelon | Accuracy Steak | Accuracy Pork | Accuracy Chicken duck | Accuracy Sausage | Accuracy Fried meat | Accuracy Lamb | Accuracy Sauce | Accuracy Crab | Accuracy Fish | Accuracy Shellfish | Accuracy Shrimp | Accuracy Soup | Accuracy Bread | Accuracy Corn | Accuracy Hamburg | Accuracy Pizza | Accuracy hanamaki baozi | Accuracy Wonton dumplings | Accuracy Pasta | Accuracy Noodles | Accuracy Rice | Accuracy Pie | Accuracy Tofu | Accuracy Eggplant | Accuracy Potato | Accuracy Garlic | Accuracy Cauliflower | Accuracy Tomato | Accuracy Kelp | Accuracy Seaweed | Accuracy Spring onion | Accuracy Rape | Accuracy Ginger | Accuracy Okra | Accuracy Lettuce | Accuracy Pumpkin | Accuracy Cucumber | Accuracy White radish | Accuracy Carrot | Accuracy Asparagus | Accuracy Bamboo shoots | Accuracy Broccoli | Accuracy Celery stick | Accuracy Cilantro mint | Accuracy Snow peas | Accuracy cabbage | Accuracy Bean sprouts | Accuracy Onion | Accuracy Pepper | Accuracy Green beans | Accuracy French beans | Accuracy King oyster mushroom | Accuracy Shiitake | Accuracy Enoki mushroom | Accuracy Oyster mushroom | Accuracy White button mushroom | Accuracy Salad | Accuracy Other ingredients | Iou Background | Iou Candy | Iou Egg tart | Iou French fries | Iou Chocolate | Iou Biscuit | Iou Popcorn | Iou Pudding | Iou Ice cream | Iou Cheese butter | Iou Cake | Iou Wine | Iou Milkshake | Iou Coffee | Iou Juice | Iou Milk | Iou Tea | Iou Almond | Iou Red beans | Iou Cashew | Iou Dried cranberries | Iou Soy | Iou Walnut | Iou Peanut | Iou Egg | Iou Apple | Iou Date | Iou Apricot | Iou Avocado | Iou Banana | Iou Strawberry | Iou Cherry | Iou Blueberry | Iou Raspberry | Iou Mango | Iou Olives | Iou Peach | Iou Lemon | Iou Pear | Iou Fig | Iou Pineapple | Iou Grape | Iou Kiwi | Iou Melon | Iou Orange | Iou Watermelon | Iou Steak | Iou Pork | Iou Chicken duck | Iou Sausage | Iou Fried meat | Iou Lamb | Iou Sauce | Iou Crab | Iou Fish | Iou Shellfish | Iou Shrimp | Iou Soup | Iou Bread | Iou Corn | Iou Hamburg | Iou Pizza | Iou hanamaki baozi | Iou Wonton dumplings | Iou Pasta | Iou Noodles | Iou Rice | Iou Pie | Iou Tofu | Iou Eggplant | Iou Potato | Iou Garlic | Iou Cauliflower | Iou Tomato | Iou Kelp | Iou Seaweed | Iou Spring onion | Iou Rape | Iou Ginger | Iou Okra | Iou Lettuce | Iou Pumpkin | Iou Cucumber | Iou White radish | Iou Carrot | Iou Asparagus | Iou Bamboo shoots | Iou Broccoli | Iou Celery stick | Iou Cilantro mint | Iou Snow peas | Iou cabbage | Iou Bean sprouts | Iou Onion | Iou Pepper | Iou Green beans | Iou French beans | Iou King oyster mushroom | Iou Shiitake | Iou Enoki 
mushroom | Iou Oyster mushroom | Iou White button mushroom | Iou Salad | Iou Other ingredients |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------:|:-----------------:|:---------------------:|:------------------:|:----------------:|:----------------:|:----------------:|:------------------:|:----------------------:|:-------------:|:-------------:|:------------------:|:---------------:|:--------------:|:-------------:|:------------:|:---------------:|:------------------:|:---------------:|:--------------------------:|:------------:|:---------------:|:---------------:|:------------:|:--------------:|:-------------:|:----------------:|:----------------:|:---------------:|:-------------------:|:---------------:|:------------------:|:------------------:|:--------------:|:---------------:|:--------------:|:--------------:|:-------------:|:------------:|:------------------:|:--------------:|:-------------:|:--------------:|:---------------:|:-------------------:|:--------------:|:-------------:|:---------------------:|:----------------:|:-------------------:|:-------------:|:--------------:|:-------------:|:-------------:|:------------------:|:---------------:|:-------------:|:--------------:|:-------------:|:----------------:|:--------------:|:------------------------:|:-------------------------:|:--------------:|:----------------:|:-------------:|:------------:|:-------------:|:-----------------:|:---------------:|:---------------:|:--------------------:|:---------------:|:-------------:|:----------------:|:---------------------:|:-------------:|:---------------:|:-------------:|:----------------:|:----------------:|:-----------------:|:---------------------:|:---------------:|:------------------:|:----------------------:|:-----------------:|:---------------------:|:----------------------:|:------------------:|:-----------------:|:---------------------:|:--------------:|:---------------:|:--------------------:|:---------------------:|:-----------------------------:|:-----------------:|:-----------------------:|:------------------------:|:------------------------------:|:--------------:|:--------------------------:|:--------------:|:---------:|:------------:|:----------------:|:-------------:|:-----------:|:-----------:|:-----------:|:-------------:|:-----------------:|:--------:|:--------:|:-------------:|:----------:|:---------:|:--------:|:-------:|:----------:|:-------------:|:----------:|:---------------------:|:-------:|:----------:|:----------:|:-------:|:---------:|:--------:|:-----------:|:-----------:|:----------:|:--------------:|:----------:|:-------------:|:-------------:|:---------:|:----------:|:---------:|:---------:|:--------:|:-------:|:-------------:|:---------:|:--------:|:---------:|:----------:|:--------------:|:---------:|:--------:|:----------------:|:-----------:|:--------------:|:--------:|:---------:|:--------:|:--------:|:-------------:|:----------:|:--------:|:---------:|:--------:|:-----------:|:---------:|:-------------------:|:--------------------:|:---------:|:-----------:|:--------:|:-------:|:--------:|:------------:|:----------:|:----------:|:---------------:|:----------:|:--------:|:-----------:|:----------------:|:--------:|:----------:|:--------:|:-----------:|:-----------:|:------------:|:----------------:|:----------:|:-------------:|:-----------------:|:------------:|:----------------:|:-----------------:|:-------------:|:------------:|:----------------:|:---------:|:----------:|:---------------:|:----------------:|:------------------------:|:------------:|:---------
---------:|:-------------------:|:-------------------------:|:---------:|:---------------------:|
+ | 3.1829 | 10.0 | 100 | 2.9969 | 0.0442 | 0.1289 | 0.1903 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0011 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0149 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0439 | nan | 0.0011 | 0.0 | 0.2629 | nan | 0.0 | nan | 0.0226 | nan | 0.0 | nan | nan | 0.0 | 0.9498 | 0.7303 | nan | nan | nan | nan | nan | nan | 0.5457 | 0.0050 | nan | nan | 0.0 | nan | 0.0 | 0.5701 | nan | nan | nan | nan | nan | nan | 0.0120 | nan | 0.0 | nan | 0.5662 | 0.0 | nan | 0.7966 | 0.0 | 0.0018 | nan | nan | nan | 0.0 | nan | nan | 0.2437 | nan | nan | nan | nan | 0.0021 | nan | nan | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0011 | 0.0 | 0.0 | nan | nan | nan | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0143 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0167 | nan | 0.0008 | 0.0 | 0.0953 | 0.0 | 0.0 | nan | 0.0210 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.1913 | 0.7303 | nan | nan | nan | nan | nan | 0.0 | 0.0686 | 0.0032 | nan | nan | 0.0 | nan | 0.0 | 0.1095 | nan | 0.0 | nan | nan | nan | nan | 0.0074 | nan | 0.0 | nan | 0.4240 | 0.0 | nan | 0.3995 | 0.0 | 0.0018 | nan | nan | nan | 0.0 | 0.0 | 0.0 | 0.0352 | nan | nan | nan | 0.0 | 0.0020 | 0.0 | nan |
+ | 2.4226 | 20.0 | 200 | 2.4832 | 0.0572 | 0.1463 | 0.2036 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0038 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0006 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0517 | nan | 0.0 | 0.0 | 0.3696 | nan | 0.0 | nan | 0.0362 | nan | 0.0 | nan | nan | 0.0 | 0.9217 | 0.8128 | nan | nan | nan | nan | nan | nan | 0.9160 | 0.0170 | nan | nan | 0.0045 | nan | 0.0 | 0.8978 | nan | nan | nan | nan | nan | nan | 0.0277 | nan | 0.0 | nan | 0.5100 | 0.0 | nan | 0.8429 | 0.0 | 0.0 | nan | nan | nan | 0.0 | nan | nan | 0.0005 | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0038 | 0.0 | 0.0 | nan | nan | nan | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0006 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0501 | nan | 0.0 | 0.0 | 0.1509 | nan | 0.0 | nan | 0.0257 | nan | 0.0 | nan | nan | 0.0 | 0.2285 | 0.8128 | nan | nan | nan | nan | nan | nan | 0.0974 | 0.0074 | nan | nan | 0.0045 | nan | 0.0 | 0.1040 | nan | nan | nan | nan | nan | nan | 0.0170 | nan | 0.0 | nan | 0.3959 | 0.0 | nan | 0.3894 | 0.0 | 0.0 | nan | nan | nan | 0.0 | 0.0 | nan | 0.0001 | nan | nan | nan | nan | 0.0 | nan | nan |
+ | 2.0416 | 30.0 | 300 | 2.3190 | 0.0654 | 0.1581 | 0.2218 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0008 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0011 | nan | 0.0561 | 0.0 | 0.4236 | nan | 0.0 | nan | 0.0523 | nan | 0.0 | nan | nan | 0.0 | 0.9272 | 0.8698 | nan | nan | nan | nan | nan | nan | 0.9557 | 0.0533 | nan | nan | 0.0029 | nan | 0.0 | 0.9382 | nan | nan | nan | nan | nan | nan | 0.0539 | nan | 0.0 | nan | 0.6930 | 0.0 | nan | 0.8183 | 0.0 | 0.0036 | nan | nan | nan | 0.0 | nan | nan | 0.0001 | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0008 | 0.0 | 0.0 | nan | nan | nan | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0011 | nan | 0.0524 | 0.0 | 0.1551 | nan | 0.0 | nan | 0.0389 | nan | 0.0 | nan | nan | 0.0 | 0.3134 | 0.8698 | nan | nan | nan | nan | nan | nan | 0.1152 | 0.0125 | nan | nan | 0.0019 | nan | 0.0 | 0.1085 | nan | nan | nan | nan | nan | nan | 0.0408 | nan | 0.0 | nan | 0.5407 | 0.0 | nan | 0.3600 | 0.0 | 0.0036 | nan | nan | nan | 0.0 | 0.0 | nan | 0.0000 | nan | nan | nan | nan | 0.0 | nan | nan |
+ | 1.8426 | 40.0 | 400 | 2.2506 | 0.0690 | 0.1601 | 0.2263 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0033 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0001 | nan | 0.1714 | 0.0 | 0.4262 | nan | 0.0 | nan | 0.0335 | nan | 0.0 | nan | nan | 0.0 | 0.9367 | 0.8517 | nan | nan | nan | nan | nan | nan | 0.9650 | 0.0358 | nan | nan | 0.0 | nan | 0.0 | 0.9186 | nan | nan | nan | nan | nan | nan | 0.0 | nan | 0.0 | nan | 0.7250 | 0.0 | nan | 0.8542 | 0.0 | 0.0030 | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | nan | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0032 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0001 | nan | 0.1422 | 0.0 | 0.1508 | nan | 0.0 | nan | 0.0229 | nan | 0.0 | nan | nan | 0.0 | 0.2999 | 0.8517 | nan | nan | nan | nan | nan | nan | 0.1834 | 0.0081 | nan | nan | 0.0 | nan | 0.0 | 0.1171 | nan | nan | nan | nan | nan | nan | 0.0 | nan | 0.0 | nan | 0.6041 | 0.0 | nan | 0.3724 | 0.0 | 0.0030 | nan | nan | nan | 0.0 | 0.0 | nan | 0.0 | nan | nan | nan | nan | 0.0 | nan | nan |
+ | 1.7476 | 50.0 | 500 | 2.1878 | 0.0726 | 0.1651 | 0.2308 | nan | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0028 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0129 | nan | 0.2722 | 0.0 | 0.4405 | nan | 0.0 | nan | 0.0336 | nan | 0.0 | nan | nan | 0.0 | 0.9251 | 0.8612 | nan | nan | nan | nan | nan | nan | 0.9597 | 0.0803 | nan | nan | 0.0025 | nan | 0.0 | 0.9197 | nan | nan | nan | nan | nan | nan | 0.0 | nan | 0.0 | nan | 0.7713 | 0.0 | nan | 0.8238 | 0.0 | 0.0042 | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | 0.0 | 0.0 | 0.0 | nan | nan | nan | 0.0 | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | 0.0 | 0.0 | 0.0028 | nan | nan | nan | nan | nan | nan | 0.0 | nan | nan | 0.0 | nan | nan | nan | 0.0124 | nan | 0.1972 | 0.0 | 0.1551 | nan | 0.0 | nan | 0.0217 | nan | 0.0 | nan | nan | 0.0 | 0.3244 | 0.8612 | nan | nan | nan | nan | nan | nan | 0.1760 | 0.0180 | nan | nan | 0.0011 | nan | 0.0 | 0.1271 | nan | nan | nan | nan | nan | nan | 0.0 | nan | 0.0 | nan | 0.6337 | 0.0 | nan | 0.3690 | 0.0 | 0.0042 | nan | nan | nan | 0.0 | 0.0 | nan | 0.0 | nan | nan | nan | nan | 0.0 | nan | nan |
+
+
+ ### Framework versions
+
+ - Transformers 4.46.3
+ - PyTorch 2.5.1+cu121
+ - Datasets 3.2.0
+ - Tokenizers 0.20.3
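For completeness, a minimal inference sketch for this checkpoint using the standard `transformers` SegFormer API; the repo id is inferred from the model name above and the image path is a placeholder:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "sumeyya/segformer-b0-finetuned-Eduardo-food103-GOOGLE100"  # assumed Hub path

# The commit adds no preprocessor_config.json, so use the default SegFormer
# image processor (512x512 resize, ImageNet normalisation).
processor = SegformerImageProcessor()
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("food_photo.jpg").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 104, H/4, W/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]
print({model.config.id2label[i] for i in segmentation.unique().tolist()})
```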
config.json ADDED
@@ -0,0 +1,282 @@
+ {
+   "_name_or_path": "nvidia/mit-b0",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [
+     2,
+     2,
+     2,
+     2
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     32,
+     64,
+     160,
+     256
+   ],
+   "id2label": {
+     "0": "background",
+     "1": "candy",
+     "2": "egg tart",
+     "3": "french fries",
+     "4": "chocolate",
+     "5": "biscuit",
+     "6": "popcorn",
+     "7": "pudding",
+     "8": "ice cream",
+     "9": "cheese butter",
+     "10": "cake",
+     "11": "wine",
+     "12": "milkshake",
+     "13": "coffee",
+     "14": "juice",
+     "15": "milk",
+     "16": "tea",
+     "17": "almond",
+     "18": "red beans",
+     "19": "cashew",
+     "20": "dried cranberries",
+     "21": "soy",
+     "22": "walnut",
+     "23": "peanut",
+     "24": "egg",
+     "25": "apple",
+     "26": "date",
+     "27": "apricot",
+     "28": "avocado",
+     "29": "banana",
+     "30": "strawberry",
+     "31": "cherry",
+     "32": "blueberry",
+     "33": "raspberry",
+     "34": "mango",
+     "35": "olives",
+     "36": "peach",
+     "37": "lemon",
+     "38": "pear",
+     "39": "fig",
+     "40": "pineapple",
+     "41": "grape",
+     "42": "kiwi",
+     "43": "melon",
+     "44": "orange",
+     "45": "watermelon",
+     "46": "steak",
+     "47": "pork",
+     "48": "chicken duck",
+     "49": "sausage",
+     "50": "fried meat",
+     "51": "lamb",
+     "52": "sauce",
+     "53": "crab",
+     "54": "fish",
+     "55": "shellfish",
+     "56": "shrimp",
+     "57": "soup",
+     "58": "bread",
+     "59": "corn",
+     "60": "hamburg",
+     "61": "pizza",
+     "62": " hanamaki baozi",
+     "63": "wonton dumplings",
+     "64": "pasta",
+     "65": "noodles",
+     "66": "rice",
+     "67": "pie",
+     "68": "tofu",
+     "69": "eggplant",
+     "70": "potato",
+     "71": "garlic",
+     "72": "cauliflower",
+     "73": "tomato",
+     "74": "kelp",
+     "75": "seaweed",
+     "76": "spring onion",
+     "77": "rape",
+     "78": "ginger",
+     "79": "okra",
+     "80": "lettuce",
+     "81": "pumpkin",
+     "82": "cucumber",
+     "83": "white radish",
+     "84": "carrot",
+     "85": "asparagus",
+     "86": "bamboo shoots",
+     "87": "broccoli",
+     "88": "celery stick",
+     "89": "cilantro mint",
+     "90": "snow peas",
+     "91": " cabbage",
+     "92": "bean sprouts",
+     "93": "onion",
+     "94": "pepper",
+     "95": "green beans",
+     "96": "French beans",
+     "97": "king oyster mushroom",
+     "98": "shiitake",
+     "99": "enoki mushroom",
+     "100": "oyster mushroom",
+     "101": "white button mushroom",
+     "102": "salad",
+     "103": "other ingredients"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     " cabbage": 91,
+     " hanamaki baozi": 62,
+     "French beans": 96,
+     "almond": 17,
+     "apple": 25,
+     "apricot": 27,
+     "asparagus": 85,
+     "avocado": 28,
+     "background": 0,
+     "bamboo shoots": 86,
+     "banana": 29,
+     "bean sprouts": 92,
+     "biscuit": 5,
+     "blueberry": 32,
+     "bread": 58,
+     "broccoli": 87,
+     "cake": 10,
+     "candy": 1,
+     "carrot": 84,
+     "cashew": 19,
+     "cauliflower": 72,
+     "celery stick": 88,
+     "cheese butter": 9,
+     "cherry": 31,
+     "chicken duck": 48,
+     "chocolate": 4,
+     "cilantro mint": 89,
+     "coffee": 13,
+     "corn": 59,
+     "crab": 53,
+     "cucumber": 82,
+     "date": 26,
+     "dried cranberries": 20,
+     "egg": 24,
+     "egg tart": 2,
+     "eggplant": 69,
+     "enoki mushroom": 99,
+     "fig": 39,
+     "fish": 54,
+     "french fries": 3,
+     "fried meat": 50,
+     "garlic": 71,
+     "ginger": 78,
+     "grape": 41,
+     "green beans": 95,
+     "hamburg": 60,
+     "ice cream": 8,
+     "juice": 14,
+     "kelp": 74,
+     "king oyster mushroom": 97,
+     "kiwi": 42,
+     "lamb": 51,
+     "lemon": 37,
+     "lettuce": 80,
+     "mango": 34,
+     "melon": 43,
+     "milk": 15,
+     "milkshake": 12,
+     "noodles": 65,
+     "okra": 79,
+     "olives": 35,
+     "onion": 93,
+     "orange": 44,
+     "other ingredients": 103,
+     "oyster mushroom": 100,
+     "pasta": 64,
+     "peach": 36,
+     "peanut": 23,
+     "pear": 38,
+     "pepper": 94,
+     "pie": 67,
+     "pineapple": 40,
+     "pizza": 61,
+     "popcorn": 6,
+     "pork": 47,
+     "potato": 70,
+     "pudding": 7,
+     "pumpkin": 81,
+     "rape": 77,
+     "raspberry": 33,
+     "red beans": 18,
+     "rice": 66,
+     "salad": 102,
+     "sauce": 52,
+     "sausage": 49,
+     "seaweed": 75,
+     "shellfish": 55,
+     "shiitake": 98,
+     "shrimp": 56,
+     "snow peas": 90,
+     "soup": 57,
+     "soy": 21,
+     "spring onion": 76,
+     "steak": 46,
+     "strawberry": 30,
+     "tea": 16,
+     "tofu": 68,
+     "tomato": 73,
+     "walnut": 22,
+     "watermelon": 45,
+     "white button mushroom": 101,
+     "white radish": 83,
+     "wine": 11,
+     "wonton dumplings": 63
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.46.3"
+ }
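The `id2label`/`label2id` maps above are what size the model's 104-way decode head. A hedged sketch of how a SegFormer head with this label set is typically attached to the base encoder for fine-tuning (not necessarily the exact script behind this commit):

```python
import json
from transformers import SegformerForSemanticSegmentation

# Assumes the config.json added in this commit is available locally.
with open("config.json") as f:
    cfg = json.load(f)

id2label = {int(k): v for k, v in cfg["id2label"].items()}
label2id = {name: idx for idx, name in id2label.items()}

# Start from the pretrained MiT-B0 encoder; the decode head is newly
# initialised with one output channel per FoodSeg103 label.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id=label2id,
)
```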
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d75ff8e50123b3175acc5343cdfb0454c3301d0a7aa3849023752f2b75ef053f
+ size 14989656
runs/Dec16_12-20-59_ebd935bb5f12/events.out.tfevents.1734351702.ebd935bb5f12.183.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:14cda9b123a4bd02368b68c55fe4c1295281311d6e8458ad5de2442998f377d8
+ size 74690
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b9bf9e25b5a8f8b5a431fa93a87c4c1c518a5e52d77c0b436c6184102165e8dc
+ size 5368