---
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- generated_from_trainer
model-index:
- name: rtdetr-r50-cppe5-finetune
  results: []
---


# rtdetr-r50-cppe5-finetune

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on the CPPE-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 9.9243
- Map: 0.4532
- Map 50: 0.66
- Map 75: 0.5228
- Map Small: 0.431
- Map Medium: 0.3515
- Map Large: 0.5415
- Mar 1: 0.3644
- Mar 10: 0.6286
- Mar 100: 0.6927
- Mar Small: 0.5962
- Mar Medium: 0.5879
- Mar Large: 0.81
- Map Coverall: 0.4755
- Mar 100 Coverall: 0.7974
- Map Face Shield: 0.4919
- Mar 100 Face Shield: 0.7176
- Map Gloves: 0.3847
- Mar 100 Gloves: 0.6593
- Map Goggles: 0.3127
- Mar 100 Goggles: 0.5793
- Map Mask: 0.6013
- Mar 100 Mask: 0.7098

## Model description

RT-DETR is a real-time, transformer-based object detector. This checkpoint starts from the ResNet-50-vd variant pre-trained on COCO and Objects365 and fine-tunes it for medical personal protective equipment (PPE) detection across five classes: coverall, face shield, gloves, goggles, and mask.

## Intended uses & limitations

The model is intended for detecting medical PPE in images. Average precision varies considerably by class (from 0.31 on goggles to 0.60 on masks), so per-class performance should be validated before relying on it in a downstream application.
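A minimal inference sketch using the standard Transformers object-detection API. The checkpoint id below is the *base* model so the snippet runs as-is; swap in the path of this fine-tuned checkpoint once published, and replace the placeholder blank image with a real photo:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Base checkpoint used here only so the snippet is runnable as-is;
# substitute the fine-tuned rtdetr-r50-cppe5-finetune checkpoint path.
checkpoint = "PekingU/rtdetr_r50vd_coco_o365"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Placeholder blank image; in practice load a real photo of people in PPE.
image = Image.new("RGB", (640, 480))
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Rescale detections to the original image size and keep confident ones.
results = processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```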

## Training and evaluation data

The per-class metrics above (coverall, face shield, gloves, goggles, mask) correspond to the five categories of the CPPE-5 medical PPE dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
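For reference, `linear` here is the Transformers linear-decay schedule: assuming the Trainer default of zero warmup steps, the learning rate ramps down from 5e-5 to 0 over the 1,070 total optimization steps (107 steps/epoch × 10 epochs). A minimal sketch of that decay, not the library implementation itself:

```python
def linear_lr(step: int, base_lr: float = 5e-5, total_steps: int = 1070) -> float:
    """Learning rate at `step`, decaying linearly from base_lr to 0."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # start of training: the full base learning rate
print(linear_lr(535))   # halfway (end of epoch 5): half the base rate
print(linear_lr(1070))  # final step: decayed to zero
```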

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 107  | 138.4975        | 0.0441 | 0.0808 | 0.0353 | 0.0       | 0.0247     | 0.056     | 0.0547 | 0.1243 | 0.1461  | 0.0       | 0.082      | 0.2388    | 0.2204       | 0.5937           | 0.0001          | 0.0759              | 0.0        | 0.0201         | 0.0         | 0.02            | 0.0      | 0.0209       |
| No log        | 2.0   | 214  | 23.3748         | 0.0916 | 0.1786 | 0.0747 | 0.0461    | 0.0467     | 0.0912    | 0.1138 | 0.269  | 0.3528  | 0.2138    | 0.2623     | 0.4998    | 0.3271       | 0.6284           | 0.0041          | 0.3076              | 0.0078     | 0.2701         | 0.0034      | 0.2246          | 0.1156   | 0.3333       |
| No log        | 3.0   | 321  | 13.3702         | 0.2057 | 0.3793 | 0.196  | 0.1007    | 0.1548     | 0.3415    | 0.2296 | 0.4115 | 0.4959  | 0.2755    | 0.4268     | 0.7117    | 0.4253       | 0.6986           | 0.0393          | 0.5051              | 0.143      | 0.4183         | 0.1092      | 0.3938          | 0.3119   | 0.4636       |
| No log        | 4.0   | 428  | 12.8750         | 0.2236 | 0.4139 | 0.218  | 0.12      | 0.1699     | 0.4095    | 0.225  | 0.4324 | 0.5089  | 0.296     | 0.4525     | 0.7051    | 0.3626       | 0.6342           | 0.0964          | 0.5253              | 0.117      | 0.3996         | 0.2042      | 0.4631          | 0.3377   | 0.5222       |
| 90.5185       | 5.0   | 535  | 11.9853         | 0.2701 | 0.4731 | 0.2752 | 0.192     | 0.1984     | 0.475     | 0.2573 | 0.4629 | 0.5406  | 0.357     | 0.4739     | 0.7304    | 0.4639       | 0.6973           | 0.1397          | 0.5443              | 0.2001     | 0.5134         | 0.2089      | 0.4354          | 0.3381   | 0.5124       |
| 90.5185       | 6.0   | 642  | 12.6566         | 0.2422 | 0.4501 | 0.2296 | 0.2014    | 0.1863     | 0.425     | 0.2339 | 0.4469 | 0.5379  | 0.3612    | 0.4893     | 0.7289    | 0.3361       | 0.5752           | 0.1231          | 0.5329              | 0.1813     | 0.5272         | 0.2314      | 0.5108          | 0.3393   | 0.5436       |
| 90.5185       | 7.0   | 749  | 12.7385         | 0.2411 | 0.432  | 0.2334 | 0.1769    | 0.1784     | 0.442     | 0.2291 | 0.4407 | 0.5321  | 0.3208    | 0.4863     | 0.7248    | 0.3662       | 0.6527           | 0.115           | 0.5114              | 0.1671     | 0.4969         | 0.2244      | 0.4677          | 0.3328   | 0.532        |
| 90.5185       | 8.0   | 856  | 12.8410         | 0.2614 | 0.4702 | 0.2516 | 0.1796    | 0.1916     | 0.4767    | 0.2389 | 0.451  | 0.5373  | 0.3511    | 0.4776     | 0.7404    | 0.3826       | 0.6739           | 0.1451          | 0.5456              | 0.2148     | 0.5022         | 0.2567      | 0.4646          | 0.3078   | 0.5          |
| 90.5185       | 9.0   | 963  | 13.1283         | 0.1857 | 0.3361 | 0.1772 | 0.1922    | 0.1448     | 0.3403    | 0.2197 | 0.4346 | 0.5488  | 0.368     | 0.5015     | 0.7352    | 0.2542       | 0.6599           | 0.0948          | 0.5392              | 0.0841     | 0.5022         | 0.211       | 0.5062          | 0.2846   | 0.5364       |
| 13.6999       | 10.0  | 1070 | 12.8353         | 0.2457 | 0.4365 | 0.2273 | 0.1837    | 0.1881     | 0.4385    | 0.2388 | 0.4518 | 0.5494  | 0.3529    | 0.4936     | 0.7493    | 0.3722       | 0.6748           | 0.1472          | 0.5671              | 0.1703     | 0.496          | 0.2429      | 0.4831          | 0.296    | 0.5262       |
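The `Map 50` and `Map 75` columns are COCO-style mean average precision at IoU thresholds of 0.50 and 0.75, while `Map` averages over thresholds from 0.50 to 0.95. The intersection-over-union match criterion at the heart of these metrics can be sketched as follows (a minimal illustration, not the evaluation code used for the table above):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A predicted box counts as a true positive at "Map 50" if IoU >= 0.50,
# but only at the stricter "Map 75" if IoU >= 0.75.
iou = box_iou((0, 0, 100, 100), (25, 0, 125, 100))
print(iou)  # 0.6: a hit at the 0.50 threshold, a miss at 0.75
```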


### Framework versions

- Transformers 4.42.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1