---
library_name: transformers
license: apache-2.0
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- object-detection
- vision
- generated_from_trainer
model-index:
- name: rt-detr-finetuned-cppe-5-3k-steps
  results: []
---

# rt-detr-finetuned-cppe-5-3k-steps

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on the cppe-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 9.1012
- Map: 0.2813
- Map 50: 0.5271
- Map 75: 0.2685
- Map Small: 0.0879
- Map Medium: 0.2399
- Map Large: 0.4613
- Mar 1: 0.3061
- Mar 10: 0.4664
- Mar 100: 0.5014
- Mar Small: 0.2985
- Mar Medium: 0.4465
- Mar Large: 0.6698
- Map Coverall: 0.4438
- Mar 100 Coverall: 0.6815
- Map Face Shield: 0.2983
- Mar 100 Face Shield: 0.4924
- Map Gloves: 0.2305
- Mar 100 Gloves: 0.4817
- Map Goggles: 0.1591
- Mar 100 Goggles: 0.3969
- Map Mask: 0.275
- Mar 100 Mask: 0.4547

## Model description

More information needed

## Intended uses & limitations

More information needed
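Detailed usage notes have not been documented yet, but as a minimal inference sketch the checkpoint can be loaded with the standard `transformers` object-detection API. The repository id below is a placeholder assumption; substitute the path where this checkpoint is actually hosted.

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repository id (assumption) -- replace with the real location of this checkpoint.
checkpoint = "<your-namespace>/rt-detr-finetuned-cppe-5-3k-steps"

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Sample image from the COCO validation set, used here purely for illustration.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Keep detections above a confidence threshold and map boxes back to the original image size.
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[(image.height, image.width)]
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} {box.tolist()}")
```

`post_process_object_detection` converts the raw logits and normalized boxes into absolute `(xmin, ymin, xmax, ymax)` coordinates; the label names returned through `model.config.id2label` correspond to the per-category metrics reported above (coverall, face shield, gloves, goggles, mask).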
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30.0
- mixed_precision_training: Native AMP
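For convenience, the hyperparameters above roughly translate into the following `TrainingArguments`. This is only a sketch, not the exact training script; the output directory is an assumption, and per-epoch evaluation is inferred from the results table below.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rt-detr-finetuned-cppe-5-3k-steps",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                 # native AMP mixed-precision training
    eval_strategy="epoch",     # inferred: validation metrics are logged once per epoch
)
```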
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| 41.0803 | 1.0 | 107 | 14.7670 | 0.0685 | 0.1413 | 0.0589 | 0.0211 | 0.0496 | 0.1045 | 0.0968 | 0.2251 | 0.2568 | 0.127 | 0.201 | 0.3699 | 0.2045 | 0.3964 | 0.0514 | 0.3013 | 0.0134 | 0.1777 | 0.0031 | 0.1938 | 0.0701 | 0.2147 |
| 18.7124 | 2.0 | 214 | 11.5242 | 0.1664 | 0.3287 | 0.148 | 0.0451 | 0.1291 | 0.265 | 0.1963 | 0.3451 | 0.3986 | 0.2461 | 0.3493 | 0.5659 | 0.3441 | 0.5757 | 0.1158 | 0.4367 | 0.1193 | 0.3446 | 0.0676 | 0.2754 | 0.1854 | 0.3604 |
| 16.6913 | 3.0 | 321 | 12.2486 | 0.1399 | 0.3041 | 0.1087 | 0.029 | 0.1187 | 0.2072 | 0.1663 | 0.3153 | 0.3776 | 0.1937 | 0.3363 | 0.5251 | 0.3001 | 0.5545 | 0.1189 | 0.443 | 0.0832 | 0.3326 | 0.0431 | 0.2092 | 0.1543 | 0.3484 |
| 15.7025 | 4.0 | 428 | 11.5488 | 0.1819 | 0.3681 | 0.1619 | 0.0474 | 0.1853 | 0.2567 | 0.2206 | 0.351 | 0.3984 | 0.1976 | 0.3882 | 0.4911 | 0.3217 | 0.5752 | 0.1295 | 0.3709 | 0.1665 | 0.3241 | 0.1069 | 0.3292 | 0.1847 | 0.3924 |
| 14.7758 | 5.0 | 535 | 12.9331 | 0.1605 | 0.3126 | 0.1484 | 0.0402 | 0.1512 | 0.2597 | 0.1985 | 0.3135 | 0.3435 | 0.1853 | 0.3037 | 0.4472 | 0.2383 | 0.4365 | 0.14 | 0.3304 | 0.1715 | 0.3402 | 0.0775 | 0.2569 | 0.1754 | 0.3538 |
| 14.4905 | 6.0 | 642 | 10.1206 | 0.219 | 0.4286 | 0.1979 | 0.0502 | 0.1962 | 0.3665 | 0.2516 | 0.4058 | 0.4494 | 0.216 | 0.4008 | 0.5944 | 0.404 | 0.673 | 0.1247 | 0.3975 | 0.2031 | 0.4402 | 0.1247 | 0.2985 | 0.2387 | 0.4378 |
| 13.4909 | 7.0 | 749 | 9.6988 | 0.245 | 0.4741 | 0.2326 | 0.0588 | 0.2417 | 0.3664 | 0.2647 | 0.4201 | 0.4572 | 0.2248 | 0.4316 | 0.5821 | 0.4571 | 0.6806 | 0.1374 | 0.3405 | 0.1979 | 0.4473 | 0.159 | 0.3831 | 0.2736 | 0.4347 |
| 13.2245 | 8.0 | 856 | 9.7353 | 0.2375 | 0.4622 | 0.227 | 0.0528 | 0.2347 | 0.3748 | 0.2658 | 0.4291 | 0.4815 | 0.2621 | 0.4423 | 0.6434 | 0.4127 | 0.6707 | 0.1807 | 0.4696 | 0.2293 | 0.4821 | 0.1382 | 0.3462 | 0.2265 | 0.4391 |
| 12.7879 | 9.0 | 963 | 9.2718 | 0.2604 | 0.4832 | 0.2502 | 0.0654 | 0.2323 | 0.4055 | 0.2818 | 0.4304 | 0.4788 | 0.2339 | 0.4387 | 0.6353 | 0.4797 | 0.6959 | 0.1558 | 0.4329 | 0.2371 | 0.4884 | 0.1658 | 0.3354 | 0.2633 | 0.4413 |
| 12.2499 | 10.0 | 1070 | 9.5461 | 0.2547 | 0.4956 | 0.2404 | 0.0582 | 0.2249 | 0.418 | 0.2725 | 0.443 | 0.4933 | 0.2713 | 0.4359 | 0.6505 | 0.4224 | 0.7059 | 0.2289 | 0.4544 | 0.2279 | 0.4915 | 0.1501 | 0.3862 | 0.2443 | 0.4284 |
| 12.1284 | 11.0 | 1177 | 9.6199 | 0.2611 | 0.5056 | 0.2322 | 0.0731 | 0.2459 | 0.3958 | 0.2646 | 0.4262 | 0.4704 | 0.2817 | 0.4256 | 0.598 | 0.4442 | 0.6568 | 0.211 | 0.4354 | 0.2404 | 0.475 | 0.1468 | 0.3446 | 0.2633 | 0.44 |
| 11.9831 | 12.0 | 1284 | 9.3471 | 0.2556 | 0.5007 | 0.2397 | 0.0889 | 0.2476 | 0.3971 | 0.2822 | 0.4573 | 0.5045 | 0.3054 | 0.4747 | 0.6499 | 0.4483 | 0.6806 | 0.1954 | 0.4911 | 0.2084 | 0.4786 | 0.1867 | 0.4292 | 0.2391 | 0.4431 |
| 11.8266 | 13.0 | 1391 | 9.5850 | 0.2329 | 0.4647 | 0.2113 | 0.0657 | 0.2009 | 0.3955 | 0.2667 | 0.4193 | 0.4586 | 0.2653 | 0.3996 | 0.6083 | 0.4132 | 0.6329 | 0.1537 | 0.4063 | 0.2098 | 0.4768 | 0.1427 | 0.3569 | 0.2451 | 0.42 |
| 11.6433 | 14.0 | 1498 | 9.7106 | 0.2353 | 0.472 | 0.2148 | 0.0697 | 0.2149 | 0.3916 | 0.2711 | 0.4243 | 0.4627 | 0.2665 | 0.4189 | 0.6111 | 0.3916 | 0.6248 | 0.1831 | 0.419 | 0.2157 | 0.4437 | 0.1391 | 0.3785 | 0.2469 | 0.4476 |
| 11.8852 | 15.0 | 1605 | 10.8775 | 0.2088 | 0.4137 | 0.1993 | 0.0564 | 0.1788 | 0.3643 | 0.2388 | 0.3642 | 0.3879 | 0.2372 | 0.3324 | 0.5194 | 0.3625 | 0.5649 | 0.1555 | 0.3304 | 0.1954 | 0.4138 | 0.1242 | 0.2615 | 0.2062 | 0.3689 |
| 11.4842 | 16.0 | 1712 | 9.3761 | 0.2585 | 0.5013 | 0.2454 | 0.0648 | 0.2309 | 0.4104 | 0.2752 | 0.4329 | 0.4671 | 0.2841 | 0.4057 | 0.6237 | 0.4659 | 0.6676 | 0.2002 | 0.4329 | 0.2453 | 0.4701 | 0.1407 | 0.3308 | 0.2402 | 0.4342 |
| 11.1006 | 17.0 | 1819 | 9.2561 | 0.2683 | 0.5134 | 0.2582 | 0.0777 | 0.234 | 0.435 | 0.2884 | 0.4437 | 0.4887 | 0.2774 | 0.4407 | 0.6443 | 0.4587 | 0.6586 | 0.2317 | 0.4519 | 0.2363 | 0.4647 | 0.1587 | 0.4015 | 0.2559 | 0.4667 |
| 10.9366 | 18.0 | 1926 | 9.3039 | 0.2626 | 0.4996 | 0.251 | 0.0669 | 0.2413 | 0.4333 | 0.2889 | 0.4399 | 0.4722 | 0.2662 | 0.4241 | 0.6188 | 0.4581 | 0.6572 | 0.1891 | 0.4076 | 0.2421 | 0.467 | 0.1489 | 0.3692 | 0.2748 | 0.46 |
| 10.7473 | 19.0 | 2033 | 9.4736 | 0.2649 | 0.5138 | 0.2541 | 0.082 | 0.2386 | 0.4461 | 0.2883 | 0.4318 | 0.4722 | 0.2856 | 0.4252 | 0.6165 | 0.4438 | 0.655 | 0.2526 | 0.4519 | 0.222 | 0.4598 | 0.1568 | 0.3492 | 0.2494 | 0.4453 |
| 10.7605 | 20.0 | 2140 | 9.2816 | 0.269 | 0.5104 | 0.2442 | 0.0765 | 0.2403 | 0.4417 | 0.293 | 0.4501 | 0.4914 | 0.2695 | 0.4398 | 0.6531 | 0.4441 | 0.6523 | 0.2429 | 0.4582 | 0.2428 | 0.4951 | 0.1547 | 0.3862 | 0.2606 | 0.4653 |
| 10.7865 | 21.0 | 2247 | 9.3265 | 0.2757 | 0.5368 | 0.2621 | 0.0731 | 0.2484 | 0.4379 | 0.2912 | 0.443 | 0.4793 | 0.2896 | 0.441 | 0.6178 | 0.4589 | 0.6455 | 0.2726 | 0.481 | 0.2327 | 0.4768 | 0.1692 | 0.3354 | 0.245 | 0.4578 |
| 10.5171 | 22.0 | 2354 | 9.5773 | 0.2554 | 0.506 | 0.2458 | 0.0866 | 0.2131 | 0.435 | 0.2866 | 0.4385 | 0.4752 | 0.3181 | 0.4226 | 0.621 | 0.4187 | 0.6405 | 0.2536 | 0.4633 | 0.2051 | 0.4545 | 0.1539 | 0.3738 | 0.2457 | 0.444 |
| 10.7464 | 23.0 | 2461 | 9.4040 | 0.2544 | 0.5064 | 0.2414 | 0.0697 | 0.2089 | 0.4424 | 0.288 | 0.4357 | 0.4732 | 0.2358 | 0.4181 | 0.641 | 0.4118 | 0.6468 | 0.2571 | 0.4709 | 0.1966 | 0.4504 | 0.1574 | 0.36 | 0.2489 | 0.4378 |
| 10.5963 | 24.0 | 2568 | 9.1140 | 0.272 | 0.5293 | 0.2615 | 0.0838 | 0.2326 | 0.4477 | 0.2921 | 0.4473 | 0.4877 | 0.3056 | 0.4231 | 0.65 | 0.4342 | 0.6572 | 0.2733 | 0.4772 | 0.226 | 0.4808 | 0.1649 | 0.3631 | 0.2618 | 0.46 |
| 10.4877 | 25.0 | 2675 | 9.2811 | 0.2738 | 0.5409 | 0.2496 | 0.0811 | 0.2321 | 0.4614 | 0.2978 | 0.4528 | 0.4859 | 0.2937 | 0.4261 | 0.6431 | 0.4298 | 0.6527 | 0.2947 | 0.5025 | 0.2465 | 0.4714 | 0.1532 | 0.3662 | 0.2445 | 0.4364 |
| 10.5136 | 26.0 | 2782 | 9.2285 | 0.2809 | 0.5362 | 0.2521 | 0.0764 | 0.2341 | 0.4694 | 0.3014 | 0.4547 | 0.4955 | 0.2986 | 0.4321 | 0.6618 | 0.4342 | 0.6725 | 0.2799 | 0.4911 | 0.247 | 0.4701 | 0.1726 | 0.3862 | 0.271 | 0.4578 |
| 10.4462 | 27.0 | 2889 | 9.1017 | 0.2803 | 0.5419 | 0.2617 | 0.0914 | 0.2296 | 0.4515 | 0.2973 | 0.4663 | 0.5034 | 0.3121 | 0.4374 | 0.6604 | 0.4503 | 0.6842 | 0.2965 | 0.4899 | 0.2318 | 0.4853 | 0.1656 | 0.4077 | 0.2574 | 0.4498 |
| 10.3325 | 28.0 | 2996 | 9.0687 | 0.2849 | 0.5344 | 0.256 | 0.0947 | 0.2288 | 0.4799 | 0.3076 | 0.4598 | 0.4947 | 0.3034 | 0.4193 | 0.6639 | 0.4589 | 0.6752 | 0.2833 | 0.4709 | 0.2302 | 0.4754 | 0.1863 | 0.3969 | 0.2658 | 0.4551 |
| 10.3327 | 29.0 | 3103 | 9.1673 | 0.2818 | 0.5364 | 0.264 | 0.0932 | 0.2415 | 0.4556 | 0.3083 | 0.4606 | 0.4995 | 0.3186 | 0.432 | 0.659 | 0.4379 | 0.6784 | 0.2937 | 0.4772 | 0.2241 | 0.4647 | 0.18 | 0.4185 | 0.2731 | 0.4587 |
| 10.296 | 30.0 | 3210 | 9.1012 | 0.2813 | 0.5271 | 0.2685 | 0.0879 | 0.2399 | 0.4613 | 0.3061 | 0.4664 | 0.5014 | 0.2985 | 0.4465 | 0.6698 | 0.4438 | 0.6815 | 0.2983 | 0.4924 | 0.2305 | 0.4817 | 0.1591 | 0.3969 | 0.275 | 0.4547 |

### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu118
- Datasets 2.21.0
- Tokenizers 0.20.0
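For reference, the Map/Mar columns above are COCO-style mean average precision and recall (overall, by box size, and per CPPE-5 category). The evaluation tooling actually used for this run is not documented here; the snippet below is only a hedged sketch of computing comparable numbers with `torchmetrics` (an assumed dependency, not listed in the framework versions above), using dummy boxes in `xyxy` format.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True also yields the per-category values (coverall, face shield, ...).
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Dummy prediction and ground truth for a single image, for illustration only.
preds = [
    {
        "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
        "scores": torch.tensor([0.87]),
        "labels": torch.tensor([0]),
    }
]
targets = [
    {
        "boxes": torch.tensor([[12.0, 18.0, 108.0, 225.0]]),
        "labels": torch.tensor([0]),
    }
]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])
```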