# detr_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on a fruit object-detection dataset with the classes banana, orange, and apple (the dataset name is not recorded in this card). Despite the repository name, the base architecture is YOLOS rather than DETR. It achieves the following results on the evaluation set:
- Loss: 0.8626
- mAP: 0.5447
- mAP@0.50: 0.8282
- mAP@0.75: 0.5821
- mAP (small): -1.0
- mAP (medium): 0.4675
- mAP (large): 0.5734
- mAR@1: 0.4327
- mAR@10: 0.7017
- mAR@100: 0.7589
- mAR (small): -1.0
- mAR (medium): 0.6514
- mAR (large): 0.7795
- mAP (banana): 0.4399
- mAR@100 (banana): 0.72
- mAP (orange): 0.541
- mAR@100 (orange): 0.7738
- mAP (apple): 0.6532
- mAR@100 (apple): 0.7829
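The -1.0 values for the small-object buckets follow the COCO convention of reporting -1 when a size bucket contains no ground-truth objects, which suggests the evaluation set has no small objects. The metric names here and in the per-epoch table below match the output of torchmetrics' `MeanAveragePrecision`; the sketch below is a minimal illustration of how such numbers are typically produced, not this card's actual evaluation code:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True yields the per-class AP/AR (banana, orange, apple) shown above
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Toy prediction/target pair; real use feeds the model's post-processed outputs
preds = [{
    "boxes": torch.tensor([[25.0, 16.0, 220.0, 175.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),  # e.g. 0 = banana (label mapping assumed)
}]
targets = [{
    "boxes": torch.tensor([[28.0, 14.0, 215.0, 180.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
# Keys mirror the card: map, map_50, map_75, map_small, map_medium, map_large,
# mar_1, mar_10, mar_100, ... (-1 when a size bucket has no ground-truth boxes)
print({k: v for k, v in results.items() if not k.endswith("per_class")})
```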
## Model description
More information needed
## Intended uses & limitations
More information needed
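In the absence of documented usage, the checkpoint can presumably be loaded through the standard `transformers` object-detection API. A minimal inference sketch, assuming the model is published as `joheras/detr_finetuned_fruits` and using a hypothetical input image `fruit.jpg`:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joheras/detr_finetuned_fruits"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("fruit.jpg")  # hypothetical example image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to (score, label, box) above a confidence threshold
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```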
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
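These settings correspond to the following `transformers` `TrainingArguments`. This is a reconstruction for reference, not the author's actual training script; `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_fruits",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-8 by default
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```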
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@0.50 | mAP@0.75 | mAP (S) | mAP (M) | mAP (L) | mAR@1 | mAR@10 | mAR@100 | mAR (S) | mAR (M) | mAR (L) | mAP Banana | mAR@100 Banana | mAP Orange | mAR@100 Orange | mAP Apple | mAR@100 Apple
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
No log | 1.0 | 60 | 2.2905 | 0.008 | 0.0222 | 0.006 | -1.0 | 0.0061 | 0.012 | 0.0871 | 0.1937 | 0.303 | -1.0 | 0.2429 | 0.3256 | 0.0066 | 0.15 | 0.0042 | 0.4048 | 0.0133 | 0.3543 |
No log | 2.0 | 120 | 1.9265 | 0.0202 | 0.0629 | 0.0071 | -1.0 | 0.119 | 0.023 | 0.091 | 0.236 | 0.396 | -1.0 | 0.3243 | 0.4106 | 0.0238 | 0.415 | 0.0256 | 0.4071 | 0.0111 | 0.3657 |
No log | 3.0 | 180 | 1.8221 | 0.0309 | 0.0731 | 0.0214 | -1.0 | 0.082 | 0.035 | 0.0877 | 0.241 | 0.4251 | -1.0 | 0.4071 | 0.4275 | 0.0504 | 0.49 | 0.0302 | 0.4738 | 0.0121 | 0.3114 |
No log | 4.0 | 240 | 1.7172 | 0.0253 | 0.0655 | 0.0111 | -1.0 | 0.0988 | 0.0251 | 0.1424 | 0.258 | 0.4915 | -1.0 | 0.4371 | 0.502 | 0.0303 | 0.5225 | 0.0273 | 0.4548 | 0.0183 | 0.4971 |
No log | 5.0 | 300 | 1.5541 | 0.0472 | 0.1085 | 0.0305 | -1.0 | 0.0639 | 0.0526 | 0.1869 | 0.3652 | 0.5653 | -1.0 | 0.4014 | 0.5933 | 0.0326 | 0.535 | 0.0777 | 0.6095 | 0.0313 | 0.5514 |
No log | 6.0 | 360 | 1.5159 | 0.0501 | 0.1145 | 0.0436 | -1.0 | 0.0694 | 0.0556 | 0.2009 | 0.3976 | 0.5542 | -1.0 | 0.38 | 0.5799 | 0.0659 | 0.5725 | 0.0527 | 0.5071 | 0.0318 | 0.5829 |
No log | 7.0 | 420 | 1.4185 | 0.0775 | 0.1777 | 0.0662 | -1.0 | 0.2007 | 0.0751 | 0.2078 | 0.4237 | 0.5944 | -1.0 | 0.5071 | 0.6137 | 0.0647 | 0.585 | 0.1071 | 0.5952 | 0.0608 | 0.6029 |
No log | 8.0 | 480 | 1.2902 | 0.0965 | 0.189 | 0.077 | -1.0 | 0.1555 | 0.1161 | 0.2715 | 0.4469 | 0.64 | -1.0 | 0.5186 | 0.66 | 0.0726 | 0.62 | 0.1498 | 0.6286 | 0.0673 | 0.6714 |
1.5459 | 9.0 | 540 | 1.2497 | 0.1052 | 0.2137 | 0.1115 | -1.0 | 0.2298 | 0.1295 | 0.294 | 0.4625 | 0.6662 | -1.0 | 0.4914 | 0.6987 | 0.0749 | 0.6025 | 0.1614 | 0.6905 | 0.0794 | 0.7057 |
1.5459 | 10.0 | 600 | 1.0677 | 0.141 | 0.2485 | 0.1427 | -1.0 | 0.2822 | 0.1552 | 0.3656 | 0.5481 | 0.7142 | -1.0 | 0.6257 | 0.7329 | 0.0819 | 0.6475 | 0.2168 | 0.7238 | 0.1242 | 0.7714 |
1.5459 | 11.0 | 660 | 1.0572 | 0.1813 | 0.3134 | 0.1988 | -1.0 | 0.2859 | 0.2008 | 0.3533 | 0.5777 | 0.7017 | -1.0 | 0.5886 | 0.72 | 0.1098 | 0.665 | 0.2983 | 0.7143 | 0.136 | 0.7257 |
1.5459 | 12.0 | 720 | 1.0403 | 0.247 | 0.4247 | 0.2529 | -1.0 | 0.3598 | 0.2663 | 0.348 | 0.5748 | 0.7021 | -1.0 | 0.6286 | 0.7157 | 0.1359 | 0.67 | 0.3934 | 0.7333 | 0.2115 | 0.7029 |
1.5459 | 13.0 | 780 | 0.9933 | 0.3205 | 0.5352 | 0.3708 | -1.0 | 0.3999 | 0.3373 | 0.3908 | 0.6208 | 0.7248 | -1.0 | 0.6086 | 0.7447 | 0.1991 | 0.68 | 0.3998 | 0.7429 | 0.3626 | 0.7514 |
1.5459 | 14.0 | 840 | 1.0158 | 0.3865 | 0.6502 | 0.4208 | -1.0 | 0.3726 | 0.4172 | 0.3843 | 0.6447 | 0.7184 | -1.0 | 0.5557 | 0.7445 | 0.2549 | 0.6875 | 0.4506 | 0.7333 | 0.454 | 0.7343 |
1.5459 | 15.0 | 900 | 0.9649 | 0.4519 | 0.6973 | 0.4866 | -1.0 | 0.4641 | 0.4712 | 0.395 | 0.6727 | 0.7373 | -1.0 | 0.6357 | 0.7575 | 0.2713 | 0.67 | 0.5052 | 0.7619 | 0.5792 | 0.78 |
1.5459 | 16.0 | 960 | 0.9148 | 0.491 | 0.7552 | 0.5358 | -1.0 | 0.4674 | 0.5169 | 0.4167 | 0.6903 | 0.7571 | -1.0 | 0.6686 | 0.7776 | 0.3438 | 0.69 | 0.5616 | 0.7786 | 0.5676 | 0.8029 |
0.864 | 17.0 | 1020 | 0.8861 | 0.5232 | 0.7871 | 0.571 | -1.0 | 0.5199 | 0.5463 | 0.4387 | 0.6948 | 0.7541 | -1.0 | 0.68 | 0.771 | 0.4007 | 0.7 | 0.5659 | 0.7595 | 0.6029 | 0.8029 |
0.864 | 18.0 | 1080 | 0.8914 | 0.5014 | 0.7661 | 0.5433 | -1.0 | 0.4449 | 0.5276 | 0.4245 | 0.6954 | 0.7655 | -1.0 | 0.6286 | 0.79 | 0.4006 | 0.715 | 0.4992 | 0.7643 | 0.6043 | 0.8171 |
0.864 | 19.0 | 1140 | 0.8886 | 0.5223 | 0.7763 | 0.5611 | -1.0 | 0.4595 | 0.5492 | 0.4201 | 0.6893 | 0.7473 | -1.0 | 0.6143 | 0.7716 | 0.4002 | 0.69 | 0.5387 | 0.769 | 0.6279 | 0.7829 |
0.864 | 20.0 | 1200 | 0.8973 | 0.5239 | 0.8057 | 0.5726 | -1.0 | 0.4437 | 0.5531 | 0.4317 | 0.6917 | 0.7535 | -1.0 | 0.6371 | 0.7758 | 0.4343 | 0.7125 | 0.5406 | 0.7738 | 0.5966 | 0.7743 |
0.864 | 21.0 | 1260 | 0.8740 | 0.5355 | 0.8126 | 0.5889 | -1.0 | 0.4869 | 0.5605 | 0.4162 | 0.7055 | 0.7633 | -1.0 | 0.6314 | 0.7856 | 0.4039 | 0.7375 | 0.5735 | 0.7667 | 0.6292 | 0.7857 |
0.864 | 22.0 | 1320 | 0.8917 | 0.5212 | 0.7944 | 0.5517 | -1.0 | 0.4609 | 0.549 | 0.423 | 0.6872 | 0.7421 | -1.0 | 0.61 | 0.7657 | 0.4232 | 0.7 | 0.5315 | 0.769 | 0.609 | 0.7571 |
0.864 | 23.0 | 1380 | 0.8508 | 0.5508 | 0.8362 | 0.6164 | -1.0 | 0.4879 | 0.5786 | 0.4278 | 0.6983 | 0.753 | -1.0 | 0.6614 | 0.7723 | 0.4453 | 0.71 | 0.5576 | 0.769 | 0.6494 | 0.78 |
0.864 | 24.0 | 1440 | 0.8769 | 0.5586 | 0.8358 | 0.6156 | -1.0 | 0.4846 | 0.5886 | 0.4471 | 0.7105 | 0.765 | -1.0 | 0.6586 | 0.787 | 0.4598 | 0.705 | 0.5588 | 0.7786 | 0.6572 | 0.8114 |
0.638 | 25.0 | 1500 | 0.8670 | 0.5394 | 0.8271 | 0.5786 | -1.0 | 0.4681 | 0.5667 | 0.425 | 0.7004 | 0.7563 | -1.0 | 0.6514 | 0.7771 | 0.4333 | 0.7075 | 0.5426 | 0.7786 | 0.6422 | 0.7829 |
0.638 | 26.0 | 1560 | 0.8487 | 0.5557 | 0.8355 | 0.6103 | -1.0 | 0.4903 | 0.5829 | 0.4353 | 0.709 | 0.7612 | -1.0 | 0.6586 | 0.7812 | 0.4483 | 0.715 | 0.559 | 0.7857 | 0.6596 | 0.7829 |
0.638 | 27.0 | 1620 | 0.8585 | 0.5484 | 0.8267 | 0.5888 | -1.0 | 0.4735 | 0.5755 | 0.4318 | 0.7106 | 0.7646 | -1.0 | 0.6586 | 0.7848 | 0.4431 | 0.7225 | 0.5435 | 0.7857 | 0.6587 | 0.7857 |
0.638 | 28.0 | 1680 | 0.8668 | 0.5479 | 0.8262 | 0.5865 | -1.0 | 0.471 | 0.5762 | 0.4318 | 0.7051 | 0.763 | -1.0 | 0.6586 | 0.7831 | 0.4414 | 0.72 | 0.5465 | 0.7833 | 0.6556 | 0.7857 |
0.638 | 29.0 | 1740 | 0.8631 | 0.5459 | 0.8282 | 0.5962 | -1.0 | 0.4737 | 0.5737 | 0.4319 | 0.7011 | 0.7598 | -1.0 | 0.6586 | 0.7795 | 0.4394 | 0.72 | 0.5405 | 0.7738 | 0.6579 | 0.7857 |
0.638 | 30.0 | 1800 | 0.8626 | 0.5447 | 0.8282 | 0.5821 | -1.0 | 0.4675 | 0.5734 | 0.4327 | 0.7017 | 0.7589 | -1.0 | 0.6514 | 0.7795 | 0.4399 | 0.72 | 0.541 | 0.7738 | 0.6532 | 0.7829 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1