swin-tiny

This model (jialicheng/cifar100-swin-tiny) is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the cifar100 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5505
  • Accuracy: 0.8646
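
For a quick smoke test, the checkpoint can be loaded with the transformers Auto classes. A minimal sketch, assuming the repo id jialicheng/cifar100-swin-tiny under which this model is published and an arbitrary local image:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jialicheng/cifar100-swin-tiny"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.png").convert("RGB")        # any RGB image
inputs = processor(images=image, return_tensors="pt")   # resize + normalize to 224x224

with torch.no_grad():
    logits = model(**inputs).logits                     # shape (1, 100)

pred = logits.argmax(-1).item()
# Prints the class name if the checkpoint stores label names,
# otherwise a generic LABEL_<i> placeholder.
print(model.config.id2label[pred])
```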

Model description

Swin-Tiny is a hierarchical Vision Transformer that computes self-attention within local shifted windows, which keeps computation roughly linear in image size. This checkpoint takes the ImageNet-1k-pre-trained microsoft/swin-tiny-patch4-window7-224 backbone (~28M parameters) and replaces its 1000-class head with a 100-way classification head for CIFAR-100.

Intended uses & limitations

The model is intended for image classification over the 100 fine-grained CIFAR-100 classes. Inputs are resized to the backbone's 224x224 resolution before inference. Results are only reported on CIFAR-100; behavior on other domains, resolutions, or label sets has not been evaluated.

Training and evaluation data

CIFAR-100 contains 60,000 32x32 color images across 100 fine-grained classes, split into 50,000 training and 10,000 test images. The card does not document the exact split behind the per-epoch evaluations reported below.
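
The dataset itself is straightforward to pull with the datasets library; a minimal sketch (img, fine_label, and coarse_label are the column names of the Hub's cifar100 dataset):

```python
from datasets import load_dataset

ds = load_dataset("cifar100")  # 50,000 train / 10,000 test images
print(ds)                      # splits, column names, sizes
print(ds["train"].features["fine_label"].names[:5])  # first few of the 100 class names
```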

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 128
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 300
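
A reproduction sketch under these settings, not the original training script: the Trainer's default AdamW optimizer already uses betas=(0.9, 0.999) and epsilon=1e-08, the rest is wired in explicitly, and the preprocessing (resizing CIFAR-100's 32x32 images to Swin's 224x224 input) is an assumption, since the card does not document it:

```python
# Reproduction sketch, not the original script. Assumes transformers 4.39 /
# datasets 2.18 as pinned under "Framework versions" below.
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

checkpoint = "microsoft/swin-tiny-patch4-window7-224"
processor = AutoImageProcessor.from_pretrained(checkpoint)

ds = load_dataset("cifar100")

def transform(batch):
    # Resize/normalize the 32x32 CIFAR images to Swin's 224x224 input.
    out = processor(images=[img.convert("RGB") for img in batch["img"]],
                    return_tensors="pt")
    out["labels"] = batch["fine_label"]
    return out

ds = ds.with_transform(transform)

model = AutoModelForImageClassification.from_pretrained(
    checkpoint,
    num_labels=100,                   # CIFAR-100 fine labels
    ignore_mismatched_sizes=True,     # swap out the 1000-class ImageNet head
)

args = TrainingArguments(
    output_dir="swin-tiny",
    learning_rate=1e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=256,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=300,
    evaluation_strategy="epoch",      # matches the per-epoch table below
    remove_unused_columns=False,      # keep "img"/"fine_label" for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds["train"],
    eval_dataset=ds["test"],          # assumption: the actual eval split is undocumented
)
trainer.train()
```

Note that with 50,000 training images and batch size 128 this sketch gives 391 steps per epoch, whereas the table below shows 333 (about 42,600 images), so the actual run likely held out part of the training split for validation.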

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
2.8188 1.0 333 2.4232 0.4372
2.0411 2.0 666 1.4235 0.6269
1.7069 3.0 999 1.0558 0.7102
1.5722 4.0 1332 0.8657 0.7504
1.346 5.0 1665 0.7774 0.7721
1.303 6.0 1998 0.7138 0.7874
1.2045 7.0 2331 0.6616 0.7986
1.2482 8.0 2664 0.6210 0.8128
1.1202 9.0 2997 0.5925 0.8185
1.0021 10.0 3330 0.5728 0.8235
1.0662 11.0 3663 0.5637 0.829
1.0263 12.0 3996 0.5442 0.8303
1.0581 13.0 4329 0.5319 0.8379
0.9922 14.0 4662 0.5215 0.8388
0.9643 15.0 4995 0.5144 0.8399
0.9687 16.0 5328 0.5103 0.8413
0.9464 17.0 5661 0.5021 0.8422
0.8651 18.0 5994 0.4867 0.8483
0.8122 19.0 6327 0.4865 0.8457
0.7918 20.0 6660 0.4877 0.8486
0.8994 21.0 6993 0.4836 0.8502
0.8661 22.0 7326 0.4736 0.8538
0.869 23.0 7659 0.4703 0.8528
0.8681 24.0 7992 0.4798 0.8513
0.7693 25.0 8325 0.4680 0.8523
0.8693 26.0 8658 0.4646 0.8579
0.8041 27.0 8991 0.4686 0.8555
0.8036 28.0 9324 0.4609 0.8578
0.7571 29.0 9657 0.4597 0.8616
0.7666 30.0 9990 0.4581 0.8606
0.7226 31.0 10323 0.4569 0.8601
0.7179 32.0 10656 0.4573 0.8628
0.6866 33.0 10989 0.4567 0.8606
0.7002 34.0 11322 0.4672 0.8576
0.7499 35.0 11655 0.4624 0.8611
0.7393 36.0 11988 0.4579 0.8604
0.7393 37.0 12321 0.4560 0.8619
0.7599 38.0 12654 0.4503 0.8637
0.6636 39.0 12987 0.4542 0.8636
0.6759 40.0 13320 0.4483 0.8631
0.7266 41.0 13653 0.4484 0.8636
0.6819 42.0 13986 0.4453 0.8647
0.5912 43.0 14319 0.4493 0.864
0.6803 44.0 14652 0.4453 0.8646
0.6898 45.0 14985 0.4458 0.8628
0.6312 46.0 15318 0.4499 0.8636
0.6972 47.0 15651 0.4494 0.8646
0.616 48.0 15984 0.4525 0.8674
0.6911 49.0 16317 0.4506 0.8637
0.6737 50.0 16650 0.4504 0.8648
0.5573 51.0 16983 0.4542 0.8641
0.6296 52.0 17316 0.4573 0.8626
0.6245 53.0 17649 0.4550 0.8647
0.6018 54.0 17982 0.4509 0.8668
0.6068 55.0 18315 0.4561 0.865
0.6368 56.0 18648 0.4533 0.8666
0.5945 57.0 18981 0.4537 0.8646
0.5379 58.0 19314 0.4583 0.8644
0.6031 59.0 19647 0.4574 0.8647
0.5445 60.0 19980 0.4607 0.8629
0.5589 61.0 20313 0.4619 0.8649
0.5777 62.0 20646 0.4740 0.8626
0.5711 63.0 20979 0.4684 0.8659
0.5369 64.0 21312 0.4655 0.8639
0.5454 65.0 21645 0.4574 0.867
0.5471 66.0 21978 0.4579 0.8655
0.5816 67.0 22311 0.4610 0.8662
0.5262 68.0 22644 0.4631 0.8646
0.5163 69.0 22977 0.4532 0.8677
0.5231 70.0 23310 0.4635 0.867
0.5672 71.0 23643 0.4626 0.8668
0.501 72.0 23976 0.4601 0.8677
0.527 73.0 24309 0.4661 0.8644
0.5618 74.0 24642 0.4677 0.8664
0.5161 75.0 24975 0.4630 0.8691
0.5158 76.0 25308 0.4691 0.8671
0.54 77.0 25641 0.4645 0.8696
0.5352 78.0 25974 0.4805 0.8649
0.5433 79.0 26307 0.4696 0.867
0.5555 80.0 26640 0.4745 0.8657
0.5248 81.0 26973 0.4767 0.8655
0.4648 82.0 27306 0.4730 0.8681
0.5853 83.0 27639 0.4781 0.8656
0.5298 84.0 27972 0.4729 0.869
0.4484 85.0 28305 0.4741 0.869
0.4765 86.0 28638 0.4877 0.8633
0.5409 87.0 28971 0.4807 0.8664
0.4778 88.0 29304 0.4753 0.8677
0.508 89.0 29637 0.4750 0.867
0.4567 90.0 29970 0.4816 0.8681
0.4828 91.0 30303 0.4806 0.8659
0.4357 92.0 30636 0.4770 0.8676
0.5117 93.0 30969 0.4741 0.8714
0.4756 94.0 31302 0.4860 0.8639
0.4575 95.0 31635 0.4855 0.8652
0.4657 96.0 31968 0.4828 0.8677
0.4746 97.0 32301 0.4850 0.8676
0.5466 98.0 32634 0.4890 0.8662
0.49 99.0 32967 0.4879 0.8663
0.4886 100.0 33300 0.4859 0.869
0.4763 101.0 33633 0.4840 0.868
0.5143 102.0 33966 0.4940 0.8673
0.4732 103.0 34299 0.4827 0.8699
0.481 104.0 34632 0.4891 0.8686
0.5015 105.0 34965 0.5004 0.8651
0.4596 106.0 35298 0.4950 0.8669
0.4201 107.0 35631 0.4920 0.866
0.4358 108.0 35964 0.4954 0.8643
0.4588 109.0 36297 0.4923 0.8649
0.4681 110.0 36630 0.4948 0.8654
0.4602 111.0 36963 0.4961 0.8677
0.4871 112.0 37296 0.5005 0.8634
0.4144 113.0 37629 0.4988 0.8657
0.4735 114.0 37962 0.4976 0.8654
0.4621 115.0 38295 0.4937 0.867
0.467 116.0 38628 0.4961 0.8671
0.4328 117.0 38961 0.4987 0.8662
0.3697 118.0 39294 0.4968 0.8667
0.4668 119.0 39627 0.5020 0.8655
0.4095 120.0 39960 0.4992 0.8674
0.4019 121.0 40293 0.5088 0.864
0.3627 122.0 40626 0.5078 0.8658
0.3875 123.0 40959 0.5079 0.8656
0.4696 124.0 41292 0.5006 0.8653
0.4071 125.0 41625 0.5089 0.8638
0.4485 126.0 41958 0.5067 0.8636
0.4565 127.0 42291 0.5060 0.8644
0.4633 128.0 42624 0.5130 0.8637
0.4259 129.0 42957 0.5053 0.867
0.4668 130.0 43290 0.5131 0.8647
0.4916 131.0 43623 0.5055 0.8656
0.4068 132.0 43956 0.5117 0.8669
0.4187 133.0 44289 0.5151 0.8639
0.4197 134.0 44622 0.5068 0.8685
0.3916 135.0 44955 0.5028 0.8684
0.4084 136.0 45288 0.5097 0.8662
0.405 137.0 45621 0.5061 0.8664
0.3752 138.0 45954 0.5129 0.8656
0.4338 139.0 46287 0.5147 0.8654
0.4865 140.0 46620 0.5171 0.8638
0.4771 141.0 46953 0.5166 0.865
0.4824 142.0 47286 0.5209 0.8645
0.4026 143.0 47619 0.5266 0.8649
0.436 144.0 47952 0.5200 0.8658
0.3487 145.0 48285 0.5185 0.8644
0.3615 146.0 48618 0.5202 0.8638
0.4317 147.0 48951 0.5198 0.8658
0.4171 148.0 49284 0.5236 0.8648
0.3833 149.0 49617 0.5228 0.865
0.3934 150.0 49950 0.5223 0.8649
0.4226 151.0 50283 0.5184 0.8657
0.4112 152.0 50616 0.5187 0.8644
0.4202 153.0 50949 0.5191 0.8648
0.4026 154.0 51282 0.5165 0.8669
0.4322 155.0 51615 0.5215 0.8649
0.3763 156.0 51948 0.5235 0.8659
0.4191 157.0 52281 0.5213 0.866
0.3864 158.0 52614 0.5225 0.8662
0.3974 159.0 52947 0.5248 0.8653
0.355 160.0 53280 0.5265 0.8626
0.3511 161.0 53613 0.5227 0.8665
0.3945 162.0 53946 0.5201 0.8662
0.3869 163.0 54279 0.5280 0.8633
0.4148 164.0 54612 0.5258 0.8649
0.3829 165.0 54945 0.5282 0.8652
0.3415 166.0 55278 0.5249 0.8654
0.3599 167.0 55611 0.5252 0.8648
0.3705 168.0 55944 0.5301 0.8645
0.4122 169.0 56277 0.5358 0.8636
0.3473 170.0 56610 0.5298 0.8646
0.3825 171.0 56943 0.5256 0.8643
0.3841 172.0 57276 0.5229 0.8668
0.3543 173.0 57609 0.5270 0.8646
0.4086 174.0 57942 0.5240 0.8656
0.3832 175.0 58275 0.5280 0.8631
0.3515 176.0 58608 0.5302 0.8645
0.3749 177.0 58941 0.5316 0.8645
0.3298 178.0 59274 0.5290 0.8647
0.3758 179.0 59607 0.5272 0.8668
0.31 180.0 59940 0.5314 0.864
0.3521 181.0 60273 0.5259 0.8648
0.3922 182.0 60606 0.5316 0.8638
0.3391 183.0 60939 0.5316 0.8648
0.3646 184.0 61272 0.5329 0.8637
0.4033 185.0 61605 0.5357 0.8662
0.395 186.0 61938 0.5376 0.8634
0.3253 187.0 62271 0.5346 0.8647
0.416 188.0 62604 0.5357 0.8621
0.3494 189.0 62937 0.5332 0.864
0.4009 190.0 63270 0.5364 0.8639
0.3935 191.0 63603 0.5329 0.8668
0.3666 192.0 63936 0.5337 0.8641
0.3474 193.0 64269 0.5321 0.866
0.3873 194.0 64602 0.5336 0.8635
0.3722 195.0 64935 0.5319 0.8645
0.3525 196.0 65268 0.5347 0.8636
0.3561 197.0 65601 0.5407 0.8629
0.3946 198.0 65934 0.5361 0.8643
0.3768 199.0 66267 0.5387 0.8639
0.3328 200.0 66600 0.5325 0.8656
0.3418 201.0 66933 0.5306 0.8676
0.3542 202.0 67266 0.5321 0.8648
0.3688 203.0 67599 0.5430 0.8598
0.3685 204.0 67932 0.5405 0.8629
0.3252 205.0 68265 0.5411 0.8628
0.358 206.0 68598 0.5403 0.8621
0.3086 207.0 68931 0.5399 0.8626
0.3774 208.0 69264 0.5390 0.8628
0.3449 209.0 69597 0.5388 0.865
0.3268 210.0 69930 0.5363 0.8645
0.3549 211.0 70263 0.5437 0.8634
0.3296 212.0 70596 0.5486 0.8627
0.3461 213.0 70929 0.5414 0.8638
0.3292 214.0 71262 0.5445 0.864
0.3622 215.0 71595 0.5438 0.8626
0.3724 216.0 71928 0.5359 0.8665
0.3352 217.0 72261 0.5410 0.8658
0.3484 218.0 72594 0.5407 0.8638
0.3109 219.0 72927 0.5404 0.8653
0.3703 220.0 73260 0.5471 0.8641
0.3318 221.0 73593 0.5432 0.8638
0.3573 222.0 73926 0.5473 0.8631
0.3308 223.0 74259 0.5448 0.8663
0.3329 224.0 74592 0.5445 0.8635
0.3429 225.0 74925 0.5445 0.8631
0.3494 226.0 75258 0.5433 0.8632
0.327 227.0 75591 0.5457 0.8639
0.313 228.0 75924 0.5457 0.8651
0.3344 229.0 76257 0.5421 0.8649
0.2893 230.0 76590 0.5472 0.8645
0.3225 231.0 76923 0.5436 0.8651
0.3662 232.0 77256 0.5428 0.8654
0.3281 233.0 77589 0.5453 0.8654
0.3354 234.0 77922 0.5468 0.8648
0.3238 235.0 78255 0.5501 0.8638
0.292 236.0 78588 0.5419 0.8658
0.3863 237.0 78921 0.5445 0.8637
0.3368 238.0 79254 0.5451 0.8643
0.3011 239.0 79587 0.5459 0.8651
0.2977 240.0 79920 0.5476 0.8651
0.3695 241.0 80253 0.5412 0.8649
0.3683 242.0 80586 0.5449 0.865
0.2971 243.0 80919 0.5490 0.8658
0.3532 244.0 81252 0.5449 0.8666
0.3014 245.0 81585 0.5448 0.8657
0.4048 246.0 81918 0.5468 0.8654
0.3279 247.0 82251 0.5501 0.8661
0.3494 248.0 82584 0.5498 0.8661
0.3543 249.0 82917 0.5513 0.8652
0.3142 250.0 83250 0.5506 0.8645
0.3534 251.0 83583 0.5471 0.8657
0.3703 252.0 83916 0.5475 0.866
0.3331 253.0 84249 0.5462 0.8662
0.3349 254.0 84582 0.5467 0.8649
0.3737 255.0 84915 0.5496 0.8635
0.394 256.0 85248 0.5489 0.8643
0.3394 257.0 85581 0.5514 0.8649
0.2963 258.0 85914 0.5478 0.865
0.3298 259.0 86247 0.5483 0.8639
0.4112 260.0 86580 0.5491 0.8637
0.3627 261.0 86913 0.5482 0.8652
0.2939 262.0 87246 0.5480 0.8649
0.2827 263.0 87579 0.5501 0.8648
0.3001 264.0 87912 0.5491 0.8659
0.3106 265.0 88245 0.5486 0.8662
0.3416 266.0 88578 0.5509 0.8641
0.3277 267.0 88911 0.5526 0.8643
0.3304 268.0 89244 0.5483 0.8645
0.2967 269.0 89577 0.5485 0.8642
0.2956 270.0 89910 0.5506 0.8638
0.335 271.0 90243 0.5497 0.8636
0.3032 272.0 90576 0.5494 0.8642
0.2757 273.0 90909 0.5511 0.8635
0.372 274.0 91242 0.5519 0.8633
0.3262 275.0 91575 0.5502 0.8641
0.2771 276.0 91908 0.5511 0.8629
0.2907 277.0 92241 0.5512 0.8627
0.3239 278.0 92574 0.5513 0.8633
0.2725 279.0 92907 0.5492 0.8638
0.3243 280.0 93240 0.5505 0.8639
0.3407 281.0 93573 0.5508 0.8636
0.3188 282.0 93906 0.5515 0.8644
0.3627 283.0 94239 0.5528 0.8639
0.2879 284.0 94572 0.5518 0.8645
0.288 285.0 94905 0.5524 0.8633
0.3104 286.0 95238 0.5516 0.8645
0.2963 287.0 95571 0.5509 0.8647
0.348 288.0 95904 0.5511 0.8647
0.3232 289.0 96237 0.5504 0.8642
0.286 290.0 96570 0.5499 0.8644
0.2953 291.0 96903 0.5504 0.8638
0.3371 292.0 97236 0.5512 0.8645
0.3631 293.0 97569 0.5504 0.8646
0.3439 294.0 97902 0.5496 0.8648
0.3277 295.0 98235 0.5495 0.8648
0.3115 296.0 98568 0.5503 0.8648
0.3103 297.0 98901 0.5504 0.8643
0.2616 298.0 99234 0.5508 0.8648
0.3072 299.0 99567 0.5505 0.8646
0.3333 300.0 99900 0.5505 0.8646
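
To sanity-check the final reported accuracy (0.8646), the checkpoint can be scored against CIFAR-100 directly. A hedged sketch, assuming the published repo id and that the checkpoint's class indices follow the dataset's fine_label order:

```python
import torch
from datasets import load_dataset
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jialicheng/cifar100-swin-tiny"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id).eval()

# Small slice for a quick check; use split="test" to score all 10,000 images.
test = load_dataset("cifar100", split="test[:256]")

correct = 0
for ex in test:
    inputs = processor(images=ex["img"].convert("RGB"), return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(-1).item()
    # Assumes checkpoint class indices match CIFAR-100 fine_label order.
    correct += int(pred == ex["fine_label"])

print(f"accuracy: {correct / len(test):.4f}")
```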

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.2+cu118
  • Datasets 2.18.0
  • Tokenizers 0.15.2