swin-base

This model is a fine-tuned version of microsoft/swin-base-patch4-window7-224 on the cifar100 dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.3503
  • Accuracy: 0.9226
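
A minimal inference sketch using the Transformers auto classes (the hub id jialicheng/cifar100-swin-base and the image path are assumptions; any RGB image works, since the processor handles resizing to 224x224):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jialicheng/cifar100-swin-base"  # assumed hub id for this checkpoint
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo).eval()

image = Image.open("example.png").convert("RGB")  # hypothetical path
inputs = processor(images=image, return_tensors="pt")  # resize + normalize to 224x224

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # predicted CIFAR-100 class name
```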

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 128
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 300
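
As a rough guide, these values map onto transformers.TrainingArguments as sketched below; output_dir and evaluation_strategy are assumptions not stated in the card, and Trainer's default optimizer already uses the listed betas and epsilon.

```python
from transformers import TrainingArguments

# Sketch only: values copied from the list above, names from the Trainer API.
training_args = TrainingArguments(
    output_dir="swin-base-cifar100",  # assumption: not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=128,  # assuming single-device training
    per_device_eval_batch_size=256,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=300,
    evaluation_strategy="epoch",  # assumption: per-epoch validation rows below suggest this
)
```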

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
1.4475 1.0 333 0.8145 0.8024
1.1153 2.0 666 0.4367 0.8751
0.9219 3.0 999 0.3546 0.8946
0.8802 4.0 1332 0.3216 0.9027
0.7332 5.0 1665 0.3025 0.9069
0.6719 6.0 1998 0.2940 0.9098
0.6996 7.0 2331 0.2785 0.9141
0.7023 8.0 2664 0.2729 0.9165
0.6376 9.0 2997 0.2657 0.9194
0.571 10.0 3330 0.2617 0.9209
0.6006 11.0 3663 0.2636 0.9215
0.5471 12.0 3996 0.2517 0.9239
0.6324 13.0 4329 0.2520 0.9226
0.56 14.0 4662 0.2530 0.924
0.5821 15.0 4995 0.2569 0.9217
0.5203 16.0 5328 0.2476 0.9278
0.5387 17.0 5661 0.2518 0.9264
0.4921 18.0 5994 0.2475 0.9282
0.413 19.0 6327 0.2516 0.9271
0.4612 20.0 6660 0.2538 0.9242
0.4903 21.0 6993 0.2556 0.9262
0.4953 22.0 7326 0.2501 0.9271
0.4922 23.0 7659 0.2486 0.9277
0.4603 24.0 7992 0.2550 0.9234
0.4405 25.0 8325 0.2476 0.9285
0.4867 26.0 8658 0.2482 0.9295
0.4414 27.0 8991 0.2540 0.9267
0.4574 28.0 9324 0.2494 0.9287
0.4109 29.0 9657 0.2533 0.928
0.4433 30.0 9990 0.2571 0.9258
0.4034 31.0 10323 0.2543 0.9265
0.4203 32.0 10656 0.2587 0.9286
0.3942 33.0 10989 0.2555 0.927
0.3991 34.0 11322 0.2564 0.9271
0.4252 35.0 11655 0.2603 0.925
0.4393 36.0 11988 0.2574 0.9288
0.3949 37.0 12321 0.2638 0.9279
0.4458 38.0 12654 0.2582 0.9281
0.3999 39.0 12987 0.2581 0.928
0.3887 40.0 13320 0.2575 0.9301
0.4101 41.0 13653 0.2620 0.9278
0.3987 42.0 13986 0.2627 0.928
0.3514 43.0 14319 0.2660 0.9287
0.3916 44.0 14652 0.2670 0.9287
0.3798 45.0 14985 0.2696 0.9272
0.3352 46.0 15318 0.2726 0.9264
0.3703 47.0 15651 0.2781 0.926
0.3596 48.0 15984 0.2747 0.9274
0.385 49.0 16317 0.2769 0.9257
0.3716 50.0 16650 0.2769 0.9268
0.3305 51.0 16983 0.2756 0.9257
0.354 52.0 17316 0.2759 0.927
0.3543 53.0 17649 0.2825 0.927
0.3333 54.0 17982 0.2766 0.9277
0.3695 55.0 18315 0.2737 0.9293
0.3945 56.0 18648 0.2760 0.9282
0.3315 57.0 18981 0.2749 0.9288
0.3012 58.0 19314 0.2747 0.9283
0.3261 59.0 19647 0.2812 0.9283
0.2817 60.0 19980 0.2754 0.9285
0.3039 61.0 20313 0.2819 0.929
0.306 62.0 20646 0.2816 0.9293
0.3166 63.0 20979 0.2839 0.9265
0.3471 64.0 21312 0.2799 0.9289
0.2996 65.0 21645 0.2848 0.9259
0.3252 66.0 21978 0.2840 0.9278
0.324 67.0 22311 0.2853 0.9273
0.2911 68.0 22644 0.2839 0.9282
0.2852 69.0 22977 0.2892 0.927
0.3154 70.0 23310 0.2902 0.928
0.3402 71.0 23643 0.2874 0.9294
0.3122 72.0 23976 0.2909 0.9258
0.3276 73.0 24309 0.2856 0.9281
0.291 74.0 24642 0.2895 0.9257
0.2939 75.0 24975 0.2922 0.927
0.2853 76.0 25308 0.2956 0.9245
0.2863 77.0 25641 0.2900 0.9281
0.3417 78.0 25974 0.2902 0.927
0.3053 79.0 26307 0.2925 0.9261
0.2954 80.0 26640 0.2942 0.9257
0.2539 81.0 26973 0.3004 0.9252
0.2728 82.0 27306 0.2943 0.9252
0.3537 83.0 27639 0.3002 0.9243
0.2917 84.0 27972 0.2907 0.9259
0.26 85.0 28305 0.2947 0.9265
0.2604 86.0 28638 0.3004 0.9261
0.3287 87.0 28971 0.3003 0.9266
0.3101 88.0 29304 0.3001 0.9277
0.3029 89.0 29637 0.2987 0.9275
0.2824 90.0 29970 0.3001 0.9266
0.2771 91.0 30303 0.3026 0.9261
0.2428 92.0 30636 0.3052 0.9256
0.2784 93.0 30969 0.3012 0.9273
0.2397 94.0 31302 0.2990 0.9275
0.2789 95.0 31635 0.3009 0.9257
0.3029 96.0 31968 0.3023 0.9257
0.2966 97.0 32301 0.3007 0.9273
0.3114 98.0 32634 0.2945 0.9277
0.2892 99.0 32967 0.3028 0.9257
0.248 100.0 33300 0.2971 0.9284
0.3176 101.0 33633 0.3050 0.9264
0.3074 102.0 33966 0.3066 0.929
0.2901 103.0 34299 0.3037 0.9252
0.3027 104.0 34632 0.3039 0.9245
0.3048 105.0 34965 0.2983 0.9287
0.2573 106.0 35298 0.3004 0.928
0.2739 107.0 35631 0.3039 0.9263
0.2491 108.0 35964 0.3016 0.9277
0.2491 109.0 36297 0.3025 0.9272
0.291 110.0 36630 0.3018 0.9257
0.264 111.0 36963 0.3096 0.9255
0.2931 112.0 37296 0.3093 0.9282
0.2407 113.0 37629 0.3106 0.927
0.2583 114.0 37962 0.3116 0.9252
0.2628 115.0 38295 0.3068 0.9254
0.2806 116.0 38628 0.3114 0.9257
0.2441 117.0 38961 0.3072 0.9251
0.2204 118.0 39294 0.3125 0.9258
0.2819 119.0 39627 0.3178 0.9237
0.2466 120.0 39960 0.3189 0.9248
0.2284 121.0 40293 0.3107 0.9267
0.217 122.0 40626 0.3138 0.9263
0.2405 123.0 40959 0.3088 0.9275
0.2972 124.0 41292 0.3067 0.9255
0.246 125.0 41625 0.3120 0.9252
0.273 126.0 41958 0.3165 0.9247
0.2837 127.0 42291 0.3159 0.925
0.2741 128.0 42624 0.3169 0.9256
0.26 129.0 42957 0.3156 0.9251
0.2739 130.0 43290 0.3166 0.9257
0.3104 131.0 43623 0.3238 0.9237
0.264 132.0 43956 0.3164 0.9257
0.2485 133.0 44289 0.3236 0.9232
0.2637 134.0 44622 0.3232 0.9249
0.2211 135.0 44955 0.3191 0.9256
0.2498 136.0 45288 0.3190 0.9251
0.2331 137.0 45621 0.3226 0.9245
0.2247 138.0 45954 0.3241 0.9228
0.2555 139.0 46287 0.3269 0.9224
0.2255 140.0 46620 0.3229 0.9247
0.2909 141.0 46953 0.3212 0.9256
0.2902 142.0 47286 0.3215 0.9231
0.2384 143.0 47619 0.3296 0.9233
0.2538 144.0 47952 0.3270 0.9255
0.2174 145.0 48285 0.3265 0.9249
0.2274 146.0 48618 0.3302 0.9235
0.2354 147.0 48951 0.3300 0.9254
0.2555 148.0 49284 0.3265 0.9236
0.2389 149.0 49617 0.3329 0.9235
0.2441 150.0 49950 0.3222 0.924
0.2419 151.0 50283 0.3287 0.9235
0.2293 152.0 50616 0.3310 0.9221
0.2432 153.0 50949 0.3293 0.9237
0.2255 154.0 51282 0.3338 0.9257
0.2418 155.0 51615 0.3313 0.924
0.2254 156.0 51948 0.3326 0.9245
0.2549 157.0 52281 0.3365 0.924
0.213 158.0 52614 0.3250 0.9257
0.2178 159.0 52947 0.3259 0.9252
0.2127 160.0 53280 0.3294 0.9241
0.2063 161.0 53613 0.3301 0.9239
0.2401 162.0 53946 0.3306 0.9247
0.2198 163.0 54279 0.3323 0.9226
0.2642 164.0 54612 0.3324 0.9239
0.2477 165.0 54945 0.3314 0.9238
0.1936 166.0 55278 0.3324 0.9233
0.2151 167.0 55611 0.3356 0.9229
0.2049 168.0 55944 0.3355 0.9214
0.2356 169.0 56277 0.3369 0.9226
0.2092 170.0 56610 0.3306 0.923
0.2239 171.0 56943 0.3387 0.9237
0.2157 172.0 57276 0.3347 0.9222
0.1877 173.0 57609 0.3423 0.9224
0.2532 174.0 57942 0.3342 0.9231
0.2306 175.0 58275 0.3364 0.9223
0.2247 176.0 58608 0.3376 0.9219
0.2548 177.0 58941 0.3390 0.9217
0.1797 178.0 59274 0.3433 0.9214
0.235 179.0 59607 0.3388 0.9206
0.1707 180.0 59940 0.3376 0.9209
0.195 181.0 60273 0.3384 0.9224
0.2526 182.0 60606 0.3418 0.9215
0.2041 183.0 60939 0.3373 0.921
0.2251 184.0 61272 0.3409 0.922
0.2562 185.0 61605 0.3356 0.9239
0.2225 186.0 61938 0.3390 0.9228
0.1772 187.0 62271 0.3390 0.9219
0.2343 188.0 62604 0.3419 0.9219
0.2086 189.0 62937 0.3400 0.9222
0.3153 190.0 63270 0.3436 0.9203
0.2632 191.0 63603 0.3436 0.9226
0.2191 192.0 63936 0.3463 0.9218
0.1892 193.0 64269 0.3455 0.9226
0.2246 194.0 64602 0.3454 0.9215
0.2485 195.0 64935 0.3412 0.9224
0.2055 196.0 65268 0.3426 0.9209
0.2087 197.0 65601 0.3456 0.92
0.235 198.0 65934 0.3437 0.9218
0.2093 199.0 66267 0.3425 0.9231
0.1899 200.0 66600 0.3433 0.9229
0.231 201.0 66933 0.3435 0.9218
0.2002 202.0 67266 0.3430 0.9219
0.2062 203.0 67599 0.3435 0.921
0.2138 204.0 67932 0.3486 0.9214
0.1742 205.0 68265 0.3454 0.9224
0.2116 206.0 68598 0.3459 0.9212
0.1906 207.0 68931 0.3475 0.9202
0.2253 208.0 69264 0.3447 0.9227
0.1944 209.0 69597 0.3476 0.9209
0.2114 210.0 69930 0.3473 0.9211
0.1959 211.0 70263 0.3463 0.9213
0.1967 212.0 70596 0.3454 0.9235
0.221 213.0 70929 0.3500 0.9239
0.194 214.0 71262 0.3468 0.9231
0.2055 215.0 71595 0.3455 0.9221
0.216 216.0 71928 0.3482 0.9215
0.1887 217.0 72261 0.3495 0.9218
0.2043 218.0 72594 0.3458 0.9237
0.2022 219.0 72927 0.3428 0.9224
0.1834 220.0 73260 0.3419 0.9222
0.1835 221.0 73593 0.3447 0.9231
0.2194 222.0 73926 0.3472 0.9218
0.1775 223.0 74259 0.3466 0.922
0.1781 224.0 74592 0.3505 0.9233
0.2001 225.0 74925 0.3477 0.9226
0.185 226.0 75258 0.3469 0.9229
0.2079 227.0 75591 0.3465 0.9228
0.1709 228.0 75924 0.3485 0.9217
0.2041 229.0 76257 0.3476 0.9213
0.1793 230.0 76590 0.3505 0.9222
0.193 231.0 76923 0.3483 0.9234
0.2192 232.0 77256 0.3490 0.9213
0.2017 233.0 77589 0.3481 0.9219
0.1883 234.0 77922 0.3479 0.9218
0.1682 235.0 78255 0.3476 0.9214
0.1702 236.0 78588 0.3474 0.922
0.2109 237.0 78921 0.3499 0.9221
0.1768 238.0 79254 0.3459 0.9211
0.1731 239.0 79587 0.3480 0.9222
0.1834 240.0 79920 0.3479 0.9216
0.2182 241.0 80253 0.3484 0.9218
0.2084 242.0 80586 0.3515 0.9222
0.2006 243.0 80919 0.3499 0.9223
0.221 244.0 81252 0.3502 0.9223
0.1835 245.0 81585 0.3526 0.9212
0.2469 246.0 81918 0.3473 0.9215
0.1844 247.0 82251 0.3473 0.9228
0.1972 248.0 82584 0.3493 0.9213
0.1821 249.0 82917 0.3503 0.9212
0.2 250.0 83250 0.3518 0.9213
0.1888 251.0 83583 0.3509 0.9219
0.2034 252.0 83916 0.3488 0.9207
0.2062 253.0 84249 0.3464 0.9217
0.1906 254.0 84582 0.3480 0.9224
0.1996 255.0 84915 0.3481 0.9219
0.2447 256.0 85248 0.3485 0.9217
0.1975 257.0 85581 0.3509 0.9215
0.1787 258.0 85914 0.3497 0.9203
0.1599 259.0 86247 0.3530 0.9208
0.2455 260.0 86580 0.3507 0.9206
0.2159 261.0 86913 0.3510 0.9224
0.2032 262.0 87246 0.3502 0.9215
0.1453 263.0 87579 0.3501 0.9225
0.1922 264.0 87912 0.3494 0.9235
0.2038 265.0 88245 0.3481 0.9229
0.1897 266.0 88578 0.3492 0.923
0.1941 267.0 88911 0.3504 0.9237
0.197 268.0 89244 0.3504 0.923
0.1933 269.0 89577 0.3485 0.9227
0.1585 270.0 89910 0.3488 0.9237
0.1994 271.0 90243 0.3488 0.9223
0.1562 272.0 90576 0.3482 0.922
0.1804 273.0 90909 0.3487 0.9214
0.2202 274.0 91242 0.3509 0.9215
0.1804 275.0 91575 0.3502 0.9227
0.1542 276.0 91908 0.3496 0.9229
0.1744 277.0 92241 0.3486 0.9226
0.1779 278.0 92574 0.3483 0.9228
0.1396 279.0 92907 0.3495 0.9228
0.1501 280.0 93240 0.3484 0.9232
0.1808 281.0 93573 0.3503 0.9227
0.1749 282.0 93906 0.3492 0.9218
0.2295 283.0 94239 0.3493 0.9216
0.1695 284.0 94572 0.3491 0.9219
0.1859 285.0 94905 0.3502 0.9222
0.1891 286.0 95238 0.3505 0.9226
0.1681 287.0 95571 0.3513 0.9222
0.1837 288.0 95904 0.3512 0.9217
0.2181 289.0 96237 0.3512 0.9224
0.1637 290.0 96570 0.3514 0.9219
0.1808 291.0 96903 0.3511 0.922
0.1935 292.0 97236 0.3511 0.9226
0.2022 293.0 97569 0.3508 0.9229
0.1708 294.0 97902 0.3505 0.923
0.1924 295.0 98235 0.3508 0.9228
0.1775 296.0 98568 0.3511 0.9224
0.1681 297.0 98901 0.3509 0.9224
0.1528 298.0 99234 0.3504 0.9225
0.1978 299.0 99567 0.3503 0.9226
0.1826 300.0 99900 0.3503 0.9226
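
The final row corresponds to the reported evaluation numbers. A minimal sketch for re-checking test accuracy (assuming the Hugging Face cifar100 test split and that the model's labels follow its fine_label column; unverified):

```python
import torch
from datasets import load_dataset
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jialicheng/cifar100-swin-base"  # assumed hub id for this checkpoint
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo).eval()

ds = load_dataset("cifar100", split="test")  # columns: img, fine_label, coarse_label

correct = 0
with torch.no_grad():
    for i in range(0, len(ds), 256):  # eval_batch_size from the card
        batch = ds[i : i + 256]
        inputs = processor(images=batch["img"], return_tensors="pt")
        preds = model(**inputs).logits.argmax(-1)
        correct += (preds == torch.tensor(batch["fine_label"])).sum().item()

print(correct / len(ds))  # expected to land near the reported 0.9226
```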

Framework versions

  • Transformers 4.39.3
  • PyTorch 2.2.2+cu118
  • Datasets 2.18.0
  • Tokenizers 0.15.2