hushem_1x_beit_base_sgd_001_fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on an imagefolder dataset. It achieves the following results on the evaluation set (a brief inference sketch follows the results):

  • Loss: 1.3101
  • Accuracy: 0.4390
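
The snippet below is a minimal inference sketch for this checkpoint using the Transformers Auto classes. The input file name is a placeholder, and the predicted label names come from whatever imagefolder classes the model was fine-tuned on.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "hkivancoral/hushem_1x_beit_base_sgd_001_fold5"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```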

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
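
As a rough guide, the settings above map onto TrainingArguments as shown below; the output directory and the per-epoch evaluation/save strategies are assumptions, not values read from the original training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_beit_base_sgd_001_fold5",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: metrics below are reported once per epoch
    save_strategy="epoch",        # assumed
)
# The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) match the
# Trainer defaults, so no explicit optimizer arguments are set here.
```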

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.5856          | 0.2439   |
| 1.5446        | 2.0   | 12   | 1.5524          | 0.2683   |
| 1.5446        | 3.0   | 18   | 1.5246          | 0.3171   |
| 1.4921        | 4.0   | 24   | 1.5015          | 0.3171   |
| 1.4491        | 5.0   | 30   | 1.4859          | 0.3415   |
| 1.4491        | 6.0   | 36   | 1.4721          | 0.3415   |
| 1.4253        | 7.0   | 42   | 1.4615          | 0.3415   |
| 1.4253        | 8.0   | 48   | 1.4471          | 0.3659   |
| 1.3656        | 9.0   | 54   | 1.4347          | 0.3902   |
| 1.3889        | 10.0  | 60   | 1.4270          | 0.3902   |
| 1.3889        | 11.0  | 66   | 1.4192          | 0.4146   |
| 1.3303        | 12.0  | 72   | 1.4108          | 0.4146   |
| 1.3303        | 13.0  | 78   | 1.4040          | 0.4146   |
| 1.3227        | 14.0  | 84   | 1.3958          | 0.4146   |
| 1.3003        | 15.0  | 90   | 1.3889          | 0.4146   |
| 1.3003        | 16.0  | 96   | 1.3827          | 0.4146   |
| 1.3072        | 17.0  | 102  | 1.3788          | 0.3902   |
| 1.3072        | 18.0  | 108  | 1.3733          | 0.4146   |
| 1.2978        | 19.0  | 114  | 1.3664          | 0.4390   |
| 1.268         | 20.0  | 120  | 1.3623          | 0.4390   |
| 1.268         | 21.0  | 126  | 1.3569          | 0.4390   |
| 1.265         | 22.0  | 132  | 1.3511          | 0.4390   |
| 1.265         | 23.0  | 138  | 1.3470          | 0.4634   |
| 1.2559        | 24.0  | 144  | 1.3424          | 0.4390   |
| 1.2443        | 25.0  | 150  | 1.3395          | 0.4146   |
| 1.2443        | 26.0  | 156  | 1.3357          | 0.4390   |
| 1.2468        | 27.0  | 162  | 1.3318          | 0.4390   |
| 1.2468        | 28.0  | 168  | 1.3281          | 0.4390   |
| 1.2381        | 29.0  | 174  | 1.3262          | 0.4390   |
| 1.2466        | 30.0  | 180  | 1.3249          | 0.4146   |
| 1.2466        | 31.0  | 186  | 1.3215          | 0.4390   |
| 1.234         | 32.0  | 192  | 1.3185          | 0.4390   |
| 1.234         | 33.0  | 198  | 1.3170          | 0.4390   |
| 1.2144        | 34.0  | 204  | 1.3158          | 0.4390   |
| 1.2407        | 35.0  | 210  | 1.3143          | 0.4390   |
| 1.2407        | 36.0  | 216  | 1.3132          | 0.4390   |
| 1.2238        | 37.0  | 222  | 1.3125          | 0.4390   |
| 1.2238        | 38.0  | 228  | 1.3116          | 0.4390   |
| 1.221         | 39.0  | 234  | 1.3110          | 0.4390   |
| 1.1985        | 40.0  | 240  | 1.3104          | 0.4390   |
| 1.1985        | 41.0  | 246  | 1.3101          | 0.4390   |
| 1.2078        | 42.0  | 252  | 1.3101          | 0.4390   |
| 1.2078        | 43.0  | 258  | 1.3101          | 0.4390   |
| 1.1965        | 44.0  | 264  | 1.3101          | 0.4390   |
| 1.2151        | 45.0  | 270  | 1.3101          | 0.4390   |
| 1.2151        | 46.0  | 276  | 1.3101          | 0.4390   |
| 1.2187        | 47.0  | 282  | 1.3101          | 0.4390   |
| 1.2187        | 48.0  | 288  | 1.3101          | 0.4390   |
| 1.1908        | 49.0  | 294  | 1.3101          | 0.4390   |
| 1.1985        | 50.0  | 300  | 1.3101          | 0.4390   |
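
The accuracy column is consistent with a standard argmax-based metric hook passed to the Trainer; the sketch below shows one common way to wire that up with the evaluate library (an assumption, not the exact code used for this run).

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair supplied by the Trainer
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```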

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
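
To approximate this environment, the versions above can be pinned directly; note that the run used the CUDA 11.8 build of PyTorch (2.1.0+cu118), which is normally installed from the PyTorch wheel index rather than PyPI.

```bash
pip install transformers==4.35.2 datasets==2.15.0 tokenizers==0.15.0
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu118
```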