hushem_1x_beit_base_sgd_0001_fold4

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 1.4354
  • Accuracy: 0.2857
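
The card itself does not include usage code, so the snippet below is only a minimal sketch of loading this checkpoint for image classification with the transformers pipeline API. The repository id is assumed to match the card title (hkivancoral/hushem_1x_beit_base_sgd_0001_fold4), and the image path is a placeholder.

```python
# Minimal sketch: load the fine-tuned BEiT checkpoint and classify a single image.
# Assumption: the checkpoint is published as hkivancoral/hushem_1x_beit_base_sgd_0001_fold4;
# "example.jpg" is a placeholder for an image from the target domain.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="hkivancoral/hushem_1x_beit_base_sgd_0001_fold4",
)

image = Image.open("example.jpg")  # replace with a real input image
predictions = classifier(image)    # list of {"label": ..., "score": ...} dicts
print(predictions)
```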

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
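
As a rough illustration, the hyperparameters above map onto transformers TrainingArguments as sketched below. Only the listed values come from this card; the output directory, the per-epoch evaluation setting, and the rest of the Trainer wiring (model, datasets, metrics) are assumptions.

```python
# Sketch of how the listed hyperparameters translate to transformers TrainingArguments.
# output_dir and evaluation_strategy are assumptions; all other values are from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_beit_base_sgd_0001_fold4",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the results table logs validation once per epoch
)
```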

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.4991          | 0.2857   |
| 1.6033        | 2.0   | 12   | 1.4950          | 0.2857   |
| 1.6033        | 3.0   | 18   | 1.4913          | 0.2857   |
| 1.56          | 4.0   | 24   | 1.4878          | 0.2857   |
| 1.5624        | 5.0   | 30   | 1.4847          | 0.2857   |
| 1.5624        | 6.0   | 36   | 1.4813          | 0.2857   |
| 1.5652        | 7.0   | 42   | 1.4787          | 0.2857   |
| 1.5652        | 8.0   | 48   | 1.4758          | 0.2857   |
| 1.5644        | 9.0   | 54   | 1.4729          | 0.2857   |
| 1.5444        | 10.0  | 60   | 1.4702          | 0.2857   |
| 1.5444        | 11.0  | 66   | 1.4678          | 0.2857   |
| 1.5307        | 12.0  | 72   | 1.4653          | 0.2857   |
| 1.5307        | 13.0  | 78   | 1.4629          | 0.2857   |
| 1.518         | 14.0  | 84   | 1.4606          | 0.2857   |
| 1.5309        | 15.0  | 90   | 1.4585          | 0.2857   |
| 1.5309        | 16.0  | 96   | 1.4564          | 0.2857   |
| 1.5513        | 17.0  | 102  | 1.4547          | 0.2857   |
| 1.5513        | 18.0  | 108  | 1.4530          | 0.2857   |
| 1.5683        | 19.0  | 114  | 1.4515          | 0.2857   |
| 1.533         | 20.0  | 120  | 1.4498          | 0.2857   |
| 1.533         | 21.0  | 126  | 1.4484          | 0.2857   |
| 1.5308        | 22.0  | 132  | 1.4473          | 0.2857   |
| 1.5308        | 23.0  | 138  | 1.4462          | 0.2857   |
| 1.5033        | 24.0  | 144  | 1.4450          | 0.2857   |
| 1.4859        | 25.0  | 150  | 1.4438          | 0.2857   |
| 1.4859        | 26.0  | 156  | 1.4427          | 0.2857   |
| 1.5126        | 27.0  | 162  | 1.4416          | 0.2857   |
| 1.5126        | 28.0  | 168  | 1.4408          | 0.2857   |
| 1.5334        | 29.0  | 174  | 1.4400          | 0.2857   |
| 1.5073        | 30.0  | 180  | 1.4393          | 0.2857   |
| 1.5073        | 31.0  | 186  | 1.4387          | 0.2857   |
| 1.4951        | 32.0  | 192  | 1.4381          | 0.2857   |
| 1.4951        | 33.0  | 198  | 1.4376          | 0.2857   |
| 1.5148        | 34.0  | 204  | 1.4371          | 0.2857   |
| 1.5182        | 35.0  | 210  | 1.4366          | 0.2857   |
| 1.5182        | 36.0  | 216  | 1.4363          | 0.2857   |
| 1.5025        | 37.0  | 222  | 1.4360          | 0.2857   |
| 1.5025        | 38.0  | 228  | 1.4357          | 0.2857   |
| 1.5134        | 39.0  | 234  | 1.4356          | 0.2857   |
| 1.5013        | 40.0  | 240  | 1.4354          | 0.2857   |
| 1.5013        | 41.0  | 246  | 1.4354          | 0.2857   |
| 1.501         | 42.0  | 252  | 1.4354          | 0.2857   |
| 1.501         | 43.0  | 258  | 1.4354          | 0.2857   |
| 1.5486        | 44.0  | 264  | 1.4354          | 0.2857   |
| 1.4725        | 45.0  | 270  | 1.4354          | 0.2857   |
| 1.4725        | 46.0  | 276  | 1.4354          | 0.2857   |
| 1.4918        | 47.0  | 282  | 1.4354          | 0.2857   |
| 1.4918        | 48.0  | 288  | 1.4354          | 0.2857   |
| 1.5174        | 49.0  | 294  | 1.4354          | 0.2857   |
| 1.5195        | 50.0  | 300  | 1.4354          | 0.2857   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0