
smids_5x_beit_base_adamax_001_fold4

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9816
  • Accuracy: 0.7917
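
A minimal inference sketch, assuming the checkpoint is hosted on the Hub as hkivancoral/smids_5x_beit_base_adamax_001_fold4; the image path is a hypothetical placeholder:

```python
from transformers import pipeline
from PIL import Image

# Hypothetical local image; replace with an image from the target domain.
image = Image.open("example.png").convert("RGB")

# Load the fine-tuned BEiT image classifier from the Hub.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_5x_beit_base_adamax_001_fold4",
)

# Returns a list of {"label": ..., "score": ...} dicts, highest score first.
print(classifier(image))
```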

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
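
The same settings expressed as a transformers TrainingArguments sketch; the output directory and evaluation strategy are assumptions and not taken from the original training script:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above. The card reports Adam betas and
# epsilon, so they are passed via the adam_* arguments, even though the model
# name suggests an Adamax variant was used.
training_args = TrainingArguments(
    output_dir="smids_5x_beit_base_adamax_001_fold4",  # hypothetical
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: metrics below are reported per epoch
)
```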

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.8866        | 1.0   | 375   | 0.8458          | 0.5233   |
| 0.8299        | 2.0   | 750   | 0.7752          | 0.57     |
| 0.7493        | 3.0   | 1125  | 0.7658          | 0.615    |
| 0.7146        | 4.0   | 1500  | 0.7493          | 0.6417   |
| 0.7017        | 5.0   | 1875  | 0.7203          | 0.6817   |
| 0.763         | 6.0   | 2250  | 0.6999          | 0.68     |
| 0.6756        | 7.0   | 2625  | 0.6915          | 0.6967   |
| 0.7133        | 8.0   | 3000  | 0.7086          | 0.6683   |
| 0.6524        | 9.0   | 3375  | 0.6683          | 0.6883   |
| 0.7431        | 10.0  | 3750  | 0.6664          | 0.7017   |
| 0.6733        | 11.0  | 4125  | 0.6953          | 0.675    |
| 0.6611        | 12.0  | 4500  | 0.6840          | 0.6933   |
| 0.6809        | 13.0  | 4875  | 0.6889          | 0.6717   |
| 0.6678        | 14.0  | 5250  | 0.6820          | 0.6917   |
| 0.6888        | 15.0  | 5625  | 0.6381          | 0.7083   |
| 0.6174        | 16.0  | 6000  | 0.6105          | 0.7417   |
| 0.6389        | 17.0  | 6375  | 0.6744          | 0.7033   |
| 0.6613        | 18.0  | 6750  | 0.6040          | 0.7417   |
| 0.6398        | 19.0  | 7125  | 0.6119          | 0.7317   |
| 0.6011        | 20.0  | 7500  | 0.5771          | 0.7517   |
| 0.5356        | 21.0  | 7875  | 0.6886          | 0.7033   |
| 0.5588        | 22.0  | 8250  | 0.6145          | 0.7433   |
| 0.5597        | 23.0  | 8625  | 0.6084          | 0.7383   |
| 0.5734        | 24.0  | 9000  | 0.5790          | 0.7633   |
| 0.5451        | 25.0  | 9375  | 0.5688          | 0.7567   |
| 0.5084        | 26.0  | 9750  | 0.5594          | 0.765    |
| 0.4925        | 27.0  | 10125 | 0.6035          | 0.7633   |
| 0.4449        | 28.0  | 10500 | 0.5736          | 0.7717   |
| 0.4935        | 29.0  | 10875 | 0.5611          | 0.76     |
| 0.518         | 30.0  | 11250 | 0.6001          | 0.75     |
| 0.5374        | 31.0  | 11625 | 0.5726          | 0.7867   |
| 0.4582        | 32.0  | 12000 | 0.5878          | 0.7717   |
| 0.4449        | 33.0  | 12375 | 0.6022          | 0.77     |
| 0.4991        | 34.0  | 12750 | 0.5950          | 0.7833   |
| 0.3652        | 35.0  | 13125 | 0.5638          | 0.8033   |
| 0.4263        | 36.0  | 13500 | 0.5959          | 0.7883   |
| 0.4604        | 37.0  | 13875 | 0.6072          | 0.8      |
| 0.4152        | 38.0  | 14250 | 0.6172          | 0.8033   |
| 0.3735        | 39.0  | 14625 | 0.6726          | 0.79     |
| 0.3648        | 40.0  | 15000 | 0.6751          | 0.7933   |
| 0.3489        | 41.0  | 15375 | 0.6954          | 0.7833   |
| 0.235         | 42.0  | 15750 | 0.7474          | 0.7767   |
| 0.2834        | 43.0  | 16125 | 0.7611          | 0.7933   |
| 0.2126        | 44.0  | 16500 | 0.7672          | 0.7917   |
| 0.2122        | 45.0  | 16875 | 0.8481          | 0.7683   |
| 0.1955        | 46.0  | 17250 | 0.8595          | 0.795    |
| 0.1764        | 47.0  | 17625 | 0.8929          | 0.7917   |
| 0.2086        | 48.0  | 18000 | 0.9496          | 0.79     |
| 0.175         | 49.0  | 18375 | 0.9722          | 0.7917   |
| 0.152         | 50.0  | 18750 | 0.9816          | 0.7917   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2