---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: realFake-img
    results: []
---

# realFake-img

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0988
- Accuracy: 0.9785
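
The card does not include usage code; the snippet below is a minimal inference sketch. The repository id `itsLeen/realFake-img` and the example image path are assumptions taken from the card's title and author, not from an original usage section.

```python
from transformers import pipeline

# Assumed Hub repository id; adjust if the model is hosted under a different path.
classifier = pipeline("image-classification", model="itsLeen/realFake-img")

# "example.jpg" is a placeholder for a local image file or URL.
predictions = classifier("example.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```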

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
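
For orientation, the list above maps onto `transformers.TrainingArguments` roughly as sketched below. This is an illustrative reconstruction, not the original training script: the output directory, the evaluation cadence (inferred from the 100-step intervals in the results table), and the commented-out `Trainer` wiring are assumptions.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="realFake-img",   # assumed output directory
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                   # Native AMP mixed-precision training
    eval_strategy="steps",       # inferred: the results table logs an eval every 100 steps
    eval_steps=100,
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default
# optimizer settings, so no extra optimizer arguments are needed here.

# A Trainer would then be assembled with the (not shown) model, datasets, image
# processor, and metric function, e.g.:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```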

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.2578        | 0.2525 | 100  | 0.1594          | 0.9418   |
| 0.0944        | 0.5051 | 200  | 0.2243          | 0.9373   |
| 0.1747        | 0.7576 | 300  | 0.2472          | 0.9293   |
| 0.1328        | 1.0101 | 400  | 0.1774          | 0.9338   |
| 0.1918        | 1.2626 | 500  | 0.1282          | 0.9570   |
| 0.169         | 1.5152 | 600  | 0.2247          | 0.9346   |
| 0.2595        | 1.7677 | 700  | 0.1785          | 0.9445   |
| 0.0911        | 2.0202 | 800  | 0.1353          | 0.9534   |
| 0.0548        | 2.2727 | 900  | 0.1998          | 0.9472   |
| 0.1399        | 2.5253 | 1000 | 0.1971          | 0.9445   |
| 0.2001        | 2.7778 | 1100 | 0.2479          | 0.9373   |
| 0.0976        | 3.0303 | 1200 | 0.1601          | 0.9499   |
| 0.1291        | 3.2828 | 1300 | 0.1607          | 0.9588   |
| 0.0721        | 3.5354 | 1400 | 0.1822          | 0.9588   |
| 0.0592        | 3.7879 | 1500 | 0.1255          | 0.9624   |
| 0.0964        | 4.0404 | 1600 | 0.1620          | 0.9543   |
| 0.0738        | 4.2929 | 1700 | 0.1279          | 0.9651   |
| 0.0504        | 4.5455 | 1800 | 0.1624          | 0.9588   |
| 0.0972        | 4.7980 | 1900 | 0.1579          | 0.9624   |
| 0.0456        | 5.0505 | 2000 | 0.1965          | 0.9490   |
| 0.0334        | 5.3030 | 2100 | 0.1652          | 0.9570   |
| 0.0242        | 5.5556 | 2200 | 0.1182          | 0.9749   |
| 0.0715        | 5.8081 | 2300 | 0.1250          | 0.9651   |
| 0.0407        | 6.0606 | 2400 | 0.1172          | 0.9696   |
| 0.0003        | 6.3131 | 2500 | 0.0819          | 0.9785   |
| 0.0072        | 6.5657 | 2600 | 0.1406          | 0.9714   |
| 0.0183        | 6.8182 | 2700 | 0.1152          | 0.9749   |
| 0.0021        | 7.0707 | 2800 | 0.1368          | 0.9731   |
| 0.046         | 7.3232 | 2900 | 0.0900          | 0.9794   |
| 0.033         | 7.5758 | 3000 | 0.1014          | 0.9785   |
| 0.0354        | 7.8283 | 3100 | 0.0968          | 0.9767   |
| 0.0026        | 8.0808 | 3200 | 0.1217          | 0.9731   |
| 0.0002        | 8.3333 | 3300 | 0.0828          | 0.9794   |
| 0.0006        | 8.5859 | 3400 | 0.0926          | 0.9794   |
| 0.0006        | 8.8384 | 3500 | 0.1001          | 0.9794   |
| 0.0006        | 9.0909 | 3600 | 0.0863          | 0.9848   |
| 0.0633        | 9.3434 | 3700 | 0.0911          | 0.9803   |
| 0.0009        | 9.5960 | 3800 | 0.0941          | 0.9821   |
| 0.0247        | 9.8485 | 3900 | 0.0988          | 0.9785   |
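
The accuracy column above is plain top-1 classification accuracy. How the metric was computed is not recorded in the card; the snippet below is a generic sketch of the usual `compute_metrics` hook built on the `evaluate` library, and the function name and library choice are assumptions rather than details recovered from the original training script.

```python
import numpy as np
import evaluate

# Generic accuracy metric; the Trainer calls compute_metrics with (logits, labels).
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # pick the highest-scoring class
    return accuracy.compute(predictions=predictions, references=labels)
```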

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1