---
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
  - generated_from_trainer
model-index:
  - name: detr-amzss3-v2
    results: []
---

# detr-amzss3-v2

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.5846
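
As a quick, hedged usage sketch (not part of the original card): the checkpoint can be loaded like any DETR object-detection model from the Hub. The repo id `pallavJha/detr-amzss3-v2`, the example image path, and the 0.5 score threshold below are assumptions, not values confirmed by this card.

```python
# Hedged example: run object detection with the fine-tuned checkpoint.
# The repo id below is an assumption; adjust it to wherever the weights live.
import torch
from PIL import Image
from transformers import AutoImageProcessor, DetrForObjectDetection

repo_id = "pallavJha/detr-amzss3-v2"  # assumed repo id, not confirmed by the card
processor = AutoImageProcessor.from_pretrained(repo_id)
model = DetrForObjectDetection.from_pretrained(repo_id)

image = Image.open("example.jpg")  # any RGB image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) predictions above a chosen threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```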

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
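
For reference, the hyperparameters above can be expressed as a `TrainingArguments` configuration for the Hugging Face `Trainer`. This is a minimal sketch, assuming a steps-based evaluation cadence inferred from the results table; the output directory and logging settings are not stated in the card.

```python
# Minimal sketch: the listed hyperparameters expressed as TrainingArguments.
# output_dir, evaluation cadence, and logging cadence are assumptions, not stated in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-amzss3-v2",      # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=25,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",      # assumed; the results table reports eval every 1000 steps
    eval_steps=1000,
    logging_steps=2000,               # assumed; training loss in the table repeats every 2000 steps
)
```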

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log        | 0.54  | 1000  | 2.5308          |
| 2.824         | 1.08  | 2000  | 2.0484          |
| 2.824         | 1.62  | 3000  | 1.7408          |
| 1.8911        | 2.16  | 4000  | 1.5862          |
| 1.8911        | 2.7   | 5000  | 1.4858          |
| 1.594         | 3.24  | 6000  | 1.3551          |
| 1.594         | 3.78  | 7000  | 1.2802          |
| 1.4147        | 4.32  | 8000  | 1.2439          |
| 1.4147        | 4.86  | 9000  | 1.1548          |
| 1.2978        | 5.4   | 10000 | 1.1031          |
| 1.2978        | 5.94  | 11000 | 1.0674          |
| 1.1984        | 6.48  | 12000 | 1.0380          |
| 1.1086        | 7.02  | 13000 | 0.9949          |
| 1.1086        | 7.56  | 14000 | 0.9393          |
| 1.0383        | 8.1   | 15000 | 0.9204          |
| 1.0383        | 8.64  | 16000 | 0.8921          |
| 0.9817        | 9.18  | 17000 | 0.8670          |
| 0.9817        | 9.72  | 18000 | 0.8250          |
| 0.9277        | 10.26 | 19000 | 0.8084          |
| 0.9277        | 10.8  | 20000 | 0.7968          |
| 0.8864        | 11.34 | 21000 | 0.7928          |
| 0.8864        | 11.88 | 22000 | 0.7605          |
| 0.8525        | 12.42 | 23000 | 0.7602          |
| 0.8525        | 12.96 | 24000 | 0.7406          |
| 0.8197        | 13.5  | 25000 | 0.7224          |
| 0.7975        | 14.04 | 26000 | 0.7060          |
| 0.7975        | 14.58 | 27000 | 0.6893          |
| 0.7733        | 15.12 | 28000 | 0.6940          |
| 0.7733        | 15.66 | 29000 | 0.6836          |
| 0.7534        | 16.2  | 30000 | 0.6620          |
| 0.7534        | 16.74 | 31000 | 0.6584          |
| 0.7376        | 17.28 | 32000 | 0.6552          |
| 0.7376        | 17.82 | 33000 | 0.6487          |
| 0.7242        | 18.36 | 34000 | 0.6334          |
| 0.7242        | 18.9  | 35000 | 0.6319          |
| 0.7052        | 19.44 | 36000 | 0.6223          |
| 0.7052        | 19.98 | 37000 | 0.6155          |
| 0.6935        | 20.52 | 38000 | 0.6092          |
| 0.6816        | 21.06 | 39000 | 0.6079          |
| 0.6816        | 21.6  | 40000 | 0.6045          |
| 0.6747        | 22.14 | 41000 | 0.5997          |
| 0.6747        | 22.68 | 42000 | 0.6002          |
| 0.6693        | 23.22 | 43000 | 0.5924          |
| 0.6693        | 23.76 | 44000 | 0.5922          |
| 0.6608        | 24.3  | 45000 | 0.5861          |
| 0.6608        | 24.84 | 46000 | 0.5846          |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3