CTMAE-P2-V3-3G-S2

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9771
  • Accuracy: 0.8
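
No usage example is provided upstream, so the snippet below is a minimal inference sketch, assuming the standard `transformers` VideoMAE video-classification API and 16-frame clips (the default for the base Kinetics checkpoint). The random frames are placeholders for a real decoded video.

```python
# Minimal inference sketch (assumptions: 16-frame clips; random frames stand in for real video).
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

ckpt = "beingbatman/CTMAE-P2-V3-3G-S2"
processor = AutoImageProcessor.from_pretrained(ckpt)  # assumes the checkpoint ships a preprocessor config
model = VideoMAEForVideoClassification.from_pretrained(ckpt)
model.eval()

# Placeholder clip: 16 RGB frames of 224x224. Replace with frames decoded from a real video.
video = list(np.random.randint(0, 256, (16, 224, 224, 3), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```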

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 3250
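
The listed values translate into `transformers` `TrainingArguments` roughly as follows. This is a reconstruction from the reported hyperparameters, not the original training script; the output path is a placeholder, and the rest of the `Trainer` wiring (dataset, collator, metrics) is omitted.

```python
# Sketch of the corresponding TrainingArguments; paths and dataset wiring are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="CTMAE-P2-V3-3G-S2",   # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",              # betas=(0.9, 0.999) and eps=1e-08 are the adamw_torch defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=3250,                   # training_steps above
)
```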

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6262        | 0.0203  | 66   | 0.8422          | 0.4      |
| 0.5961        | 1.0203  | 132  | 0.9177          | 0.4      |
| 0.6691        | 2.0203  | 198  | 0.8617          | 0.4      |
| 0.6242        | 3.0203  | 264  | 0.9506          | 0.4      |
| 0.631         | 4.0203  | 330  | 0.7678          | 0.4      |
| 0.6747        | 5.0203  | 396  | 0.6632          | 0.6      |
| 0.6345        | 6.0203  | 462  | 0.7091          | 0.5556   |
| 0.6357        | 7.0203  | 528  | 1.0969          | 0.5111   |
| 0.4717        | 8.0203  | 594  | 0.6719          | 0.6444   |
| 0.827         | 9.0203  | 660  | 0.5489          | 0.7778   |
| 0.3107        | 10.0203 | 726  | 0.7150          | 0.6889   |
| 0.3366        | 11.0203 | 792  | 0.7248          | 0.7111   |
| 0.8919        | 12.0203 | 858  | 0.6667          | 0.7778   |
| 0.4823        | 13.0203 | 924  | 2.1050          | 0.4222   |
| 0.3742        | 14.0203 | 990  | 1.0017          | 0.5778   |
| 0.3399        | 15.0203 | 1056 | 1.5679          | 0.5556   |
| 0.6571        | 16.0203 | 1122 | 1.3521          | 0.6      |
| 0.2434        | 17.0203 | 1188 | 0.7812          | 0.7778   |
| 0.5967        | 18.0203 | 1254 | 0.9575          | 0.6889   |
| 0.1982        | 19.0203 | 1320 | 1.1721          | 0.6667   |
| 0.2631        | 20.0203 | 1386 | 2.5733          | 0.4667   |
| 0.3235        | 21.0203 | 1452 | 0.9771          | 0.8      |
| 0.1786        | 22.0203 | 1518 | 1.1978          | 0.7111   |
| 0.1352        | 23.0203 | 1584 | 0.8692          | 0.7778   |
| 0.1709        | 24.0203 | 1650 | 1.1424          | 0.7556   |
| 0.0873        | 25.0203 | 1716 | 1.8760          | 0.6222   |
| 0.1418        | 26.0203 | 1782 | 1.0964          | 0.8      |
| 0.0075        | 27.0203 | 1848 | 1.9130          | 0.6222   |
| 0.4534        | 28.0203 | 1914 | 1.1176          | 0.7778   |
| 0.0019        | 29.0203 | 1980 | 2.0684          | 0.6222   |
| 0.4098        | 30.0203 | 2046 | 1.9198          | 0.6667   |
| 0.0006        | 31.0203 | 2112 | 1.2724          | 0.7333   |
| 0.1891        | 32.0203 | 2178 | 1.8213          | 0.6444   |
| 0.0604        | 33.0203 | 2244 | 2.6845          | 0.5556   |
| 0.104         | 34.0203 | 2310 | 2.7468          | 0.6222   |
| 0.028         | 35.0203 | 2376 | 1.4458          | 0.7333   |
| 0.2479        | 36.0203 | 2442 | 2.1613          | 0.6222   |
| 0.0967        | 37.0203 | 2508 | 1.3895          | 0.7778   |
| 0.0668        | 38.0203 | 2574 | 2.0147          | 0.6667   |
| 0.0004        | 39.0203 | 2640 | 1.5766          | 0.6889   |
| 0.0027        | 40.0203 | 2706 | 2.3533          | 0.6444   |
| 0.1436        | 41.0203 | 2772 | 2.1496          | 0.6444   |
| 0.0327        | 42.0203 | 2838 | 2.2866          | 0.6444   |
| 0.0349        | 43.0203 | 2904 | 2.2496          | 0.6667   |
| 0.1178        | 44.0203 | 2970 | 1.8929          | 0.6889   |
| 0.0001        | 45.0203 | 3036 | 1.9030          | 0.7111   |
| 0.1289        | 46.0203 | 3102 | 1.9212          | 0.6889   |
| 0.137         | 47.0203 | 3168 | 1.6077          | 0.7556   |
| 0.0004        | 48.0203 | 3234 | 1.7398          | 0.7333   |
| 0.0001        | 49.0049 | 3250 | 1.7423          | 0.7333   |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0