CTMAE-P2-V2-S3

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3307
  • Accuracy: 0.7826
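The base checkpoint is a VideoMAE-large video classifier. As a hedged sketch of what inputs it expects (assuming the standard VideoMAE-large preprocessing of 16 frames at 224×224 RGB, patch size 16, and tubelet size 2 — none of which is stated in this card), the input shape and encoder token count work out as:

```python
# Illustrative only: input geometry assumed from the standard
# VideoMAE-large configuration, not confirmed by this model card.
num_frames = 16      # frames sampled per clip
image_size = 224     # height and width after resizing
patch_size = 16      # spatial patch size
tubelet_size = 2     # frames grouped into one temporal "tube"

# A single clip is passed as (num_frames, channels, height, width).
input_shape = (num_frames, 3, image_size, image_size)

# Tokens seen by the encoder: temporal tubes x spatial patches per frame.
num_patches = (num_frames // tubelet_size) * (image_size // patch_size) ** 2

print(input_shape)  # (16, 3, 224, 224)
print(num_patches)  # 1568
```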

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 3250
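With a warmup ratio of 0.1 over 3250 training steps, the schedule warms up for 325 steps and then decays linearly to zero. A minimal sketch of that schedule (mirroring the behavior of a linear-with-warmup scheduler; the function below is illustrative, not taken from the training code):

```python
def linear_lr(step, base_lr=1e-5, total_steps=3250, warmup_ratio=0.1):
    """Linear warmup to base_lr, then linear decay to 0 (illustrative)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 325 steps here
    if step < warmup_steps:
        # Ramp up proportionally during warmup.
        return base_lr * step / warmup_steps
    # Decay from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(0))     # 0.0
print(linear_lr(325))   # 1e-05 (peak, end of warmup)
print(linear_lr(3250))  # 0.0
```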

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6138        | 0.02  | 65   | 0.7342          | 0.5435   |
| 0.5607        | 1.02  | 130  | 0.7364          | 0.5435   |
| 0.5525        | 2.02  | 195  | 0.9435          | 0.5435   |
| 0.6208        | 3.02  | 260  | 0.7286          | 0.5870   |
| 0.6099        | 4.02  | 325  | 0.7308          | 0.5435   |
| 0.4065        | 5.02  | 390  | 1.0573          | 0.5435   |
| 0.4338        | 6.02  | 455  | 0.6411          | 0.5870   |
| 0.4706        | 7.02  | 520  | 0.5360          | 0.6957   |
| 0.4147        | 8.02  | 585  | 1.0753          | 0.5870   |
| 0.6016        | 9.02  | 650  | 1.1059          | 0.5652   |
| 0.8252        | 10.02 | 715  | 0.5255          | 0.6739   |
| 0.3944        | 11.02 | 780  | 0.8291          | 0.5870   |
| 0.8652        | 12.02 | 845  | 0.7266          | 0.5652   |
| 0.4453        | 13.02 | 910  | 0.8914          | 0.6304   |
| 0.3375        | 14.02 | 975  | 1.4725          | 0.5870   |
| 0.5423        | 15.02 | 1040 | 0.6293          | 0.6739   |
| 0.4718        | 16.02 | 1105 | 0.9326          | 0.5870   |
| 0.4305        | 17.02 | 1170 | 0.6471          | 0.6957   |
| 0.1434        | 18.02 | 1235 | 1.1059          | 0.6739   |
| 0.3129        | 19.02 | 1300 | 0.7987          | 0.7391   |
| 0.2712        | 20.02 | 1365 | 0.9583          | 0.6957   |
| 0.5669        | 21.02 | 1430 | 1.9777          | 0.5652   |
| 0.3252        | 22.02 | 1495 | 0.9683          | 0.6739   |
| 0.1086        | 23.02 | 1560 | 1.0589          | 0.7391   |
| 0.2289        | 24.02 | 1625 | 1.0725          | 0.7174   |
| 0.2527        | 25.02 | 1690 | 1.0045          | 0.7609   |
| 0.433         | 26.02 | 1755 | 1.1574          | 0.6304   |
| 0.3203        | 27.02 | 1820 | 1.0995          | 0.7174   |
| 0.65          | 28.02 | 1885 | 1.4326          | 0.6957   |
| 0.1041        | 29.02 | 1950 | 1.2175          | 0.7174   |
| 0.0569        | 30.02 | 2015 | 1.4499          | 0.7174   |
| 0.2142        | 31.02 | 2080 | 1.3656          | 0.7609   |
| 0.343         | 32.02 | 2145 | 1.3127          | 0.7609   |
| 0.0331        | 33.02 | 2210 | 1.5137          | 0.6957   |
| 0.1634        | 34.02 | 2275 | 1.4774          | 0.7174   |
| 0.5358        | 35.02 | 2340 | 1.5174          | 0.6739   |
| 0.0396        | 36.02 | 2405 | 1.4475          | 0.6957   |
| 0.7272        | 37.02 | 2470 | 1.6262          | 0.7174   |
| 0.3445        | 38.02 | 2535 | 1.7874          | 0.7174   |
| 0.2743        | 39.02 | 2600 | 1.5739          | 0.7174   |
| 0.2592        | 40.02 | 2665 | 1.6093          | 0.7174   |
| 0.1367        | 41.02 | 2730 | 1.5615          | 0.7174   |
| 0.0459        | 42.02 | 2795 | 1.5238          | 0.7174   |
| 0.0102        | 43.02 | 2860 | 1.4447          | 0.7174   |
| 0.0084        | 44.02 | 2925 | 1.3307          | 0.7826   |
| 0.106         | 45.02 | 2990 | 1.3326          | 0.7609   |
| 0.0796        | 46.02 | 3055 | 1.3509          | 0.7391   |
| 0.0004        | 47.02 | 3120 | 1.5341          | 0.6522   |
| 0.0004        | 48.02 | 3185 | 1.3930          | 0.7391   |
| 0.0033        | 49.02 | 3250 | 1.4388          | 0.7174   |
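The headline result (loss 1.3307, accuracy 0.7826) corresponds to the best row in the table above — epoch 44.02, step 2925 — rather than the final step. A small illustrative helper for selecting that row (only a few rows from the table are reproduced here; the structure is hypothetical):

```python
# Illustrative: pick the evaluation row with the highest accuracy.
# Rows are (epoch, step, val_loss, accuracy); subset of the table above.
eval_rows = [
    (43.02, 2860, 1.4447, 0.7174),
    (44.02, 2925, 1.3307, 0.7826),
    (45.02, 2990, 1.3326, 0.7609),
    (49.02, 3250, 1.4388, 0.7174),
]

# max() over the accuracy field selects the best checkpoint.
best = max(eval_rows, key=lambda row: row[3])
print(best)  # (44.02, 2925, 1.3307, 0.7826)
```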

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0