CTMAE2_CS_V7_1

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set (these correspond to the best validation checkpoint, step 13209 in the Training results table below):

  • Loss: 0.4206
  • Accuracy: 0.9130
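
Since the checkpoint is a VideoMAE-large video classifier, it can be loaded with the standard transformers video-classification API. The snippet below is a minimal inference sketch under assumptions: it assumes the checkpoint is published on the Hub as beingbatman/CTMAE2_CS_V7_1, and it uses a randomly generated clip as a stand-in for real video frames.

```python
# Minimal inference sketch. Assumes the checkpoint is available on the Hub as
# beingbatman/CTMAE2_CS_V7_1; the random clip below stands in for real video frames.
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

model_id = "beingbatman/CTMAE2_CS_V7_1"
processor = AutoImageProcessor.from_pretrained(model_id)
model = VideoMAEForVideoClassification.from_pretrained(model_id)
model.eval()

# VideoMAE expects a fixed-length clip (model.config.num_frames frames, 16 by default),
# given as a list of HxWx3 uint8 arrays.
num_frames = model.config.num_frames
clip = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(num_frames)]

inputs = processor(clip, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_label = model.config.id2label[logits.argmax(-1).item()]
print(predicted_label)
```

For real predictions, replace the random clip with frames sampled uniformly from an actual video (for example via decord or torchvision.io).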

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 38850
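
For reference, the hyperparameters above map onto transformers.TrainingArguments roughly as in the sketch below. This is a reconstruction under stated assumptions, not the original training script: the output directory, the every-777-step evaluation/save cadence, and the best-checkpoint selection are inferred from the results table, and the dataset and metric wiring are not documented in this card.

```python
# Hedged reconstruction of the TrainingArguments implied by the hyperparameters above.
# output_dir, the 777-step eval/save cadence, and best-checkpoint selection are assumptions
# inferred from the results table; the original training script is not published here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="CTMAE2_CS_V7_1",       # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=38850,
    eval_strategy="steps",             # assumed: the results table logs evaluation every 777 steps
    eval_steps=777,
    save_strategy="steps",
    save_steps=777,
    load_best_model_at_end=True,       # assumed: the reported eval metrics match the best checkpoint
    metric_for_best_model="accuracy",
)
# These arguments would then be passed to transformers.Trainer together with the
# (undocumented) train/eval datasets and an accuracy-based compute_metrics function.
```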

Training results

Training Loss | Epoch | Step  | Validation Loss | Accuracy
------------- | ----- | ----- | --------------- | --------
1.0024        | 0.02  | 777   | 2.8391          | 0.4565
1.8322        | 1.02  | 1554  | 2.3824          | 0.4565
0.8493        | 2.02  | 2331  | 1.1138          | 0.4565
0.4768        | 3.02  | 3108  | 1.7405          | 0.5
2.0611        | 4.02  | 3885  | 1.0070          | 0.7826
0.4943        | 5.02  | 4662  | 3.2575          | 0.4565
0.708         | 6.02  | 5439  | 2.1191          | 0.4565
0.5518        | 7.02  | 6216  | 1.4170          | 0.7609
1.2084        | 8.02  | 6993  | 0.6194          | 0.8478
1.4958        | 9.02  | 7770  | 0.6078          | 0.8478
1.7393        | 10.02 | 8547  | 0.7951          | 0.7609
1.0987        | 11.02 | 9324  | 0.7860          | 0.8261
0.8703        | 12.02 | 10101 | 0.9879          | 0.7826
1.0987        | 13.02 | 10878 | 1.0684          | 0.7826
0.0074        | 14.02 | 11655 | 1.2901          | 0.7391
1.39          | 15.02 | 12432 | 1.6628          | 0.6304
0.2423        | 16.02 | 13209 | 0.4206          | 0.9130
1.4765        | 17.02 | 13986 | 0.5211          | 0.8478
0.4732        | 18.02 | 14763 | 0.8912          | 0.8261
0.8663        | 19.02 | 15540 | 0.4561          | 0.8696
1.0169        | 20.02 | 16317 | 0.6229          | 0.8696
0.0059        | 21.02 | 17094 | 0.7054          | 0.8478
0.8826        | 22.02 | 17871 | 1.0915          | 0.7391
0.4632        | 23.02 | 18648 | 0.6940          | 0.8913
0.4901        | 24.02 | 19425 | 1.0199          | 0.8261
0.3927        | 25.02 | 20202 | 0.8669          | 0.8261
0.3564        | 26.02 | 20979 | 0.7220          | 0.8696
0.0017        | 27.02 | 21756 | 1.2441          | 0.7826
1.1658        | 28.02 | 22533 | 0.6606          | 0.8696
0.6756        | 29.02 | 23310 | 1.6571          | 0.7174
0.0165        | 30.02 | 24087 | 0.7047          | 0.8696
1.7687        | 31.02 | 24864 | 0.6157          | 0.8696
0.0036        | 32.02 | 25641 | 1.0989          | 0.8478
0.6257        | 33.02 | 26418 | 0.9701          | 0.8261
0.8758        | 34.02 | 27195 | 0.7716          | 0.8696
0.5316        | 35.02 | 27972 | 0.9176          | 0.8478
0.0003        | 36.02 | 28749 | 1.0744          | 0.8478
0.0005        | 37.02 | 29526 | 0.9155          | 0.8696
0.2137        | 38.02 | 30303 | 0.5414          | 0.9130
0.0001        | 39.02 | 31080 | 0.9475          | 0.8261
0.0022        | 40.02 | 31857 | 1.0578          | 0.8478
0.0002        | 41.02 | 32634 | 0.9568          | 0.8696
0.0004        | 42.02 | 33411 | 0.9224          | 0.8696
0.0           | 43.02 | 34188 | 0.8144          | 0.8913
0.0001        | 44.02 | 34965 | 0.9011          | 0.8913
0.6661        | 45.02 | 35742 | 1.0514          | 0.8478
0.0001        | 46.02 | 36519 | 0.8273          | 0.8913
0.0           | 47.02 | 37296 | 0.7097          | 0.8913
0.6658        | 48.02 | 38073 | 0.8315          | 0.8913
0.0           | 49.02 | 38850 | 0.8576          | 0.8913

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0