MAE-CT-M1N0-M12_v8_split3_v3

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0548
  • Accuracy: 0.8462
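The checkpoint is a VideoMAE video classifier, which consumes fixed-length clips of 16 frames. Below is a hedged usage sketch: the uniform frame-sampling helper is illustrative (not part of any library), and the model call assumes this card's repo id is the published Hub checkpoint and that `transformers` is installed.

```python
def sample_frame_indices(num_frames: int, clip_len: int = 16) -> list[int]:
    """Uniformly pick `clip_len` frame indices covering a `num_frames`-long video."""
    step = num_frames / clip_len
    return [min(int(i * step), num_frames - 1) for i in range(clip_len)]


def classify_clip(frames, checkpoint="beingbatman/MAE-CT-M1N0-M12_v8_split3_v3"):
    """Sketch: run the fine-tuned classifier on a list of HxWx3 uint8 frames.

    Assumes the checkpoint id above; label names come from the checkpoint's config.
    """
    from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

    processor = VideoMAEImageProcessor.from_pretrained(checkpoint)
    model = VideoMAEForVideoClassification.from_pretrained(checkpoint)
    # Subsample the decoded video down to the 16 frames VideoMAE expects.
    clip = [frames[i] for i in sample_frame_indices(len(frames))]
    inputs = processor(clip, return_tensors="pt")
    logits = model(**inputs).logits
    return model.config.id2label[int(logits.argmax(-1))]
```

`classify_clip` takes any decoded video (e.g. from `decord` or `av`) as a list of frames; the sampler keeps the clip's temporal coverage while matching the model's fixed input length.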

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 10350
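With `lr_scheduler_type: linear` and a warmup ratio of 0.1 over 10350 steps, the learning rate ramps from 0 to 1e-05 over the first 1035 steps, then decays linearly back to 0. A minimal sketch of that schedule (the function name is illustrative, not a library API):

```python
def linear_schedule_lr(step, peak_lr=1e-5, total_steps=10350, warmup_ratio=0.1):
    """Learning rate at `step` under linear warmup followed by linear decay.

    Mirrors the hyperparameters above; matches the shape of Transformers'
    linear schedule with warmup.
    """
    warmup_steps = int(total_steps * warmup_ratio)  # 1035 steps of warmup
    if step < warmup_steps:
        # Ramp linearly from 0 up to the peak learning rate.
        return peak_lr * step / warmup_steps
    # Decay linearly from the peak back to 0 at the final step.
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

At step 1035 the schedule reaches its peak of 1e-05 and the decay phase begins.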

Training results

Training Loss Epoch Step Validation Loss Accuracy
0.6676 0.0068 70 0.6765 0.6538
0.6735 1.0068 140 0.6379 0.6667
0.6391 2.0068 210 0.6261 0.6667
0.8085 3.0068 280 0.6050 0.6667
0.4774 4.0068 350 0.5725 0.6538
0.5668 5.0068 420 0.5514 0.7179
0.6607 6.0068 490 0.4834 0.7564
1.0769 7.0068 560 0.7450 0.6667
0.5425 8.0068 630 0.4623 0.7949
0.2959 9.0068 700 0.5113 0.7564
0.7674 10.0068 770 0.5330 0.7308
0.3015 11.0068 840 0.6790 0.7436
0.6252 12.0068 910 1.6537 0.6667
0.4554 13.0068 980 0.8944 0.7564
0.364 14.0068 1050 0.8103 0.7179
0.444 15.0068 1120 0.7103 0.7821
0.104 16.0068 1190 0.9000 0.7179
0.5647 17.0068 1260 1.4782 0.7051
0.783 18.0068 1330 0.8539 0.8205
0.5938 19.0068 1400 0.8426 0.7564
0.5341 20.0068 1470 0.9862 0.7949
0.3391 21.0068 1540 1.1159 0.7692
0.2071 22.0068 1610 1.5833 0.7692
0.1159 23.0068 1680 1.0205 0.8205
0.1579 24.0068 1750 1.4633 0.7692
0.1042 25.0068 1820 1.5864 0.7564
0.1466 26.0068 1890 1.2990 0.7692
0.0006 27.0068 1960 1.5597 0.7564
0.001 28.0068 2030 1.6434 0.7564
0.0096 29.0068 2100 1.4161 0.7821
0.0015 30.0068 2170 1.3385 0.7949
0.3627 31.0068 2240 1.7864 0.7436
0.1541 32.0068 2310 1.5618 0.7821
0.1285 33.0068 2380 1.3062 0.7949
0.2193 34.0068 2450 1.5554 0.7821
0.0002 35.0068 2520 1.5445 0.7949
0.0003 36.0068 2590 1.7189 0.7821
0.092 37.0068 2660 1.4980 0.7692
0.2403 38.0068 2730 2.0993 0.7436
0.0001 39.0068 2800 1.8646 0.7436
0.0001 40.0068 2870 1.6642 0.7436
0.2108 41.0068 2940 1.4566 0.7949
0.1085 42.0068 3010 1.2915 0.7821
0.0019 43.0068 3080 1.2691 0.8333
0.0072 44.0068 3150 2.2703 0.7179
0.0002 45.0068 3220 1.5151 0.7564
0.4057 46.0068 3290 1.4170 0.7564
0.0862 47.0068 3360 2.1072 0.7308
0.0478 48.0068 3430 1.6826 0.7949
0.001 49.0068 3500 1.4349 0.7564
0.0001 50.0068 3570 1.4423 0.7949
0.0056 51.0068 3640 2.2833 0.7051
0.0004 52.0068 3710 1.0548 0.8462
0.1768 53.0068 3780 1.2974 0.8077
0.0001 54.0068 3850 1.2797 0.8333
0.1027 55.0068 3920 1.5159 0.8077
0.1638 56.0068 3990 1.9403 0.7564
0.0001 57.0068 4060 1.5075 0.8077
0.0003 58.0068 4130 2.1291 0.7436
0.004 59.0068 4200 1.6104 0.7949
0.0214 60.0068 4270 1.7017 0.7949
0.0 61.0068 4340 1.6485 0.8205
0.0 62.0068 4410 1.6668 0.8077
0.1604 63.0068 4480 1.7437 0.7949
0.0002 64.0068 4550 1.6770 0.7821
0.0 65.0068 4620 1.7766 0.7821
0.0214 66.0068 4690 1.6351 0.7821
0.0 67.0068 4760 1.6878 0.7821
0.0 68.0068 4830 1.8764 0.7564
0.2082 69.0068 4900 1.7799 0.7564
0.0 70.0068 4970 1.7388 0.7949
0.0 71.0068 5040 1.6719 0.7949
0.0001 72.0068 5110 1.6066 0.7949
0.0001 73.0068 5180 2.1181 0.7564
0.0 74.0068 5250 2.1773 0.7564
0.0 75.0068 5320 2.5632 0.7179
0.0 76.0068 5390 1.1549 0.8077
0.0006 77.0068 5460 2.2296 0.7436
0.0 78.0068 5530 2.5382 0.7308
0.0001 79.0068 5600 1.5726 0.7821
0.0002 80.0068 5670 1.5704 0.8077
0.0199 81.0068 5740 1.5503 0.8077
0.0001 82.0068 5810 1.3654 0.8077
0.0 83.0068 5880 1.4727 0.8077
0.0001 84.0068 5950 2.2248 0.7179
0.0 85.0068 6020 1.7188 0.7821
0.0101 86.0068 6090 1.9271 0.7436
0.0001 87.0068 6160 1.5356 0.8333
0.0 88.0068 6230 1.6683 0.8077
0.0 89.0068 6300 2.5566 0.7308
0.0 90.0068 6370 1.7776 0.8077
0.0028 91.0068 6440 1.7494 0.7949
0.0 92.0068 6510 1.6141 0.8077
0.0 93.0068 6580 2.8243 0.7179
0.0 94.0068 6650 1.9026 0.7051
0.0002 95.0068 6720 2.2368 0.6667
0.0998 96.0068 6790 2.4985 0.7179
0.0 97.0068 6860 2.0686 0.7436
0.0074 98.0068 6930 2.7098 0.7051
0.0 99.0068 7000 2.2765 0.7308
0.0 100.0068 7070 2.2793 0.7308
0.0 101.0068 7140 2.2027 0.7308
0.0 102.0068 7210 2.2387 0.7308
0.0 103.0068 7280 2.1971 0.7308
0.0 104.0068 7350 2.3246 0.7179
0.0 105.0068 7420 1.5935 0.8077
0.0 106.0068 7490 1.4796 0.8077
0.0001 107.0068 7560 1.7052 0.7821
0.0 108.0068 7630 1.6022 0.8077
0.0002 109.0068 7700 1.6749 0.8077
0.0 110.0068 7770 1.7948 0.7436
0.0 111.0068 7840 1.8455 0.7949
0.0 112.0068 7910 1.8600 0.7949
0.0 113.0068 7980 1.8183 0.7949
0.0 114.0068 8050 1.7862 0.7949
0.0 115.0068 8120 1.8597 0.7821
0.0 116.0068 8190 1.8203 0.7821
0.0 117.0068 8260 1.8343 0.7949
0.0 118.0068 8330 1.8417 0.7949
0.0 119.0068 8400 1.7663 0.7821
0.0 120.0068 8470 1.9611 0.7821
0.0 121.0068 8540 1.9584 0.7949
0.0 122.0068 8610 1.5671 0.8205
0.0 123.0068 8680 2.3456 0.7564
0.0 124.0068 8750 2.3453 0.7564
0.0 125.0068 8820 2.4120 0.7436
0.0 126.0068 8890 2.3774 0.7436
0.0 127.0068 8960 2.3609 0.7436
0.0 128.0068 9030 2.3531 0.7564
0.0 129.0068 9100 1.9910 0.7821
0.0 130.0068 9170 2.0032 0.7821
0.0 131.0068 9240 2.0645 0.7692
0.0 132.0068 9310 2.0598 0.7692
0.0 133.0068 9380 2.0594 0.7692
0.0 134.0068 9450 2.0568 0.7692
0.0 135.0068 9520 2.0522 0.7821
0.0 136.0068 9590 1.9971 0.7564
0.0 137.0068 9660 1.9977 0.7564
0.0 138.0068 9730 2.0896 0.7692
0.0 139.0068 9800 2.1550 0.7692
0.0 140.0068 9870 2.1875 0.7692
0.0 141.0068 9940 2.1874 0.7692
0.0 142.0068 10010 2.1822 0.7692
0.0 143.0068 10080 2.1818 0.7692
0.0 144.0068 10150 2.1806 0.7692
0.0 145.0068 10220 2.1803 0.7692
0.0 146.0068 10290 2.1804 0.7692
0.0 147.0058 10350 2.1803 0.7692
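The headline result (loss 1.0548, accuracy 0.8462) corresponds to the step-3710 evaluation rather than the final step, which suggests (an assumption, not stated in the card) that the best checkpoint was selected by validation accuracy. Selecting that row programmatically can be sketched with a small excerpt of the log above:

```python
# Excerpt of the evaluation log above: (step, validation_loss, accuracy).
rows = [
    (630, 0.4623, 0.7949),
    (1680, 1.0205, 0.8205),
    (3080, 1.2691, 0.8333),
    (3710, 1.0548, 0.8462),
    (10350, 2.1803, 0.7692),
]

# Best checkpoint by validation accuracy, ties broken by lower loss.
best = max(rows, key=lambda r: (r[2], -r[1]))
print(best)  # (3710, 1.0548, 0.8462)
```

Note that selecting by lowest validation loss instead would pick step 630 (loss 0.4623), so the choice of `metric_for_best_model` matters here.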

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model size

  • 304M parameters (Safetensors, F32)
