# ViViT_WLASL_200_epochs_p20
This model is a fine-tuned version of google/vivit-b-16x2-kinetics400 on a dataset not identified in this card; the model name suggests WLASL, a word-level American Sign Language video dataset. It achieves the following results on the evaluation set (a loading/inference sketch follows the list):
- Loss: 4.4366
- Top 1 Accuracy: 0.2725
- Top 5 Accuracy: 0.5651
- Top 10 Accuracy: 0.6599
- Accuracy: 0.2727
- Precision: 0.2554
- Recall: 0.2727
- F1: 0.2419
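A minimal inference sketch using the Transformers video-classification API is shown below. The checkpoint id `Shawon16/ViViT_WLASL_200_epochs_p20` is taken from this repo; the random clip is a stand-in for a real 32-frame sign-language video, and the sketch assumes the repo ships a preprocessor config (otherwise load the processor from the base google/vivit-b-16x2-kinetics400 checkpoint).

```python
import numpy as np
import torch
from transformers import VivitForVideoClassification, VivitImageProcessor

ckpt = "Shawon16/ViViT_WLASL_200_epochs_p20"
processor = VivitImageProcessor.from_pretrained(ckpt)
model = VivitForVideoClassification.from_pretrained(ckpt)
model.eval()

# ViViT-B/16x2 consumes 32 RGB frames; replace this random clip with
# frames sampled from an actual video.
video = list(np.random.randint(0, 256, (32, 224, 224, 3), dtype=np.uint8))
inputs = processor(video, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Report the five highest-scoring labels, matching the Top-5 metric above.
top5 = logits.topk(5, dim=-1).indices[0].tolist()
print([model.config.id2label[i] for i in top5])
```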
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the TrainingArguments sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 357200
- mixed_precision_training: Native AMP
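The list above maps onto Transformers `TrainingArguments` roughly as follows. This is a reconstruction from the reported values, not the original training script; `output_dir` and anything not in the list are placeholders.

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters; eval/save cadence and
# logging settings are not recorded in the card and are omitted here.
args = TrainingArguments(
    output_dir="ViViT_WLASL_200_epochs_p20",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 2 * 4 = 8
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=357200,
    fp16=True,  # "Native AMP" mixed-precision training
)
```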
### Training results
| Training Loss | Epoch | Step | Validation Loss | Top 1 Accuracy | Top 5 Accuracy | Top 10 Accuracy | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|---|---|---|
30.7847 | 0.005 | 1786 | 7.6710 | 0.0003 | 0.0013 | 0.0043 | 0.0003 | 0.0000 | 0.0003 | 0.0000 |
30.4187 | 1.0050 | 3572 | 7.6049 | 0.0020 | 0.0056 | 0.0115 | 0.0020 | 0.0003 | 0.0020 | 0.0004 |
29.7427 | 2.0050 | 5358 | 7.4861 | 0.0036 | 0.0161 | 0.0271 | 0.0036 | 0.0009 | 0.0036 | 0.0008 |
28.4534 | 3.0050 | 7145 | 7.2492 | 0.0115 | 0.0375 | 0.0605 | 0.0115 | 0.0049 | 0.0115 | 0.0044 |
26.8561 | 4.005 | 8931 | 6.9297 | 0.0317 | 0.0766 | 0.1208 | 0.0319 | 0.0115 | 0.0319 | 0.0125 |
24.997 | 5.0050 | 10717 | 6.5686 | 0.0460 | 0.1292 | 0.1946 | 0.0460 | 0.0212 | 0.0460 | 0.0210 |
22.8381 | 6.0050 | 12503 | 6.2049 | 0.0746 | 0.1859 | 0.2605 | 0.0748 | 0.0386 | 0.0748 | 0.0382 |
20.56 | 7.0050 | 14290 | 5.8050 | 0.1021 | 0.2554 | 0.3463 | 0.1021 | 0.0591 | 0.1021 | 0.0594 |
18.323 | 8.005 | 16076 | 5.4230 | 0.1318 | 0.3113 | 0.4101 | 0.1325 | 0.0720 | 0.1325 | 0.0773 |
15.752 | 9.0050 | 17862 | 5.0506 | 0.1622 | 0.3598 | 0.4847 | 0.1619 | 0.0954 | 0.1619 | 0.1013 |
13.1613 | 10.0050 | 19648 | 4.6589 | 0.1933 | 0.4321 | 0.5564 | 0.1941 | 0.1294 | 0.1941 | 0.1352 |
10.1883 | 11.0050 | 21435 | 4.3067 | 0.2263 | 0.4946 | 0.6139 | 0.2263 | 0.1633 | 0.2263 | 0.1684 |
8.1852 | 12.005 | 23221 | 4.0124 | 0.2474 | 0.5314 | 0.6527 | 0.2472 | 0.1886 | 0.2472 | 0.1927 |
5.88 | 13.0050 | 25007 | 3.7567 | 0.2651 | 0.5720 | 0.6885 | 0.2651 | 0.2188 | 0.2651 | 0.2186 |
4.5173 | 14.0050 | 26793 | 3.6064 | 0.2763 | 0.5927 | 0.7125 | 0.2763 | 0.2440 | 0.2763 | 0.2373 |
2.851 | 15.0050 | 28580 | 3.5264 | 0.2776 | 0.6083 | 0.7196 | 0.2778 | 0.2534 | 0.2778 | 0.2434 |
2.2493 | 16.005 | 30366 | 3.5001 | 0.2725 | 0.6085 | 0.7183 | 0.2725 | 0.2471 | 0.2725 | 0.2383 |
1.9096 | 17.0050 | 32152 | 3.4933 | 0.2809 | 0.6014 | 0.7127 | 0.2814 | 0.2487 | 0.2814 | 0.2432 |
1.7109 | 18.0050 | 33938 | 3.5175 | 0.2776 | 0.5919 | 0.7068 | 0.2778 | 0.2517 | 0.2778 | 0.2418 |
1.4479 | 19.0050 | 35725 | 3.5182 | 0.2835 | 0.5986 | 0.7153 | 0.2835 | 0.2580 | 0.2835 | 0.2478 |
1.2076 | 20.005 | 37511 | 3.5383 | 0.2842 | 0.5981 | 0.7071 | 0.2842 | 0.2600 | 0.2842 | 0.2493 |
1.4093 | 21.0050 | 39297 | 3.6272 | 0.2699 | 0.5899 | 0.6918 | 0.2702 | 0.2458 | 0.2702 | 0.2359 |
1.3122 | 22.0050 | 41083 | 3.6791 | 0.2725 | 0.5822 | 0.6951 | 0.2722 | 0.2420 | 0.2722 | 0.2364 |
1.4774 | 23.0050 | 42870 | 3.7163 | 0.2758 | 0.5812 | 0.6941 | 0.2760 | 0.2548 | 0.2760 | 0.2417 |
1.1338 | 24.005 | 44656 | 3.7577 | 0.2676 | 0.5817 | 0.6897 | 0.2674 | 0.2471 | 0.2674 | 0.2346 |
1.1446 | 25.0050 | 46442 | 3.7662 | 0.2745 | 0.5958 | 0.7074 | 0.2745 | 0.2533 | 0.2745 | 0.2430 |
1.0876 | 26.0050 | 48228 | 3.8689 | 0.2789 | 0.5781 | 0.6851 | 0.2786 | 0.2525 | 0.2786 | 0.2435 |
1.0755 | 27.0050 | 50015 | 3.8659 | 0.2773 | 0.5855 | 0.7010 | 0.2773 | 0.2626 | 0.2773 | 0.2465 |
1.1092 | 28.005 | 51801 | 3.9151 | 0.2786 | 0.5891 | 0.6966 | 0.2789 | 0.2540 | 0.2789 | 0.2448 |
0.9877 | 29.0050 | 53587 | 4.0145 | 0.2832 | 0.5853 | 0.6902 | 0.2832 | 0.2596 | 0.2832 | 0.2480 |
0.946 | 30.0050 | 55373 | 3.9974 | 0.2763 | 0.5884 | 0.6915 | 0.2771 | 0.2626 | 0.2771 | 0.2478 |
1.1572 | 31.0050 | 57160 | 4.0120 | 0.2868 | 0.5787 | 0.6849 | 0.2870 | 0.2664 | 0.2870 | 0.2545 |
1.0663 | 32.005 | 58946 | 4.1235 | 0.2763 | 0.5702 | 0.6734 | 0.2763 | 0.2552 | 0.2763 | 0.2446 |
1.085 | 33.0050 | 60732 | 4.1426 | 0.2715 | 0.5718 | 0.6782 | 0.2715 | 0.2528 | 0.2715 | 0.2403 |
1.3799 | 34.0050 | 62518 | 4.1017 | 0.2755 | 0.5743 | 0.6839 | 0.2755 | 0.2594 | 0.2755 | 0.2445 |
0.8419 | 35.0050 | 64305 | 4.1769 | 0.2794 | 0.5728 | 0.6836 | 0.2794 | 0.2620 | 0.2794 | 0.2487 |
1.1308 | 36.005 | 66091 | 4.1333 | 0.2796 | 0.5804 | 0.6890 | 0.2794 | 0.2569 | 0.2794 | 0.2468 |
0.9785 | 37.0050 | 67877 | 4.2391 | 0.2771 | 0.5592 | 0.6655 | 0.2771 | 0.2617 | 0.2771 | 0.2471 |
0.923 | 38.0050 | 69663 | 4.2274 | 0.2804 | 0.5769 | 0.6683 | 0.2804 | 0.2583 | 0.2804 | 0.2484 |
0.9857 | 39.0050 | 71450 | 4.2033 | 0.2814 | 0.5830 | 0.6862 | 0.2819 | 0.2610 | 0.2819 | 0.2498 |
0.7679 | 40.005 | 73236 | 4.1983 | 0.2845 | 0.5861 | 0.6834 | 0.2845 | 0.2640 | 0.2845 | 0.2518 |
0.8991 | 41.0050 | 75022 | 4.2099 | 0.2812 | 0.5873 | 0.6882 | 0.2812 | 0.2713 | 0.2812 | 0.2533 |
1.1176 | 42.0050 | 76808 | 4.3419 | 0.2768 | 0.5687 | 0.6734 | 0.2766 | 0.2596 | 0.2766 | 0.2455 |
1.2777 | 43.0050 | 78595 | 4.3104 | 0.2783 | 0.5774 | 0.6742 | 0.2781 | 0.2610 | 0.2781 | 0.2469 |
0.8072 | 44.005 | 80381 | 4.3708 | 0.2778 | 0.5638 | 0.6616 | 0.2781 | 0.2686 | 0.2781 | 0.2500 |
1.1258 | 45.0050 | 82167 | 4.3517 | 0.2799 | 0.5707 | 0.6731 | 0.2801 | 0.2688 | 0.2801 | 0.2510 |
0.9476 | 46.0050 | 83953 | 4.3712 | 0.2814 | 0.5684 | 0.6757 | 0.2812 | 0.2583 | 0.2812 | 0.2473 |
1.0757 | 47.0050 | 85740 | 4.4280 | 0.2679 | 0.5641 | 0.6668 | 0.2676 | 0.2519 | 0.2676 | 0.2372 |
0.7244 | 48.005 | 87526 | 4.4374 | 0.2676 | 0.5677 | 0.6685 | 0.2674 | 0.2529 | 0.2674 | 0.2388 |
0.9193 | 49.0050 | 89312 | 4.4014 | 0.2748 | 0.5636 | 0.6662 | 0.2748 | 0.2617 | 0.2748 | 0.2467 |
0.8372 | 50.0050 | 91098 | 4.4763 | 0.2735 | 0.5585 | 0.6545 | 0.2735 | 0.2555 | 0.2735 | 0.2429 |
0.7598 | 51.0050 | 92885 | 4.4366 | 0.2725 | 0.5651 | 0.6599 | 0.2727 | 0.2554 | 0.2727 | 0.2419 |
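Note that the Accuracy and Recall columns track each other exactly: weighted-average recall sums true positives over all classes and divides by the total sample count, which is precisely plain accuracy. A `compute_metrics` function that would produce these columns (a plausible sketch, not the author's verified script) could look like:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def top_k_accuracy(logits, labels, k):
    # Fraction of samples whose true label appears among the k highest logits.
    topk = np.argsort(logits, axis=-1)[:, -k:]
    return float(np.mean([label in row for label, row in zip(labels, topk)]))

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = logits.argmax(axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "top_1_accuracy": top_k_accuracy(logits, labels, 1),
        "top_5_accuracy": top_k_accuracy(logits, labels, 5),
        "top_10_accuracy": top_k_accuracy(logits, labels, 10),
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```

The small discrepancies between Top 1 Accuracy and Accuracy in some rows (e.g. 0.2725 vs 0.2727 in the final row) are consistent with ties among logits being broken differently by the two computations.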
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1