# whisper-small-si
This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8059
- Wer Ortho: 79.8913
- Wer: 68.1928
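No evaluation script is included in this card. The two figures are most likely an orthographic WER (raw transcriptions) and a WER computed after text normalization. Below is a minimal sketch of how both could be reproduced, assuming the `evaluate` library and the Whisper `BasicTextNormalizer`, and assuming (as the magnitudes suggest) that the scores are reported as percentages; the prediction/reference lists are placeholders.

```python
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

# Placeholder inputs; the card does not publish predictions or references.
predictions = ["model transcription ..."]
references = ["ground-truth transcription ..."]

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()  # language-agnostic lowercasing / punctuation stripping

# Orthographic WER: hypotheses and references compared as-is.
wer_ortho = 100 * wer_metric.compute(predictions=predictions, references=references)

# Normalized WER: both sides passed through the normalizer before scoring.
wer = 100 * wer_metric.compute(
    predictions=[normalizer(p) for p in predictions],
    references=[normalizer(r) for r in references],
)

print(f"Wer Ortho: {wer_ortho:.4f}, Wer: {wer:.4f}")
```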
## Model description
More information needed
## Intended uses & limitations
More information needed
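As a hedged sketch only: the checkpoint can presumably be loaded for inference through the `pipeline` API, assuming it is hosted under the `IshanSuga/whisper-small-si` repository id and that the `-si` suffix denotes Sinhala speech.

```python
from transformers import pipeline

# Assumed repository id; adjust if the checkpoint is hosted elsewhere.
asr = pipeline("automatic-speech-recognition", model="IshanSuga/whisper-small-si")

# "sample.wav" is a placeholder path to a local audio file.
print(asr("sample.wav")["text"])
```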
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 100
- mixed_precision_training: Native AMP
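A minimal sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments`; `output_dir`, the evaluation strategy, and `predict_with_generate` are assumptions not stated in the list above, while `eval_steps=2` is inferred from the results table below.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-si",        # assumption: output directory name
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",                  # AdamW (torch) with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=50,
    max_steps=100,
    fp16=True,                            # native AMP mixed precision
    eval_strategy="steps",                # assumption: periodic evaluation during training
    eval_steps=2,                         # evaluation every 2 steps, matching the results table
    predict_with_generate=True,           # assumption: generate transcriptions so WER can be computed
)
```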
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
---|---|---|---|---|---|
No log | 0.4 | 2 | 2.3675 | 183.6957 | 250.3614 |
No log | 0.8 | 4 | 2.3670 | 183.6957 | 250.3614 |
No log | 1.2 | 6 | 2.3486 | 173.3696 | 254.6988 |
No log | 1.6 | 8 | 2.2915 | 184.2391 | 254.2169 |
No log | 2.0 | 10 | 2.1795 | 179.8913 | 255.1807 |
No log | 2.4 | 12 | 2.0471 | 203.2609 | 239.5181 |
No log | 2.8 | 14 | 1.9688 | 225.0 | 236.8675 |
No log | 3.2 | 16 | 1.8879 | 228.8043 | 220.7229 |
No log | 3.6 | 18 | 1.8019 | 259.7826 | 249.1566 |
No log | 4.0 | 20 | 1.7219 | 242.3913 | 239.0361 |
No log | 4.4 | 22 | 1.6463 | 178.2609 | 230.8434 |
No log | 4.8 | 24 | 1.5752 | 156.5217 | 229.3976 |
2.0076 | 5.2 | 26 | 1.5220 | 163.5870 | 232.5301 |
2.0076 | 5.6 | 28 | 1.4893 | 173.9130 | 241.4458 |
2.0076 | 6.0 | 30 | 1.4547 | 188.0435 | 234.2169 |
2.0076 | 6.4 | 32 | 1.4254 | 132.0652 | 175.1807 |
2.0076 | 6.8 | 34 | 1.3839 | 138.5870 | 156.8675 |
2.0076 | 7.2 | 36 | 1.3475 | 117.3913 | 128.6747 |
2.0076 | 7.6 | 38 | 1.3150 | 132.0652 | 161.6867 |
2.0076 | 8.0 | 40 | 1.2866 | 132.0652 | 168.4337 |
2.0076 | 8.4 | 42 | 1.2552 | 119.5652 | 127.4699 |
2.0076 | 8.8 | 44 | 1.2086 | 107.0652 | 147.2289 |
2.0076 | 9.2 | 46 | 1.1754 | 98.3696 | 112.7711 |
2.0076 | 9.6 | 48 | 1.1426 | 92.3913 | 102.1687 |
1.1338 | 10.0 | 50 | 1.1153 | 88.5870 | 102.6506 |
1.1338 | 10.4 | 52 | 1.0775 | 85.3261 | 79.7590 |
1.1338 | 10.8 | 54 | 1.0330 | 84.7826 | 96.3855 |
1.1338 | 11.2 | 56 | 1.0069 | 93.4783 | 101.2048 |
1.1338 | 11.6 | 58 | 0.9822 | 93.4783 | 75.1807 |
1.1338 | 12.0 | 60 | 0.9659 | 82.0652 | 81.6867 |
1.1338 | 12.4 | 62 | 0.9490 | 84.7826 | 87.7108 |
1.1338 | 12.8 | 64 | 0.9142 | 78.8043 | 73.4940 |
1.1338 | 13.2 | 66 | 0.8838 | 88.0435 | 72.0482 |
1.1338 | 13.6 | 68 | 0.8767 | 80.9783 | 66.2651 |
1.1338 | 14.0 | 70 | 0.8648 | 79.3478 | 66.5060 |
1.1338 | 14.4 | 72 | 0.8439 | 80.4348 | 66.2651 |
1.1338 | 14.8 | 74 | 0.8417 | 78.8043 | 64.8193 |
0.4647 | 15.2 | 76 | 0.8321 | 78.8043 | 71.8072 |
0.4647 | 15.6 | 78 | 0.8000 | 73.3696 | 59.0361 |
0.4647 | 16.0 | 80 | 0.7904 | 73.3696 | 57.1084 |
0.4647 | 16.4 | 82 | 0.8084 | 76.0870 | 69.3976 |
0.4647 | 16.8 | 84 | 0.7965 | 77.1739 | 60.9639 |
0.4647 | 17.2 | 86 | 0.8039 | 80.4348 | 70.8434 |
0.4647 | 17.6 | 88 | 0.7852 | 76.6304 | 65.3012 |
0.4647 | 18.0 | 90 | 0.7896 | 78.8043 | 64.8193 |
0.4647 | 18.4 | 92 | 0.7829 | 110.8696 | 76.1446 |
0.4647 | 18.8 | 94 | 0.7860 | 76.6304 | 60.7229 |
0.4647 | 19.2 | 96 | 0.8047 | 74.4565 | 58.7952 |
0.4647 | 19.6 | 98 | 0.7939 | 71.7391 | 57.3494 |
0.1678 | 20.0 | 100 | 0.8059 | 79.8913 | 68.1928 |
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0