---
library_name: transformers
language:
  - sn
license: cc-by-nc-4.0
base_model: facebook/mms-300m
tags:
  - generated_from_trainer
datasets:
  - DigitalUmuganda_Afrivoice/Shona
metrics:
  - wer
model-index:
  - name: facebook/mms-300m
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: DigitalUmuganda
          type: DigitalUmuganda_Afrivoice/Shona
        metrics:
          - name: Wer
            type: wer
            value: 0.9947519315807376
---

facebook/mms-300m

This model is a fine-tuned version of facebook/mms-300m on the DigitalUmuganda_Afrivoice/Shona dataset. It achieves the following results on the evaluation set:

  • Loss: 2.0090
  • WER: 0.9948
  • CER: 0.4189
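
For reference, a minimal inference sketch is shown below. MMS fine-tunes are wav2vec2-style CTC models, so the standard transformers CTC workflow applies; the repo ID and audio file name are placeholders, not values taken from this card.

```python
# A minimal inference sketch. The repo ID below is a placeholder for wherever
# this fine-tuned checkpoint is published; substitute the actual repo ID.
import torch
import librosa
from transformers import AutoProcessor, AutoModelForCTC

model_id = "Beijuka/mms-300m-shona"  # hypothetical repo ID
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# MMS/wav2vec2-style models expect 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary per frame; the tokenizer
# collapses repeated tokens and blanks during decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```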

Model description

facebook/mms-300m is the 300M-parameter pretrained checkpoint from Meta's Massively Multilingual Speech (MMS) project, built on the wav2vec 2.0 architecture. This model fine-tunes that checkpoint with a CTC head for Shona (sn) automatic speech recognition.

Intended uses & limitations

The model is intended for automatic speech recognition of Shona. With a final evaluation WER of 0.9948, it is not yet accurate enough for practical transcription and is best treated as a training artifact or a starting point for further fine-tuning. It is released under the CC-BY-NC-4.0 license, which excludes commercial use.

Training and evaluation data

The model was fine-tuned and evaluated on the Shona subset of Digital Umuganda's Afrivoice corpus (DigitalUmuganda_Afrivoice/Shona in the metadata above). Details of the train/evaluation splits and preprocessing are not provided.
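
If the corpus is hosted on the Hugging Face Hub under the ID given in the metadata, it could be inspected as follows; whether that ID resolves, and the split and column names, are assumptions.

```python
# A minimal sketch for inspecting the corpus, assuming it is hosted on the
# Hub under the dataset ID from this card's metadata.
from datasets import load_dataset

dataset = load_dataset("DigitalUmuganda_Afrivoice/Shona")
print(dataset)               # shows the available splits and their columns
print(dataset["train"][0])   # assumes a "train" split; adjust as needed
```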

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
  • mixed_precision_training: Native AMP
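
As a hedged illustration, the list above maps onto transformers TrainingArguments roughly as follows; the output directory and any setting not listed above are assumptions, not values taken from this card.

```python
# A sketch of TrainingArguments matching the hyperparameters listed above.
# Only the values from the list are taken from this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-300m-shona",    # assumed directory name
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # 8 per device * 4 steps = 32 effective
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed precision
)
```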

Training results

Training Loss  Epoch  Step  Validation Loss  WER  CER
68.8683 0.8696 5 16.0023 1.0 0.9538
67.5244 1.8261 10 15.5550 1.0 0.9430
61.8287 2.7826 15 12.7110 1.0 1.0
38.8336 3.9130 21 7.7816 1.0 1.0
28.1041 4.8696 26 5.3122 1.0 1.0
20.7205 5.8261 31 4.4709 1.0 1.0
17.9969 6.7826 36 4.1134 1.0 1.0
13.9168 7.9130 42 3.7606 1.0 1.0
15.5405 8.8696 47 3.6320 1.0 1.0
14.8928 9.8261 52 3.4072 1.0 1.0
13.9935 10.7826 57 3.2393 1.0 1.0
11.1442 11.9130 63 3.1078 1.0 1.0
12.9364 12.8696 68 3.0432 1.0 1.0
12.7554 13.8261 73 2.9975 1.0 1.0
12.5164 14.7826 78 2.9721 1.0 1.0
10.3626 15.9130 84 2.9607 1.0 1.0
12.3881 16.8696 89 2.9378 1.0 1.0
13.1535 17.8261 94 2.9394 1.0 1.0
12.2578 18.7826 99 2.9396 1.0 1.0
10.2163 19.9130 105 2.9266 1.0 1.0
12.2213 20.8696 110 2.9475 1.0 1.0
12.2561 21.8261 115 2.9099 1.0 1.0
12.2262 22.7826 120 2.9104 1.0 1.0
10.1776 23.9130 126 2.9074 1.0 1.0
12.157 24.8696 131 2.8994 1.0 1.0
12.1486 25.8261 136 2.9076 1.0 1.0
12.156 26.7826 141 2.9197 1.0 1.0
10.1287 27.9130 147 2.8963 1.0 1.0
12.1832 28.8696 152 2.9037 1.0 1.0
12.1224 29.8261 157 2.8914 1.0 1.0
12.094 30.7826 162 2.8870 1.0 1.0
10.0635 31.9130 168 2.9047 1.0 1.0
12.1146 32.8696 173 2.8729 1.0 1.0
12.0052 33.8261 178 2.8594 1.0 1.0
12.0124 34.7826 183 2.8509 1.0 1.0
10.0458 35.9130 189 2.9509 1.0 1.0
12.1021 36.8696 194 2.8455 1.0 1.0
11.8997 37.8261 199 2.8262 1.0 1.0
11.8208 38.7826 204 2.8231 1.0 1.0
9.8093 39.9130 210 2.8053 1.0 1.0
11.724 40.8696 215 2.7914 1.0 1.0
11.6571 41.8261 220 2.7760 1.0 1.0
11.5854 42.7826 225 2.7794 1.0 1.0
9.59 43.9130 231 2.7130 1.0 1.0
11.3268 44.8696 236 2.6681 1.0 0.9903
11.1653 45.8261 241 2.6518 1.0 0.9273
10.9978 46.7826 246 2.5816 1.0 0.9112
8.9489 47.9130 252 2.5036 1.0 0.9026
10.4282 48.8696 257 2.4504 1.0 0.8652
10.2606 49.8261 262 2.3633 1.0 0.8550
9.8352 50.7826 267 2.2941 1.0 0.8328
7.8901 51.9130 273 2.1868 1.0 0.8017
9.1182 52.8696 278 2.1071 1.0 0.7764
8.7192 53.8261 283 2.0319 1.0 0.6983
8.287 54.7826 288 1.9401 1.0 0.6283
6.5955 55.9130 294 1.8406 1.0 0.6137
7.5011 56.8696 299 1.7786 1.0 0.5562
7.1636 57.8261 304 1.7299 1.0 0.5691
6.8314 58.7826 309 1.6673 1.0 0.5061
5.4578 59.9130 315 1.6018 1.0 0.4657
6.2479 60.8696 320 1.5645 1.0 0.4674
5.8711 61.8261 325 1.5283 1.0 0.4500
5.6188 62.7826 330 1.4788 1.0 0.4256
4.4687 63.9130 336 1.4583 1.0 0.4233
5.0484 64.8696 341 1.4361 1.0 0.4096
4.8114 65.8261 346 1.4144 1.0 0.4062
4.4987 66.7826 351 1.4102 1.0 0.4007
3.5595 67.9130 357 1.4017 0.9998 0.3897
4.0309 68.8696 362 1.4033 0.9990 0.3850
3.8135 69.8261 367 1.3971 0.9981 0.3836
3.5936 70.7826 372 1.4172 0.9946 0.3778
2.8143 71.9130 378 1.4172 0.9803 0.3630
3.2126 72.8696 383 1.4275 0.9895 0.3667
3.0545 73.8261 388 1.4452 0.9735 0.3605
2.8559 74.7826 393 1.4479 0.9796 0.3606
2.2344 75.9130 399 1.4668 0.9679 0.3530
2.5623 76.8696 404 1.4827 0.9725 0.3532
2.4213 77.8261 409 1.5082 0.9803 0.3557
2.3092 78.7826 414 1.5162 0.9788 0.3519
1.8793 79.9130 420 1.5187 0.9672 0.3479
2.1443 80.8696 425 1.5260 0.9708 0.3518
2.0246 81.8261 430 1.5579 1.0002 0.3492
1.9691 82.7826 435 1.5867 1.0066 0.3532
1.6321 83.9130 441 1.5419 0.9769 0.3463
1.9106 84.8696 446 1.5800 0.9757 0.3477
1.8555 85.8261 451 1.5746 0.9669 0.3447
1.8178 86.7826 456 1.6088 0.9745 0.3452
1.4789 87.9130 462 1.5913 0.9803 0.3466
1.7375 88.8696 467 1.5916 0.9740 0.3454
1.6749 89.8261 472 1.6099 0.9647 0.3469
1.7118 90.7826 477 1.6322 1.0005 0.3522
1.3896 91.9130 483 1.6341 0.9715 0.3473
1.5778 92.8696 488 1.6404 0.9861 0.3469
1.607 93.8261 493 1.6349 0.9747 0.3475
1.6138 94.7826 498 1.6354 0.9740 0.3470
0.9601 95.2174 500 1.6371 0.9747 0.3467
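
The WER and CER columns are word- and character-level edit-distance rates: WER = (substitutions + deletions + insertions) / reference word count, and CER is the same ratio over characters. Values slightly above 1.0, as in a few rows, occur when insertions push the edit count past the reference length. A minimal sketch of computing both with the Hugging Face evaluate library, using placeholder strings rather than real model output:

```python
# Sketch: computing WER and CER with the `evaluate` library.
# The prediction/reference strings are placeholders, not model outputs.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["mhoro nyika"]       # hypothetical model transcript
references = ["mhoroi nyika yese"]  # hypothetical ground truth

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```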

Framework versions

  • Transformers 4.47.0.dev0
  • Pytorch 2.1.0+cu118
  • Datasets 3.0.2
  • Tokenizers 0.20.1