
asr-africa/mms-1B_all_DigitalUmuganda_Afrivoice_Shona_1hr_v1

This model is a fine-tuned version of facebook/mms-1b-all on the DigitalUmuganda Afrivoice Shona dataset (1-hour subset). It achieves the following results on the evaluation set:

  • Loss: 9.7260
  • Model Preparation Time: 0.0108
  • Wer: 1.0
  • Cer: 1.0263
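Wer and Cer are the word and character error rates; values at or above 1.0, as in this run, are possible because insertions can make the edit distance reach or exceed the reference length. A minimal sketch of how these metrics can be computed with the Hugging Face evaluate library (the transcripts below are placeholders, not data from this run):

```python
# pip install evaluate jiwer
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

# Placeholder transcripts -- substitute real model outputs and references.
predictions = ["mhoro nyika"]
references = ["mhoro nyika yose"]

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```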

Model description

A ~965M-parameter (float32) automatic speech recognition checkpoint for Shona, fine-tuned from facebook/mms-1b-all. More information needed.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 150
  • mixed_precision_training: Native AMP
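
A minimal sketch of how the settings above might map onto transformers.TrainingArguments. The output_dir is a hypothetical name, and anything not listed above (model setup, data collator, evaluation strategy) is omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-shona-1hr",  # assumed name
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,      # 8 * 4 = 32 total train batch size
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=150,
    fp16=True,                          # "Native AMP" mixed precision
)
```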

Training results

Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Wer | Cer
61.2558 0.8696 5 14.2292 0.0108 1.0 2.1395
60.5933 1.8261 10 14.0103 0.0108 1.0 2.0649
59.6126 2.7826 15 13.5539 0.0108 1.0 1.9202
46.7809 3.9130 21 12.4379 0.0108 1.0 1.6586
51.2006 4.8696 26 11.1727 0.0108 1.0 1.3341
45.5066 5.8261 31 9.7087 0.0108 0.9998 0.9584
38.5399 6.7826 36 8.2230 0.0108 1.0 0.8930
26.7322 7.9130 42 6.7125 0.0108 1.0 0.9583
27.0689 8.8696 47 5.8420 0.0108 1.0 0.9882
23.885 9.8261 52 5.3230 0.0108 1.0 0.9931
22.2475 10.7826 57 5.0217 0.0108 1.0 0.9941
17.7159 11.9130 63 4.8187 0.0108 1.0 0.9934
20.6729 12.8696 68 4.7208 0.0108 1.0 0.9943
20.2555 13.8261 73 4.6241 0.0108 1.0 0.9943
19.8974 14.7826 78 4.5231 0.0108 1.0 0.9915
16.2179 15.9130 84 4.3843 0.0108 1.0 0.9906
18.2735 16.8696 89 4.2386 0.0108 1.0 0.9658
18.3992 17.8261 94 4.0855 0.0108 1.0 0.9008
17.9017 18.7826 99 3.9638 0.0108 1.0012 0.8728
14.4746 19.9130 105 3.8507 0.0108 1.0 0.8665
17.0257 20.8696 110 3.7610 0.0108 1.0 0.8752
16.6174 21.8261 115 3.6873 0.0108 1.0058 0.8401
16.2618 22.7826 120 3.6239 0.0108 1.0 0.8587
13.3621 23.9130 126 3.5762 0.0108 1.0 0.8542
15.8358 24.8696 131 3.5434 0.0108 1.0119 0.8190
15.5278 25.8261 136 3.5110 0.0108 1.0010 0.8433
15.2012 26.7826 141 3.4711 0.0108 1.0032 0.8368
12.5298 27.9130 147 3.4341 0.0108 1.0066 0.8298
14.8525 28.8696 152 3.4105 0.0108 1.0085 0.8298
14.5351 29.8261 157 3.3788 0.0108 1.0268 0.8057
14.4403 30.7826 162 3.3549 0.0108 1.0039 0.8339
11.836 31.9130 168 3.3474 0.0108 1.0051 0.8313
14.2697 32.8696 173 3.2991 0.0108 1.0482 0.7892
13.8508 33.8261 178 3.3245 0.0108 1.0029 0.8278
13.6973 34.7826 183 3.2907 0.0108 1.0005 0.8051
11.2833 35.9130 189 3.3046 0.0108 1.0100 0.8228
13.3111 36.8696 194 3.2605 0.0108 1.0153 0.8002
13.2858 37.8261 199 3.2802 0.0108 1.0075 0.8115
13.3125 38.7826 204 3.2877 0.0108 1.0107 0.8076
10.811 39.9130 210 3.2335 0.0108 1.0083 0.7995
12.9063 40.8696 215 3.2435 0.0108 1.0071 0.8072
12.8105 41.8261 220 3.2516 0.0108 1.0054 0.8146
12.6894 42.7826 225 3.2634 0.0108 1.0199 0.7984
10.4284 43.9130 231 3.2434 0.0108 1.0095 0.7988
12.5526 44.8696 236 3.2996 0.0108 1.0039 0.8245
12.4532 45.8261 241 3.2234 0.0108 1.0231 0.7907
12.4143 46.7826 246 3.2599 0.0108 1.0029 0.8155
10.2517 47.9130 252 3.2585 0.0108 1.0063 0.8053
12.4357 48.8696 257 3.2407 0.0108 1.0236 0.7921
12.2622 49.8261 262 3.2640 0.0108 1.0039 0.8084
12.1975 50.7826 267 3.2645 0.0108 1.0117 0.8033
10.1193 51.9130 273 3.2395 0.0108 1.0285 0.7824
12.0931 52.8696 278 3.2823 0.0108 1.0034 0.8013
12.2157 53.8261 283 3.2564 0.0108 1.0124 0.7909
12.0174 54.7826 288 3.2492 0.0108 1.0294 0.7742
9.9986 55.9130 294 3.2344 0.0108 1.0032 0.7960
11.9855 56.8696 299 3.2709 0.0108 1.0214 0.7785
11.9514 57.8261 304 3.2556 0.0108 1.0095 0.7893
11.9152 58.7826 309 3.2716 0.0108 1.0090 0.7857
9.8905 59.9130 315 3.3192 0.0108 1.0165 0.7794
11.9491 60.8696 320 3.2668 0.0108 1.0212 0.7717
11.8561 61.8261 325 3.2821 0.0108 1.0151 0.7769
11.8206 62.7826 330 3.2512 0.0108 1.0370 0.7625
9.8339 63.9130 336 3.3038 0.0108 1.0114 0.7772
11.7774 64.8696 341 3.2527 0.0108 1.0365 0.7566
11.7058 65.8261 346 3.3124 0.0108 1.0190 0.7691
11.7123 66.7826 351 3.2726 0.0108 1.0246 0.7655
9.7587 67.9130 357 3.2848 0.0108 1.0190 0.7630
11.7171 68.8696 362 3.2727 0.0108 1.0187 0.7622
11.6455 69.8261 367 3.2705 0.0108 1.0148 0.7609
11.6714 70.7826 372 3.2773 0.0108 1.0190 0.7566
9.6664 71.9130 378 3.2537 0.0108 1.0314 0.7496
11.6148 72.8696 383 3.2945 0.0108 1.0151 0.7616
11.5807 73.8261 388 3.3068 0.0108 1.0202 0.7563
11.5939 74.7826 393 3.2805 0.0108 1.0258 0.7471
9.63 75.9130 399 3.3203 0.0108 1.0124 0.7556
11.5625 76.8696 404 3.2776 0.0108 1.0158 0.7530
11.5198 77.8261 409 3.3043 0.0108 1.0209 0.7513
11.5147 78.7826 414 3.3050 0.0108 1.0195 0.7519
9.58 79.9130 420 3.2798 0.0108 1.0207 0.7472
11.4839 80.8696 425 3.2974 0.0108 1.0268 0.7435
11.4907 81.8261 430 3.2751 0.0108 1.0243 0.7434
11.4485 82.7826 435 3.3051 0.0108 1.0097 0.7520
9.5367 83.9130 441 3.2787 0.0108 1.0248 0.7407
11.4168 84.8696 446 3.3047 0.0108 1.0158 0.7476
11.4169 85.8261 451 3.2901 0.0108 1.0250 0.7394
11.3964 86.7826 456 3.3415 0.0108 1.0168 0.7476
9.4842 87.9130 462 3.2788 0.0108 1.0270 0.7371
11.3778 88.8696 467 3.3244 0.0108 1.0243 0.7421
11.3442 89.8261 472 3.3127 0.0108 1.0340 0.7390
11.3428 90.7826 477 3.3120 0.0108 1.0209 0.7412
9.4492 91.9130 483 3.3177 0.0108 1.0195 0.7418
11.3562 92.8696 488 3.2997 0.0108 1.0224 0.7367
11.3161 93.8261 493 3.3567 0.0108 1.0180 0.7448
11.2836 94.7826 498 3.2956 0.0108 1.0263 0.7358
9.404 95.9130 504 3.3472 0.0108 1.0131 0.7449
11.2805 96.8696 509 3.3002 0.0108 1.0297 0.7322
11.2702 97.8261 514 3.3282 0.0108 1.0136 0.7417
11.2554 98.7826 519 3.3361 0.0108 1.0161 0.7388
9.3581 99.9130 525 3.3132 0.0108 1.0328 0.7329
11.2385 100.8696 530 3.3154 0.0108 1.0224 0.7349
11.2548 101.8261 535 3.3215 0.0108 1.0272 0.7323
11.2565 102.7826 540 3.3022 0.0108 1.0248 0.7312
9.3435 103.9130 546 3.3193 0.0108 1.0173 0.7353
11.2047 104.8696 551 3.3239 0.0108 1.0250 0.7335
11.2158 105.8261 556 3.3206 0.0108 1.0306 0.7310
11.1969 106.7826 561 3.3349 0.0108 1.0231 0.7335
9.2975 107.9130 567 3.3312 0.0108 1.0158 0.7346
11.1944 108.8696 572 3.3321 0.0108 1.0214 0.7330
11.1897 109.8261 577 3.3160 0.0108 1.0399 0.7269
11.1698 110.7826 582 3.3406 0.0108 1.0175 0.7347
9.2901 111.9130 588 3.3186 0.0108 1.0178 0.7316
11.1881 112.8696 593 3.3042 0.0108 1.0340 0.7262
11.1726 113.8261 598 3.3469 0.0108 1.0231 0.7342
11.0822 114.7826 603 3.3216 0.0108 1.0275 0.7289
9.2644 115.9130 609 3.3320 0.0108 1.0224 0.7311
11.1101 116.8696 614 3.3357 0.0108 1.0263 0.7300
11.1348 117.8261 619 3.3371 0.0108 1.0231 0.7314
11.0795 118.7826 624 3.3338 0.0108 1.0248 0.7298
9.242 119.9130 630 3.3327 0.0108 1.0260 0.7292
11.1011 120.8696 635 3.3183 0.0108 1.0255 0.7283
11.0889 121.8261 640 3.3466 0.0108 1.0204 0.7338
11.0732 122.7826 645 3.3447 0.0108 1.0287 0.7314
9.2586 123.9130 651 3.3184 0.0108 1.0348 0.7265
11.0636 124.8696 656 3.3406 0.0108 1.0302 0.7296
11.0775 125.8261 661 3.3429 0.0108 1.0302 0.7299
11.064 126.7826 666 3.3512 0.0108 1.0236 0.7310
9.2299 127.9130 672 3.3380 0.0108 1.0292 0.7269
11.0375 128.8696 677 3.3346 0.0108 1.0340 0.7267
11.0343 129.8261 682 3.3309 0.0108 1.0316 0.7256
11.0733 130.7826 687 3.3285 0.0108 1.0292 0.7275
9.2115 131.9130 693 3.3336 0.0108 1.0255 0.7283
11.0232 132.8696 698 3.3391 0.0108 1.0243 0.7294
11.0263 133.8261 703 3.3352 0.0108 1.0258 0.7280
11.0658 134.7826 708 3.3274 0.0108 1.0292 0.7272
9.1985 135.9130 714 3.3292 0.0108 1.0297 0.7284
11.0514 136.8696 719 3.3325 0.0108 1.0277 0.7287
11.0376 137.8261 724 3.3337 0.0108 1.0270 0.7287
11.0298 138.7826 729 3.3398 0.0108 1.0260 0.7294
9.1757 139.9130 735 3.3446 0.0108 1.0265 0.7294
11.0135 140.8696 740 3.3472 0.0108 1.0248 0.7301
11.0265 141.8261 745 3.3432 0.0108 1.0258 0.7286
9.4511 142.7826 750 3.3405 0.0108 1.0268 0.7286

Framework versions

  • Transformers 4.47.0.dev0
  • Pytorch 2.1.0+cu118
  • Datasets 3.0.2
  • Tokenizers 0.20.1
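
Given the versions pinned above, a minimal transcription sketch, assuming the checkpoint exposes the standard MMS/Wav2Vec2 CTC interface (the audio array is a placeholder; the gated repository may also require an access token):

```python
import numpy as np
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/mms-1B_all_DigitalUmuganda_Afrivoice_Shona_1hr_v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Placeholder input: one second of silence. Real audio must be a mono,
# 16 kHz float array (e.g. loaded with librosa or torchaudio).
audio = np.zeros(16_000, dtype=np.float32)

inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```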