
facebook/mms-1b-all Xhosa - Beijuka Bruno

This model is a fine-tuned version of facebook/mms-1b-all on the NCHLT_speech_corpus/Xhosa dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4679
  • Model preparation time: 0.0258
  • WER: 0.6932
  • CER: 0.1664
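
The checkpoint loads like a standard MMS/Wav2Vec2 CTC model. Below is a minimal inference sketch, assuming the repository ships the usual processor and CTC head that MMS fine-tunes typically include; the audio filename is a placeholder.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/mms-1B_all_NCHLT_XHOSA_20hr_v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# MMS models expect 16 kHz mono audio; "example_xhosa.wav" is a placeholder.
speech, _ = librosa.load("example_xhosa.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token at each frame.
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids)[0])
```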

Model description

This model adapts facebook/mms-1b-all, Meta's Massively Multilingual Speech (MMS) checkpoint, to Xhosa automatic speech recognition. It is published as asr-africa/mms-1B_all_NCHLT_XHOSA_20hr_v1 and was fine-tuned on the Xhosa portion of the NCHLT speech corpus.

Intended uses & limitations

The model is intended for transcribing spoken Xhosa from 16 kHz audio. It has only been evaluated on the NCHLT evaluation set, so accuracy on other domains, speakers, or recording conditions is not established.

Training and evaluation data

Training and evaluation used the NCHLT_speech_corpus/Xhosa dataset; the repository name indicates a roughly 20-hour training subset. No further details about splits or preprocessing are given; a data-loading sketch follows below.
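
If the corpus is published as a Hugging Face dataset, preparing it could look like the sketch below. The dataset identifier and the audio column name are assumptions, not confirmed Hub paths.

```python
from datasets import load_dataset, Audio

# "NCHLT_speech_corpus" / "Xhosa" is a placeholder identifier taken from
# the card; the actual Hub path for this corpus is not confirmed.
ds = load_dataset("NCHLT_speech_corpus", "Xhosa", split="train")

# MMS/Wav2Vec2 models expect 16 kHz mono input; resample on the fly.
# The "audio" column name follows the datasets convention and is assumed.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

sample = ds[0]["audio"]
print(sample["array"].shape, sample["sampling_rate"])
```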

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
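
As a rough reconstruction, these settings map onto transformers TrainingArguments as sketched below; output_dir is a placeholder, and the model, data collator, and Trainer wiring are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-xhosa",   # placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,   # effective train batch size: 4 * 4 = 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
)
```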

Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 6.5237 | 0.9994 | 1171 | 0.2035 | 0.0258 | 0.3025 | 0.0476 |
| 1.1462 | 1.9994 | 2342 | 0.1968 | 0.0258 | 0.2911 | 0.0459 |
| 1.0575 | 2.9994 | 3513 | 0.1858 | 0.0258 | 0.2741 | 0.0436 |
| 1.0131 | 3.9994 | 4684 | 0.1810 | 0.0258 | 0.2671 | 0.0427 |
| 0.9527 | 4.9994 | 5855 | 0.1770 | 0.0258 | 0.2531 | 0.0410 |
| 0.9153 | 5.9994 | 7026 | 0.1704 | 0.0258 | 0.2478 | 0.0404 |
| 0.8709 | 6.9994 | 8197 | 0.1659 | 0.0258 | 0.2452 | 0.0396 |
| 0.832 | 7.9994 | 9368 | 0.1662 | 0.0258 | 0.2449 | 0.0397 |
| 0.8151 | 8.9994 | 10539 | 0.1648 | 0.0258 | 0.2342 | 0.0393 |
| 0.7945 | 9.9994 | 11710 | 0.1685 | 0.0258 | 0.2511 | 0.0398 |
| 0.7624 | 10.9994 | 12881 | 0.1638 | 0.0258 | 0.2352 | 0.0382 |
| 0.7361 | 11.9994 | 14052 | 0.1605 | 0.0258 | 0.2274 | 0.0372 |
| 0.7295 | 12.9994 | 15223 | 0.1593 | 0.0258 | 0.2176 | 0.0371 |
| 0.7048 | 13.9994 | 16394 | 0.1616 | 0.0258 | 0.2211 | 0.0370 |
| 0.6865 | 14.9994 | 17565 | 0.1583 | 0.0258 | 0.2107 | 0.0357 |
| 0.6674 | 15.9994 | 18736 | 0.1577 | 0.0258 | 0.2020 | 0.0348 |
| 0.6511 | 16.9994 | 19907 | 0.1571 | 0.0258 | 0.2087 | 0.0353 |
| 0.6399 | 17.9994 | 21078 | 0.1548 | 0.0258 | 0.2030 | 0.0347 |
| 0.6244 | 18.9994 | 22249 | 0.1561 | 0.0258 | 0.2125 | 0.0353 |
| 0.6078 | 19.9994 | 23420 | 0.1592 | 0.0258 | 0.2020 | 0.0345 |
| 0.5972 | 20.9994 | 24591 | 0.1552 | 0.0258 | 0.2023 | 0.0343 |
| 0.5839 | 21.9994 | 25762 | 0.1548 | 0.0258 | 0.2059 | 0.0347 |
| 0.5588 | 22.9994 | 26933 | 0.1544 | 0.0258 | 0.2005 | 0.0344 |
| 0.5593 | 23.9994 | 28104 | 0.1605 | 0.0258 | 0.1911 | 0.0343 |
| 0.546 | 24.9994 | 29275 | 0.1558 | 0.0258 | 0.1894 | 0.0340 |
| 0.5274 | 25.9994 | 30446 | 0.1635 | 0.0258 | 0.2103 | 0.0353 |
| 0.5224 | 26.9994 | 31617 | 0.1568 | 0.0258 | 0.1875 | 0.0331 |
| 0.5119 | 27.9994 | 32788 | 0.1603 | 0.0258 | 0.1918 | 0.0341 |
| 0.4973 | 28.9994 | 33959 | 0.1602 | 0.0258 | 0.1884 | 0.0336 |
| 0.4932 | 29.9994 | 35130 | 0.1592 | 0.0258 | 0.1856 | 0.0332 |
| 0.4745 | 30.9994 | 36301 | 0.1638 | 0.0258 | 0.1861 | 0.0336 |
| 0.4725 | 31.9994 | 37472 | 0.1654 | 0.0258 | 0.1970 | 0.0342 |
| 0.4587 | 32.9994 | 38643 | 0.1647 | 0.0258 | 0.1864 | 0.0333 |
| 0.4536 | 33.9994 | 39814 | 0.1666 | 0.0258 | 0.1834 | 0.0331 |
| 0.4495 | 34.9994 | 40985 | 0.1665 | 0.0258 | 0.1961 | 0.0339 |
| 0.4382 | 35.9994 | 42156 | 0.1646 | 0.0258 | 0.1887 | 0.0336 |
| 0.4339 | 36.9994 | 43327 | 0.1643 | 0.0258 | 0.1828 | 0.0329 |
| 0.4293 | 37.9994 | 44498 | 0.1709 | 0.0258 | 0.1856 | 0.0336 |
| 0.4087 | 38.9994 | 45669 | 0.1689 | 0.0258 | 0.1799 | 0.0326 |
| 0.4017 | 39.9994 | 46840 | 0.1638 | 0.0258 | 0.1824 | 0.0323 |
| 0.4019 | 40.9994 | 48011 | 0.1674 | 0.0258 | 0.1877 | 0.0335 |
| 0.3923 | 41.9994 | 49182 | 0.1646 | 0.0258 | 0.1867 | 0.0333 |
| 0.3847 | 42.9994 | 50353 | 0.1705 | 0.0258 | 0.1831 | 0.0328 |
| 0.3814 | 43.9994 | 51524 | 0.1723 | 0.0258 | 0.1814 | 0.0323 |
| 0.3748 | 44.9994 | 52695 | 0.1690 | 0.0258 | 0.1880 | 0.0334 |
| 0.3643 | 45.9994 | 53866 | 0.1706 | 0.0258 | 0.1830 | 0.0328 |
| 0.3592 | 46.9994 | 55037 | 0.1722 | 0.0258 | 0.1823 | 0.0328 |
| 0.3536 | 47.9994 | 56208 | 0.1725 | 0.0258 | 0.1834 | 0.0330 |
| 0.3547 | 48.9994 | 57379 | 0.1682 | 0.0258 | 0.1775 | 0.0325 |
| 0.344 | 49.9994 | 58550 | 0.1742 | 0.0258 | 0.1842 | 0.0328 |
| 0.3338 | 50.9994 | 59721 | 0.1750 | 0.0258 | 0.1793 | 0.0321 |
| 0.334 | 51.9994 | 60892 | 0.1700 | 0.0258 | 0.1800 | 0.0324 |
| 0.3267 | 52.9994 | 62063 | 0.1755 | 0.0258 | 0.1795 | 0.0326 |
| 0.3242 | 53.9994 | 63234 | 0.1756 | 0.0258 | 0.1762 | 0.0317 |
| 0.3126 | 54.9994 | 64405 | 0.1773 | 0.0258 | 0.1724 | 0.0312 |
| 0.3158 | 55.9994 | 65576 | 0.1745 | 0.0258 | 0.1742 | 0.0318 |
| 0.3071 | 56.9994 | 66747 | 0.1786 | 0.0258 | 0.1812 | 0.0326 |
| 0.3029 | 57.9994 | 67918 | 0.1770 | 0.0258 | 0.1754 | 0.0313 |
| 0.2996 | 58.9994 | 69089 | 0.1771 | 0.0258 | 0.1771 | 0.0321 |
| 0.2921 | 59.9994 | 70260 | 0.1807 | 0.0258 | 0.1736 | 0.0317 |
| 0.2875 | 60.9994 | 71431 | 0.1823 | 0.0258 | 0.1765 | 0.0317 |
| 0.2781 | 61.9994 | 72602 | 0.1768 | 0.0258 | 0.1764 | 0.0316 |
| 0.2817 | 62.9994 | 73773 | 0.1793 | 0.0258 | 0.1763 | 0.0318 |
| 0.2748 | 63.9994 | 74944 | 0.1842 | 0.0258 | 0.1760 | 0.0316 |
| 0.2752 | 64.9994 | 76115 | 0.1831 | 0.0258 | 0.1766 | 0.0321 |
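
For reference, WER and CER scores like those above are word- and character-level edit distances divided by reference length, typically computed with the Hugging Face evaluate library. The strings below are made up for illustration.

```python
import evaluate

# Both metrics rely on the jiwer package under the hood.
wer = evaluate.load("wer")
cer = evaluate.load("cer")

predictions = ["molo unjani"]    # hypothetical model output
references = ["molo, unjani?"]   # hypothetical ground-truth transcript

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```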

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.1.0+cu118
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size: 965M parameters (F32 tensors, Safetensors format)
