# wav2vec2-large-xlsr-coraa-exp-14
This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset.

It achieves the following results on the evaluation set:
- Loss: 0.5485
- Wer: 0.3417
- Cer: 0.1776
- Per: 0.3310
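WER, CER, and PER above are edit-distance-based error rates computed at the word, character, and phoneme level, respectively. As a reference for how these numbers are defined, here is a minimal pure-Python sketch of the computation (the run itself used standard evaluation tooling, not this code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences."""
    # prev[j] holds the distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        cur = [i]
        for j, h in enumerate(hyp, start=1):
            cur.append(min(
                prev[j] + 1,             # deletion
                cur[j - 1] + 1,          # insertion
                prev[j - 1] + (r != h),  # substitution (or match)
            ))
        prev = cur
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance over reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: the same computation at the character level."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

For example, `wer("o gato preto", "o gato prato")` is one substitution over three reference words, i.e. 1/3. PER follows the same formula over phoneme sequences instead of words.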
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 150
- mixed_precision_training: Native AMP
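The derived values above follow directly from the base settings: 16 samples per device times 2 gradient-accumulation steps gives the total train batch size of 32, and with 14 optimizer steps per epoch (per the results table) over 150 epochs there are 2,100 total steps, so the 0.1 warmup ratio corresponds to 210 warmup steps. A small illustrative sketch of the resulting linear schedule (the run used the Trainer's built-in scheduler; this reimplementation is only for clarity):

```python
PEAK_LR = 3e-5
TOTAL_STEPS = 14 * 150                  # 14 optimizer steps/epoch x 150 epochs = 2100
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)   # lr_scheduler_warmup_ratio 0.1 -> 210 steps

train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

def linear_schedule_lr(step: int) -> float:
    """Linear warmup to the peak LR, then linear decay to zero."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```

The learning rate ramps from 0 to 3e-5 over the first 210 steps and decays back to 0 at step 2,100, matching the shape of the `linear` scheduler in Transformers.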
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer | Per |
---|---|---|---|---|---|---|
37.6216 | 1.0 | 14 | 46.2465 | 1.0071 | 4.0178 | 1.0071 |
37.6216 | 2.0 | 28 | 44.1710 | 1.0041 | 2.7949 | 1.0039 |
37.6216 | 3.0 | 42 | 39.3999 | 1.0 | 0.9284 | 1.0 |
37.6216 | 4.0 | 56 | 31.8074 | 1.0 | 0.9589 | 1.0 |
37.6216 | 5.0 | 70 | 20.4155 | 1.0 | 0.9619 | 1.0 |
37.6216 | 6.0 | 84 | 9.6948 | 1.0 | 0.9619 | 1.0 |
37.6216 | 7.0 | 98 | 5.8283 | 1.0 | 0.9619 | 1.0 |
25.7765 | 8.0 | 112 | 4.5548 | 1.0 | 0.9619 | 1.0 |
25.7765 | 9.0 | 126 | 4.0980 | 1.0 | 0.9619 | 1.0 |
25.7765 | 10.0 | 140 | 3.8329 | 1.0 | 0.9619 | 1.0 |
25.7765 | 11.0 | 154 | 3.6665 | 1.0 | 0.9619 | 1.0 |
25.7765 | 12.0 | 168 | 3.5348 | 1.0 | 0.9619 | 1.0 |
25.7765 | 13.0 | 182 | 3.3872 | 1.0 | 0.9619 | 1.0 |
25.7765 | 14.0 | 196 | 3.2821 | 1.0 | 0.9619 | 1.0 |
3.8439 | 15.0 | 210 | 3.2034 | 1.0 | 0.9619 | 1.0 |
3.8439 | 16.0 | 224 | 3.1549 | 1.0 | 0.9619 | 1.0 |
3.8439 | 17.0 | 238 | 3.1379 | 1.0 | 0.9619 | 1.0 |
3.8439 | 18.0 | 252 | 3.0903 | 1.0 | 0.9619 | 1.0 |
3.8439 | 19.0 | 266 | 3.0685 | 1.0 | 0.9619 | 1.0 |
3.8439 | 20.0 | 280 | 3.0650 | 1.0 | 0.9619 | 1.0 |
3.8439 | 21.0 | 294 | 3.0469 | 1.0 | 0.9619 | 1.0 |
3.035 | 22.0 | 308 | 3.0442 | 1.0 | 0.9619 | 1.0 |
3.035 | 23.0 | 322 | 3.0488 | 1.0 | 0.9619 | 1.0 |
3.035 | 24.0 | 336 | 3.0284 | 1.0 | 0.9619 | 1.0 |
3.035 | 25.0 | 350 | 3.0219 | 1.0 | 0.9619 | 1.0 |
3.035 | 26.0 | 364 | 3.0185 | 1.0 | 0.9619 | 1.0 |
3.035 | 27.0 | 378 | 3.0074 | 1.0 | 0.9619 | 1.0 |
3.035 | 28.0 | 392 | 3.0130 | 1.0 | 0.9619 | 1.0 |
2.9429 | 29.0 | 406 | 3.0014 | 1.0 | 0.9619 | 1.0 |
2.9429 | 30.0 | 420 | 2.9969 | 1.0 | 0.9619 | 1.0 |
2.9429 | 31.0 | 434 | 3.0056 | 1.0 | 0.9619 | 1.0 |
2.9429 | 32.0 | 448 | 3.0042 | 1.0 | 0.9619 | 1.0 |
2.9429 | 33.0 | 462 | 2.9842 | 1.0 | 0.9619 | 1.0 |
2.9429 | 34.0 | 476 | 2.9850 | 1.0 | 0.9619 | 1.0 |
2.9429 | 35.0 | 490 | 2.9796 | 1.0 | 0.9619 | 1.0 |
2.9201 | 36.0 | 504 | 2.9647 | 1.0 | 0.9619 | 1.0 |
2.9201 | 37.0 | 518 | 2.9327 | 1.0 | 0.9619 | 1.0 |
2.9201 | 38.0 | 532 | 2.8997 | 1.0 | 0.9619 | 1.0 |
2.9201 | 39.0 | 546 | 2.8702 | 1.0 | 0.9619 | 1.0 |
2.9201 | 40.0 | 560 | 2.8146 | 1.0 | 0.9619 | 1.0 |
2.9201 | 41.0 | 574 | 2.6820 | 1.0 | 0.9600 | 1.0 |
2.9201 | 42.0 | 588 | 2.5411 | 1.0 | 0.9084 | 1.0 |
2.7676 | 43.0 | 602 | 2.3459 | 1.0 | 0.7805 | 1.0 |
2.7676 | 44.0 | 616 | 2.0668 | 1.0 | 0.5817 | 1.0 |
2.7676 | 45.0 | 630 | 1.7420 | 1.0 | 0.4851 | 1.0 |
2.7676 | 46.0 | 644 | 1.4977 | 1.0 | 0.4356 | 1.0 |
2.7676 | 47.0 | 658 | 1.2985 | 0.9986 | 0.3912 | 0.9986 |
2.7676 | 48.0 | 672 | 1.1510 | 0.9441 | 0.3408 | 0.9403 |
2.7676 | 49.0 | 686 | 1.0227 | 0.7387 | 0.2760 | 0.7249 |
1.6224 | 50.0 | 700 | 0.9463 | 0.5246 | 0.2268 | 0.4965 |
1.6224 | 51.0 | 714 | 0.8637 | 0.4951 | 0.2179 | 0.4640 |
1.6224 | 52.0 | 728 | 0.8114 | 0.4732 | 0.2128 | 0.4449 |
1.6224 | 53.0 | 742 | 0.7687 | 0.4388 | 0.2038 | 0.4102 |
1.6224 | 54.0 | 756 | 0.7736 | 0.4340 | 0.2035 | 0.4063 |
1.6224 | 55.0 | 770 | 0.7564 | 0.4275 | 0.2018 | 0.4019 |
1.6224 | 56.0 | 784 | 0.7419 | 0.4210 | 0.2015 | 0.3988 |
1.6224 | 57.0 | 798 | 0.7089 | 0.4074 | 0.1982 | 0.3856 |
0.7727 | 58.0 | 812 | 0.6822 | 0.4061 | 0.1962 | 0.3866 |
0.7727 | 59.0 | 826 | 0.6505 | 0.4067 | 0.1952 | 0.3866 |
0.7727 | 60.0 | 840 | 0.6542 | 0.3933 | 0.1938 | 0.3767 |
0.7727 | 61.0 | 854 | 0.6346 | 0.3958 | 0.1926 | 0.3789 |
0.7727 | 62.0 | 868 | 0.6301 | 0.3885 | 0.1912 | 0.3722 |
0.7727 | 63.0 | 882 | 0.6702 | 0.3944 | 0.1944 | 0.3755 |
0.7727 | 64.0 | 896 | 0.6394 | 0.3846 | 0.1905 | 0.3675 |
0.5159 | 65.0 | 910 | 0.6235 | 0.3775 | 0.1892 | 0.3606 |
0.5159 | 66.0 | 924 | 0.6329 | 0.3795 | 0.1911 | 0.3633 |
0.5159 | 67.0 | 938 | 0.6074 | 0.3732 | 0.1891 | 0.3582 |
0.5159 | 68.0 | 952 | 0.5993 | 0.3742 | 0.1876 | 0.3592 |
0.5159 | 69.0 | 966 | 0.6088 | 0.3663 | 0.1871 | 0.3513 |
0.5159 | 70.0 | 980 | 0.6132 | 0.3771 | 0.1888 | 0.3606 |
0.5159 | 71.0 | 994 | 0.6205 | 0.3779 | 0.1892 | 0.3616 |
0.41 | 72.0 | 1008 | 0.6048 | 0.3732 | 0.1885 | 0.3562 |
0.41 | 73.0 | 1022 | 0.5851 | 0.3700 | 0.1873 | 0.3543 |
0.41 | 74.0 | 1036 | 0.5975 | 0.3706 | 0.1872 | 0.3543 |
0.41 | 75.0 | 1050 | 0.5996 | 0.3722 | 0.1890 | 0.3555 |
0.41 | 76.0 | 1064 | 0.5951 | 0.3690 | 0.1874 | 0.3533 |
0.41 | 77.0 | 1078 | 0.5791 | 0.3637 | 0.1857 | 0.3486 |
0.41 | 78.0 | 1092 | 0.5675 | 0.3616 | 0.1851 | 0.3466 |
0.371 | 79.0 | 1106 | 0.6022 | 0.3659 | 0.1880 | 0.3486 |
0.371 | 80.0 | 1120 | 0.5954 | 0.3669 | 0.1854 | 0.3507 |
0.371 | 81.0 | 1134 | 0.5832 | 0.3629 | 0.1841 | 0.3470 |
0.371 | 82.0 | 1148 | 0.5867 | 0.3620 | 0.1843 | 0.3456 |
0.371 | 83.0 | 1162 | 0.5971 | 0.3669 | 0.1870 | 0.3519 |
0.371 | 84.0 | 1176 | 0.5926 | 0.3633 | 0.1859 | 0.3478 |
0.371 | 85.0 | 1190 | 0.5774 | 0.3596 | 0.1837 | 0.3438 |
0.3317 | 86.0 | 1204 | 0.5779 | 0.3610 | 0.1846 | 0.3462 |
0.3317 | 87.0 | 1218 | 0.5797 | 0.3604 | 0.1845 | 0.3446 |
0.3317 | 88.0 | 1232 | 0.5750 | 0.3578 | 0.1843 | 0.3425 |
0.3317 | 89.0 | 1246 | 0.5651 | 0.3584 | 0.1824 | 0.3436 |
0.3317 | 90.0 | 1260 | 0.5749 | 0.3592 | 0.1830 | 0.3427 |
0.3317 | 91.0 | 1274 | 0.5791 | 0.3572 | 0.1834 | 0.3419 |
0.3317 | 92.0 | 1288 | 0.5589 | 0.3541 | 0.1814 | 0.3389 |
0.3039 | 93.0 | 1302 | 0.5670 | 0.3543 | 0.1816 | 0.3387 |
0.3039 | 94.0 | 1316 | 0.5619 | 0.3521 | 0.1805 | 0.3385 |
0.3039 | 95.0 | 1330 | 0.5628 | 0.3539 | 0.1801 | 0.3393 |
0.3039 | 96.0 | 1344 | 0.5800 | 0.3572 | 0.1820 | 0.3417 |
0.3039 | 97.0 | 1358 | 0.5605 | 0.3509 | 0.1805 | 0.3371 |
0.3039 | 98.0 | 1372 | 0.5619 | 0.3513 | 0.1805 | 0.3369 |
0.3039 | 99.0 | 1386 | 0.5704 | 0.3523 | 0.1825 | 0.3373 |
0.2596 | 100.0 | 1400 | 0.5618 | 0.3531 | 0.1810 | 0.3393 |
0.2596 | 101.0 | 1414 | 0.5591 | 0.3458 | 0.1799 | 0.3310 |
0.2596 | 102.0 | 1428 | 0.5675 | 0.3492 | 0.1817 | 0.3352 |
0.2596 | 103.0 | 1442 | 0.5614 | 0.3537 | 0.1808 | 0.3397 |
0.2596 | 104.0 | 1456 | 0.5652 | 0.3527 | 0.1810 | 0.3389 |
0.2596 | 105.0 | 1470 | 0.5576 | 0.3497 | 0.1798 | 0.3354 |
0.2596 | 106.0 | 1484 | 0.5653 | 0.3499 | 0.1794 | 0.3360 |
0.2596 | 107.0 | 1498 | 0.5543 | 0.3474 | 0.1796 | 0.3324 |
0.2488 | 108.0 | 1512 | 0.5540 | 0.3484 | 0.1794 | 0.3344 |
0.2488 | 109.0 | 1526 | 0.5626 | 0.3484 | 0.1804 | 0.3346 |
0.2488 | 110.0 | 1540 | 0.5648 | 0.3499 | 0.1809 | 0.3352 |
0.2488 | 111.0 | 1554 | 0.5588 | 0.3495 | 0.1803 | 0.3358 |
0.2488 | 112.0 | 1568 | 0.5574 | 0.3466 | 0.1790 | 0.3338 |
0.2488 | 113.0 | 1582 | 0.5624 | 0.3486 | 0.1798 | 0.3358 |
0.2488 | 114.0 | 1596 | 0.5538 | 0.3488 | 0.1791 | 0.3348 |
0.2409 | 115.0 | 1610 | 0.5577 | 0.3474 | 0.1790 | 0.3332 |
0.2409 | 116.0 | 1624 | 0.5612 | 0.3472 | 0.1793 | 0.3322 |
0.2409 | 117.0 | 1638 | 0.5640 | 0.3482 | 0.1797 | 0.3334 |
0.2409 | 118.0 | 1652 | 0.5598 | 0.3484 | 0.1791 | 0.3346 |
0.2409 | 119.0 | 1666 | 0.5660 | 0.3468 | 0.1792 | 0.3334 |
0.2409 | 120.0 | 1680 | 0.5578 | 0.3460 | 0.1784 | 0.3328 |
0.2409 | 121.0 | 1694 | 0.5592 | 0.3466 | 0.1790 | 0.3338 |
0.2278 | 122.0 | 1708 | 0.5624 | 0.3456 | 0.1788 | 0.3332 |
0.2278 | 123.0 | 1722 | 0.5639 | 0.3436 | 0.1789 | 0.3308 |
0.2278 | 124.0 | 1736 | 0.5486 | 0.3438 | 0.1781 | 0.3320 |
0.2278 | 125.0 | 1750 | 0.5649 | 0.3417 | 0.1790 | 0.3291 |
0.2278 | 126.0 | 1764 | 0.5518 | 0.3425 | 0.1783 | 0.3302 |
0.2278 | 127.0 | 1778 | 0.5538 | 0.3415 | 0.1780 | 0.3295 |
0.2278 | 128.0 | 1792 | 0.5591 | 0.3413 | 0.1779 | 0.3302 |
0.2261 | 129.0 | 1806 | 0.5575 | 0.3417 | 0.1780 | 0.3295 |
0.2261 | 130.0 | 1820 | 0.5548 | 0.3403 | 0.1778 | 0.3287 |
0.2261 | 131.0 | 1834 | 0.5608 | 0.3401 | 0.1781 | 0.3289 |
0.2261 | 132.0 | 1848 | 0.5485 | 0.3417 | 0.1776 | 0.3310 |
0.2261 | 133.0 | 1862 | 0.5508 | 0.3393 | 0.1776 | 0.3281 |
0.2261 | 134.0 | 1876 | 0.5572 | 0.3383 | 0.1777 | 0.3269 |
0.2261 | 135.0 | 1890 | 0.5605 | 0.3389 | 0.1783 | 0.3269 |
0.2015 | 136.0 | 1904 | 0.5549 | 0.3387 | 0.1776 | 0.3271 |
0.2015 | 137.0 | 1918 | 0.5534 | 0.3399 | 0.1775 | 0.3281 |
0.2015 | 138.0 | 1932 | 0.5525 | 0.3385 | 0.1775 | 0.3271 |
0.2015 | 139.0 | 1946 | 0.5551 | 0.3383 | 0.1774 | 0.3269 |
0.2015 | 140.0 | 1960 | 0.5531 | 0.3375 | 0.1770 | 0.3259 |
0.2015 | 141.0 | 1974 | 0.5504 | 0.3354 | 0.1765 | 0.3236 |
0.2015 | 142.0 | 1988 | 0.5518 | 0.3379 | 0.1770 | 0.3259 |
0.2036 | 143.0 | 2002 | 0.5525 | 0.3377 | 0.1770 | 0.3255 |
0.2036 | 144.0 | 2016 | 0.5528 | 0.3383 | 0.1772 | 0.3269 |
0.2036 | 145.0 | 2030 | 0.5534 | 0.3379 | 0.1773 | 0.3263 |
0.2036 | 146.0 | 2044 | 0.5548 | 0.3373 | 0.1774 | 0.3259 |
0.2036 | 147.0 | 2058 | 0.5542 | 0.3364 | 0.1771 | 0.3249 |
0.2036 | 148.0 | 2072 | 0.5550 | 0.3373 | 0.1772 | 0.3257 |
0.2036 | 149.0 | 2086 | 0.5555 | 0.3373 | 0.1772 | 0.3257 |
0.2013 | 150.0 | 2100 | 0.5558 | 0.3383 | 0.1774 | 0.3265 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.13.3