wav2vec2-large-xlsr-coraa-exp-16

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set (a hedged inference sketch follows the results):

  • Loss: 3.0160
  • Wer: 1.0
  • Cer: 0.9619
  • Per: 1.0
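
Given the reported WER of 1.0, this checkpoint is unlikely to produce usable transcriptions as-is, but the sketch below shows how it could be loaded for inference. This is a minimal, hedged example assuming the checkpoint is a standard Wav2Vec2ForCTC model with an accompanying processor; the repo id and `sample.wav` are placeholders, not details confirmed by this card.

```python
# Minimal inference sketch (assumptions: standard Wav2Vec2ForCTC checkpoint;
# the repo id / path and sample.wav are placeholders).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xlsr-coraa-exp-16"  # hypothetical repo id or local path

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLSR models expect 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```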

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer-style sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 150
  • mixed_precision_training: Native AMP
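
A minimal sketch of how these hyperparameters might map onto the Hugging Face `TrainingArguments` API is shown below. This assumes the run used the standard `Trainer`; `output_dir` and the model/dataset wiring are placeholders, not details from the original run.

```python
from transformers import TrainingArguments

# Hypothetical mapping of the listed hyperparameters onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-16",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # total train batch size of 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=150,
    fp16=True,                       # native AMP mixed-precision training
)
```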

Training results

Training Loss Epoch Step Validation Loss Wer Cer Per
38.847 1.0 14 48.0649 1.0002 3.1169 1.0002
38.847 2.0 28 47.9396 1.0002 3.1191 1.0002
38.847 3.0 42 47.7446 1.0004 3.0938 1.0004
38.847 4.0 56 47.3282 1.0006 3.0227 1.0006
38.847 5.0 70 46.4748 1.0004 2.4135 1.0004
38.847 6.0 84 44.8749 1.0 1.4413 1.0
38.847 7.0 98 42.6022 1.0 0.9460 1.0
37.4633 8.0 112 39.6558 1.0 0.8978 1.0
37.4633 9.0 126 35.7305 1.0 0.9338 1.0
37.4633 10.0 140 30.8796 1.0 0.9501 1.0
37.4633 11.0 154 26.3341 1.0 0.9510 1.0
37.4633 12.0 168 23.4116 1.0 0.9510 1.0
37.4633 13.0 182 21.1265 1.0 0.9510 1.0
37.4633 14.0 196 19.4607 1.0 0.9510 1.0
24.5313 15.0 210 18.0357 1.0 0.9510 1.0
24.5313 16.0 224 16.9469 1.0 0.9510 1.0
24.5313 17.0 238 16.1513 1.0 0.9510 1.0
24.5313 18.0 252 15.5951 1.0 0.9510 1.0
24.5313 19.0 266 15.0803 1.0 0.9510 1.0
24.5313 20.0 280 14.8270 1.0 0.9510 1.0
24.5313 21.0 294 14.4665 1.0 0.9510 1.0
14.1395 22.0 308 14.3680 1.0 0.9510 1.0
14.1395 23.0 322 14.1886 1.0 0.9510 1.0
14.1395 24.0 336 14.0514 1.0 0.9510 1.0
14.1395 25.0 350 14.0722 1.0 0.9510 1.0
14.1395 26.0 364 13.8134 1.0 0.9511 1.0
14.1395 27.0 378 13.6935 1.0 0.9522 1.0
14.1395 28.0 392 13.3832 1.0 0.9594 1.0
12.0999 29.0 406 12.9235 1.0 0.9567 1.0
12.0999 30.0 420 12.5604 1.0 0.9516 1.0
12.0999 31.0 434 10.6877 1.0 0.9469 1.0
12.0999 32.0 448 9.3758 1.0 0.9565 1.0
12.0999 33.0 462 5.0729 1.0 0.9619 1.0
12.0999 34.0 476 4.0927 1.0 0.9619 1.0
12.0999 35.0 490 3.8576 1.0 0.9619 1.0
7.922 36.0 504 3.7497 1.0 0.9619 1.0
7.922 37.0 518 3.6650 1.0 0.9619 1.0
7.922 38.0 532 3.5863 1.0 0.9619 1.0
7.922 39.0 546 3.5280 1.0 0.9619 1.0
7.922 40.0 560 3.4813 1.0 0.9619 1.0
7.922 41.0 574 3.4481 1.0 0.9619 1.0
7.922 42.0 588 3.4184 1.0 0.9619 1.0
3.5155 43.0 602 3.3964 1.0 0.9619 1.0
3.5155 44.0 616 3.3748 1.0 0.9619 1.0
3.5155 45.0 630 3.3545 1.0 0.9619 1.0
3.5155 46.0 644 3.3354 1.0 0.9619 1.0
3.5155 47.0 658 3.3090 1.0 0.9619 1.0
3.5155 48.0 672 3.2789 1.0 0.9619 1.0
3.5155 49.0 686 3.2441 1.0 0.9619 1.0
3.2278 50.0 700 3.2153 1.0 0.9619 1.0
3.2278 51.0 714 3.1921 1.0 0.9619 1.0
3.2278 52.0 728 3.1863 1.0 0.9619 1.0
3.2278 53.0 742 3.1605 1.0 0.9619 1.0
3.2278 54.0 756 3.1517 1.0 0.9619 1.0
3.2278 55.0 770 3.1389 1.0 0.9619 1.0
3.2278 56.0 784 3.1274 1.0 0.9619 1.0
3.2278 57.0 798 3.1237 1.0 0.9619 1.0
3.0881 58.0 812 3.1115 1.0 0.9619 1.0
3.0881 59.0 826 3.1051 1.0 0.9619 1.0
3.0881 60.0 840 3.1055 1.0 0.9619 1.0
3.0881 61.0 854 3.0982 1.0 0.9619 1.0
3.0881 62.0 868 3.0933 1.0 0.9619 1.0
3.0881 63.0 882 3.0871 1.0 0.9619 1.0
3.0881 64.0 896 3.0788 1.0 0.9619 1.0
3.0331 65.0 910 3.0835 1.0 0.9619 1.0
3.0331 66.0 924 3.0786 1.0 0.9619 1.0
3.0331 67.0 938 3.0781 1.0 0.9619 1.0
3.0331 68.0 952 3.0761 1.0 0.9619 1.0
3.0331 69.0 966 3.0663 1.0 0.9619 1.0
3.0331 70.0 980 3.0629 1.0 0.9619 1.0
3.0331 71.0 994 3.0661 1.0 0.9619 1.0
2.9941 72.0 1008 3.0600 1.0 0.9619 1.0
2.9941 73.0 1022 3.0559 1.0 0.9619 1.0
2.9941 74.0 1036 3.0517 1.0 0.9619 1.0
2.9941 75.0 1050 3.0524 1.0 0.9619 1.0
2.9941 76.0 1064 3.0506 1.0 0.9619 1.0
2.9941 77.0 1078 3.0451 1.0 0.9619 1.0
2.9941 78.0 1092 3.0485 1.0 0.9619 1.0
2.9748 79.0 1106 3.0472 1.0 0.9619 1.0
2.9748 80.0 1120 3.0464 1.0 0.9619 1.0
2.9748 81.0 1134 3.0458 1.0 0.9619 1.0
2.9748 82.0 1148 3.0386 1.0 0.9619 1.0
2.9748 83.0 1162 3.0376 1.0 0.9619 1.0
2.9748 84.0 1176 3.0365 1.0 0.9619 1.0
2.9748 85.0 1190 3.0414 1.0 0.9619 1.0
2.9573 86.0 1204 3.0400 1.0 0.9619 1.0
2.9573 87.0 1218 3.0327 1.0 0.9619 1.0
2.9573 88.0 1232 3.0354 1.0 0.9619 1.0
2.9573 89.0 1246 3.0313 1.0 0.9619 1.0
2.9573 90.0 1260 3.0344 1.0 0.9619 1.0
2.9573 91.0 1274 3.0385 1.0 0.9619 1.0
2.9573 92.0 1288 3.0343 1.0 0.9619 1.0
2.957 93.0 1302 3.0365 1.0 0.9619 1.0
2.957 94.0 1316 3.0292 1.0 0.9619 1.0
2.957 95.0 1330 3.0238 1.0 0.9619 1.0
2.957 96.0 1344 3.0332 1.0 0.9619 1.0
2.957 97.0 1358 3.0295 1.0 0.9619 1.0
2.957 98.0 1372 3.0305 1.0 0.9619 1.0
2.957 99.0 1386 3.0284 1.0 0.9619 1.0
2.9439 100.0 1400 3.0302 1.0 0.9619 1.0
2.9439 101.0 1414 3.0284 1.0 0.9619 1.0
2.9439 102.0 1428 3.0302 1.0 0.9619 1.0
2.9439 103.0 1442 3.0312 1.0 0.9619 1.0
2.9439 104.0 1456 3.0255 1.0 0.9619 1.0
2.9439 105.0 1470 3.0309 1.0 0.9619 1.0
2.9439 106.0 1484 3.0268 1.0 0.9619 1.0
2.9439 107.0 1498 3.0318 1.0 0.9619 1.0
2.9382 108.0 1512 3.0244 1.0 0.9619 1.0
2.9382 109.0 1526 3.0307 1.0 0.9619 1.0
2.9382 110.0 1540 3.0229 1.0 0.9619 1.0
2.9382 111.0 1554 3.0231 1.0 0.9619 1.0
2.9382 112.0 1568 3.0288 1.0 0.9619 1.0
2.9382 113.0 1582 3.0191 1.0 0.9619 1.0
2.9382 114.0 1596 3.0276 1.0 0.9619 1.0
2.9379 115.0 1610 3.0226 1.0 0.9619 1.0
2.9379 116.0 1624 3.0271 1.0 0.9619 1.0
2.9379 117.0 1638 3.0220 1.0 0.9619 1.0
2.9379 118.0 1652 3.0240 1.0 0.9619 1.0
2.9379 119.0 1666 3.0305 1.0 0.9619 1.0
2.9379 120.0 1680 3.0160 1.0 0.9619 1.0
2.9379 121.0 1694 3.0231 1.0 0.9619 1.0
2.9353 122.0 1708 3.0200 1.0 0.9619 1.0
2.9353 123.0 1722 3.0191 1.0 0.9619 1.0
2.9353 124.0 1736 3.0240 1.0 0.9619 1.0
2.9353 125.0 1750 3.0204 1.0 0.9619 1.0
2.9353 126.0 1764 3.0222 1.0 0.9619 1.0
2.9353 127.0 1778 3.0249 1.0 0.9619 1.0
2.9353 128.0 1792 3.0212 1.0 0.9619 1.0
2.9377 129.0 1806 3.0228 1.0 0.9619 1.0
2.9377 130.0 1820 3.0219 1.0 0.9619 1.0
2.9377 131.0 1834 3.0206 1.0 0.9619 1.0
2.9377 132.0 1848 3.0238 1.0 0.9619 1.0
2.9377 133.0 1862 3.0212 1.0 0.9619 1.0
2.9377 134.0 1876 3.0241 1.0 0.9619 1.0
2.9377 135.0 1890 3.0248 1.0 0.9619 1.0
2.929 136.0 1904 3.0250 1.0 0.9619 1.0
2.929 137.0 1918 3.0218 1.0 0.9619 1.0
2.929 138.0 1932 3.0230 1.0 0.9619 1.0
2.929 139.0 1946 3.0240 1.0 0.9619 1.0
2.929 140.0 1960 3.0226 1.0 0.9619 1.0
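
The WER and CER values reported above can be reproduced with standard metric implementations. The sketch below uses the `evaluate` library with hypothetical transcripts; it is illustrative rather than the exact evaluation pipeline used for this run, and PER would additionally require phoneme-level references, which are not shown.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical reference transcripts and model predictions.
references = ["o gato está no telhado"]
predictions = ["o gato esta no telhado"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```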

Framework versions

  • Transformers 4.28.0
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3