wav2vec2-base-finetune-vi-v4

This model is a fine-tuned version of nguyenvulebinh/wav2vec2-base-vietnamese-250h on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2829
  • WER: 0.1587
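
WER here is the word error rate. The card does not state which tool produced the number; a minimal sketch of how such a score is computed, using the jiwer package (an assumption, not confirmed by the card):

```python
# Minimal word error rate (WER) illustration using jiwer
# (pip install jiwer); the tool used for the reported 0.1587
# is not stated in the card, so this is only an example.
import jiwer

reference = "xin chào các bạn"        # ground-truth transcript
hypothesis = "xin chào các bạn nhé"   # hypothetical model output

# WER = (substitutions + deletions + insertions) / reference word count
print(jiwer.wer(reference, hypothesis))  # 0.25: 1 insertion over 4 words
```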

Model description

More information needed

Intended uses & limitations

More information needed
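
Until the author adds details, here is a minimal transcription sketch. It assumes the checkpoint exposes the standard Wav2Vec2Processor/Wav2Vec2ForCTC interface of the base model, and the model id below is a placeholder, not a confirmed repo path:

```python
# Hedged transcription sketch: the model id is a placeholder and the
# standard CTC interface is assumed from the base model's usage.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "path/to/wav2vec2-base-finetune-vi-v4"  # placeholder
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load audio and resample to the 16 kHz rate wav2vec2 expects.
waveform, sr = torchaudio.load("sample.wav")
waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse
# repeats and blanks inside batch_decode.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```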

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
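
These settings map directly onto transformers.TrainingArguments; a reproduction sketch follows. The output_dir is a placeholder, and the 500-step evaluation cadence is inferred from the results table below rather than stated in the card:

```python
# Hedged mapping of the listed hyperparameters onto
# transformers.TrainingArguments (Transformers 4.30.x).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-finetune-vi-v4",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,               # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    evaluation_strategy="steps",  # inferred: eval every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```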

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 5.0547        | 0.49  | 500   | 3.4652          | 1.0    |
| 2.9133        | 0.99  | 1000  | 1.1003          | 0.5035 |
| 1.0644        | 1.48  | 1500  | 0.5960          | 0.3199 |
| 0.8294        | 1.98  | 2000  | 0.5098          | 0.2809 |
| 0.6965        | 2.47  | 2500  | 0.5010          | 0.2596 |
| 0.6646        | 2.96  | 3000  | 0.4209          | 0.2398 |
| 0.5753        | 3.46  | 3500  | 0.4089          | 0.2361 |
| 0.5265        | 3.95  | 4000  | 0.3868          | 0.2195 |
| 0.4701        | 4.45  | 4500  | 0.3626          | 0.2171 |
| 0.4617        | 4.94  | 5000  | 0.3693          | 0.2160 |
| 0.4343        | 5.43  | 5500  | 0.3661          | 0.2058 |
| 0.4246        | 5.93  | 6000  | 0.3618          | 0.2067 |
| 0.3881        | 6.42  | 6500  | 0.3654          | 0.2044 |
| 0.3948        | 6.92  | 7000  | 0.3586          | 0.2009 |
| 0.367         | 7.41  | 7500  | 0.3431          | 0.1961 |
| 0.3449        | 7.91  | 8000  | 0.3196          | 0.1944 |
| 0.3168        | 8.4   | 8500  | 0.3310          | 0.1912 |
| 0.3393        | 8.89  | 9000  | 0.3418          | 0.1879 |
| 0.3197        | 9.39  | 9500  | 0.3434          | 0.1888 |
| 0.2954        | 9.88  | 10000 | 0.3371          | 0.1863 |
| 0.2968        | 10.38 | 10500 | 0.2941          | 0.1899 |
| 0.2802        | 10.87 | 11000 | 0.3095          | 0.1836 |
| 0.2783        | 11.36 | 11500 | 0.3275          | 0.1822 |
| 0.3027        | 11.86 | 12000 | 0.3103          | 0.1806 |
| 0.2645        | 12.35 | 12500 | 0.3247          | 0.1842 |
| 0.2958        | 12.85 | 13000 | 0.3242          | 0.1801 |
| 0.2648        | 13.34 | 13500 | 0.3169          | 0.1775 |
| 0.2461        | 13.83 | 14000 | 0.2926          | 0.1764 |
| 0.247         | 14.33 | 14500 | 0.3033          | 0.1741 |
| 0.2212        | 14.82 | 15000 | 0.2901          | 0.1749 |
| 0.2239        | 15.32 | 15500 | 0.3237          | 0.1758 |
| 0.2093        | 15.81 | 16000 | 0.2972          | 0.1759 |
| 0.2284        | 16.3  | 16500 | 0.3025          | 0.1749 |
| 0.228         | 16.8  | 17000 | 0.2862          | 0.1708 |
| 0.2033        | 17.29 | 17500 | 0.3039          | 0.1745 |
| 0.189         | 17.79 | 18000 | 0.3084          | 0.1708 |
| 0.1992        | 18.28 | 18500 | 0.2931          | 0.1735 |
| 0.1989        | 18.77 | 19000 | 0.2964          | 0.1693 |
| 0.1953        | 19.27 | 19500 | 0.3082          | 0.1715 |
| 0.1813        | 19.76 | 20000 | 0.2859          | 0.1702 |
| 0.1703        | 20.26 | 20500 | 0.2936          | 0.1680 |
| 0.1939        | 20.75 | 21000 | 0.2871          | 0.1684 |
| 0.1769        | 21.25 | 21500 | 0.2994          | 0.1646 |
| 0.1795        | 21.74 | 22000 | 0.2990          | 0.1669 |
| 0.17          | 22.23 | 22500 | 0.2839          | 0.1663 |
| 0.1507        | 22.73 | 23000 | 0.3125          | 0.1666 |
| 0.1676        | 23.22 | 23500 | 0.2867          | 0.1611 |
| 0.1675        | 23.72 | 24000 | 0.3099          | 0.1607 |
| 0.171         | 24.21 | 24500 | 0.3000          | 0.1627 |
| 0.1483        | 24.7  | 25000 | 0.3010          | 0.1629 |
| 0.1452        | 25.2  | 25500 | 0.2910          | 0.1641 |
| 0.1394        | 25.69 | 26000 | 0.2878          | 0.1605 |
| 0.1478        | 26.19 | 26500 | 0.2881          | 0.1617 |
| 0.1426        | 26.68 | 27000 | 0.2714          | 0.1607 |
| 0.1342        | 27.17 | 27500 | 0.2941          | 0.1615 |
| 0.1385        | 27.67 | 28000 | 0.2758          | 0.1594 |
| 0.1541        | 28.16 | 28500 | 0.2830          | 0.1592 |
| 0.153         | 28.66 | 29000 | 0.2789          | 0.1575 |
| 0.1359        | 29.15 | 29500 | 0.2819          | 0.1588 |
| 0.1276        | 29.64 | 30000 | 0.2829          | 0.1587 |

Framework versions

  • Transformers 4.30.2
  • Pytorch 2.0.0
  • Datasets 2.8.0
  • Tokenizers 0.13.3
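
To reproduce this environment, the listed versions can be pinned, e.g. `pip install transformers==4.30.2 torch==2.0.0 datasets==2.8.0 tokenizers==0.13.3`.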