---
license: cc-by-nc-4.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-base-finetune-vi-v2
    results: []
widget:
  - example_title: SOICT 2023 - SLU public test 1
    src: >-
      https://huggingface.co/foxxy-hm/wav2vec2-base-finetune-vi/raw/main/audio-test/055R7BruAa333g9teFfamQH.wav
  - example_title: SOICT 2023 - SLU public test 2
    src: >-
      https://huggingface.co/foxxy-hm/wav2vec2-base-finetune-vi/raw/main/audio-test/0BLHhoJexE8THB8BrsZxWbh.wav
  - example_title: SOICT 2023 - SLU public test 3
    src: >-
      https://huggingface.co/foxxy-hm/wav2vec2-base-finetune-vi/raw/main/audio-test/1ArUTGWJQ9YALH2xaNhU6GV.wav
---

# wav2vec2-base-finetune-vi-v2

This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vietnamese-250h](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vietnamese-250h) for Vietnamese automatic speech recognition; the fine-tuning dataset is not specified here (see "Training and evaluation data" below). It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

- Loss: 0.2188
- WER: 0.1391
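
For quick experimentation, here is a minimal transcription sketch. It assumes this model is hosted at `foxxy-hm/wav2vec2-base-finetune-vi-v2` and follows the standard `Wav2Vec2Processor`/`Wav2Vec2ForCTC` API; the file name `audio.wav` is a hypothetical placeholder for any mono Vietnamese recording.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repo id assumed from this card's model name; "audio.wav" is a
# hypothetical local file, resampled to the model's 16 kHz rate below.
MODEL_ID = "foxxy-hm/wav2vec2-base-finetune-vi-v2"
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

waveform, sample_rate = torchaudio.load("audio.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```

Greedy decoding is used here for simplicity; a language-model-backed CTC decoder would typically lower the WER further.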

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 24
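
For reference, the listed values map onto Transformers' `TrainingArguments` roughly as sketched below; `output_dir` is a placeholder, and the Adam betas and epsilon match the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# A sketch reconstructing the hyperparameters listed above; anything not
# named in the card (output_dir, logging, fp16, ...) is left out or assumed.
training_args = TrainingArguments(
    output_dir="wav2vec2-base-finetune-vi-v2",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=24,
)
```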

### Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.3873        | 0.67  | 500   | 2.4321          | 0.9719 |
| 1.4812        | 1.34  | 1000  | 0.5449          | 0.3062 |
| 0.7731        | 2.0   | 1500  | 0.3793          | 0.2263 |
| 0.542         | 2.67  | 2000  | 0.3021          | 0.2002 |
| 0.4461        | 3.34  | 2500  | 0.2905          | 0.1862 |
| 0.4175        | 4.01  | 3000  | 0.2687          | 0.1771 |
| 0.3878        | 4.67  | 3500  | 0.2958          | 0.1751 |
| 0.3373        | 5.34  | 4000  | 0.2713          | 0.1721 |
| 0.3046        | 6.01  | 4500  | 0.2505          | 0.1616 |
| 0.2933        | 6.68  | 5000  | 0.2561          | 0.1611 |
| 0.285         | 7.34  | 5500  | 0.2405          | 0.1617 |
| 0.2998        | 8.01  | 6000  | 0.2363          | 0.1578 |
| 0.2486        | 8.68  | 6500  | 0.2254          | 0.1570 |
| 0.2682        | 9.35  | 7000  | 0.2306          | 0.1547 |
| 0.2327        | 10.01 | 7500  | 0.2289          | 0.1537 |
| 0.2141        | 10.68 | 8000  | 0.2383          | 0.1499 |
| 0.2124        | 11.35 | 8500  | 0.2261          | 0.15   |
| 0.2156        | 12.02 | 9000  | 0.2142          | 0.1511 |
| 0.2082        | 12.68 | 9500  | 0.2386          | 0.1467 |
| 0.1814        | 13.35 | 10000 | 0.2301          | 0.1448 |
| 0.1836        | 14.02 | 10500 | 0.2302          | 0.1446 |
| 0.18          | 14.69 | 11000 | 0.2244          | 0.1445 |
| 0.1756        | 15.35 | 11500 | 0.2280          | 0.1439 |
| 0.1693        | 16.02 | 12000 | 0.2307          | 0.1426 |
| 0.1588        | 16.69 | 12500 | 0.2164          | 0.1422 |
| 0.1587        | 17.36 | 13000 | 0.2198          | 0.1417 |
| 0.1738        | 18.02 | 13500 | 0.2282          | 0.1411 |
| 0.1524        | 18.69 | 14000 | 0.2274          | 0.1394 |
| 0.1569        | 19.36 | 14500 | 0.2178          | 0.1396 |
| 0.1433        | 20.03 | 15000 | 0.2200          | 0.1413 |
| 0.1512        | 20.69 | 15500 | 0.2193          | 0.1382 |
| 0.1375        | 21.36 | 16000 | 0.2174          | 0.1393 |
| 0.1302        | 22.03 | 16500 | 0.2246          | 0.1391 |
| 0.146         | 22.7  | 17000 | 0.2222          | 0.1392 |
| 0.1265        | 23.36 | 17500 | 0.2188          | 0.1391 |
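
The WER column above is the `wer` metric named in this card's metadata. Below is a minimal sketch of how such a score can be computed with the `evaluate` library (the transcript pair is hypothetical, not from this model's test set):

```python
import evaluate

# WER = (substitutions + deletions + insertions) / reference word count.
wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    predictions=["mở đèn phòng khách"],           # hypothetical model output
    references=["mở đèn phòng khách giúp tôi"],   # hypothetical ground truth
)
print(f"WER: {score:.4f}")  # 2 deletions / 6 reference words ≈ 0.3333
```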

### Framework versions

- Transformers 4.30.2
- PyTorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3