---
base_model: facebook/w2v-bert-2.0
library_name: transformers
license: mit
metrics:
- wer
tags:
- generated_from_trainer
model-index:
- name: w2v-bert-2.0-BIG_C-AMMI-BEMBA_SPEECH_CORPUS-BEMBA-189hrs-V1
  results: []
---
# w2v-bert-2.0-BIG_C-AMMI-BEMBA_SPEECH_CORPUS-BEMBA-189hrs-V1
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on the combined BIG-C and AMMI Bemba Speech Corpus datasets (roughly 189 hours of Bemba speech, per the model name). It achieves the following results on the evaluation set:
- Loss: 0.7377
- Wer: 0.2954
- Cer: 0.0681
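
For reference, WER and CER are word- and character-level error rates (lower is better). A minimal sketch of how they are typically computed with the `evaluate` library follows; the transcript strings below are hypothetical, and the actual training script may have computed the metrics differently.

```python
import evaluate

# Load the standard WER and CER metrics from the evaluate library.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical transcripts, only to illustrate the computation.
predictions = ["the model transcribed this"]
references = ["the model transcribed this sentence"]

# WER = (substitutions + insertions + deletions) / words in the reference;
# CER is the same edit-distance ratio computed over characters.
print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```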
## Model description
More information needed
## Intended uses & limitations
More information needed
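
No usage example was provided. Assuming the checkpoint is a Wav2Vec2-BERT model fine-tuned with a CTC head for Bemba speech recognition (which the base model and the WER/CER metrics suggest, but the card does not confirm), transcription with `transformers` would look roughly like this; the repository id and audio file name are placeholders:

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

# Placeholder repo id, inferred from the model name above.
model_id = "w2v-bert-2.0-BIG_C-AMMI-BEMBA_SPEECH_CORPUS-BEMBA-189hrs-V1"

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# The feature extractor expects 16 kHz mono audio.
speech, sr = librosa.load("bemba_sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame, then let
# batch_decode collapse repeats and strip blank tokens.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```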
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.025
- num_epochs: 100
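
The training script itself is not included; as a rough sketch, the values above map onto `transformers.TrainingArguments` as follows (`output_dir` is a placeholder, and anything not listed above is left at its default):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-bemba-189hrs-v1",  # placeholder path
    learning_rate=5e-6,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 2 * 2 = 4
    lr_scheduler_type="cosine",
    warmup_ratio=0.025,
    num_train_epochs=100,
    # Listed Adam settings (these match the transformers defaults):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```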
### Training results
| Training Loss | Epoch | Step   | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 1.0463        | 1.0   | 22932  | 0.3791          | 0.4291 | 0.0921 |
| 0.4544        | 2.0   | 45864  | 0.3468          | 0.3564 | 0.0737 |
| 0.4096        | 3.0   | 68796  | 0.3095          | 0.3097 | 0.0671 |
| 0.3795        | 4.0   | 91728  | 0.2922          | 0.3008 | 0.0674 |
| 0.358         | 5.0   | 114660 | 0.2915          | 0.2962 | 0.0662 |
| 0.3387        | 6.0   | 137592 | 0.2844          | 0.2832 | 0.0636 |
| 0.3203        | 7.0   | 160524 | 0.2843          | 0.2761 | 0.0622 |
| 0.3           | 8.0   | 183456 | 0.2880          | 0.2864 | 0.0636 |
| 0.2778        | 9.0   | 206388 | 0.2906          | 0.2782 | 0.0630 |
| 0.2543        | 10.0  | 229320 | 0.2986          | 0.2863 | 0.0649 |
| 0.2312        | 11.0  | 252252 | 0.3220          | 0.2829 | 0.0643 |
| 0.2082        | 12.0  | 275184 | 0.3376          | 0.2836 | 0.0644 |
| 0.1864        | 13.0  | 298116 | 0.3579          | 0.2832 | 0.0653 |
| 0.167         | 14.0  | 321048 | 0.3896          | 0.2836 | 0.0641 |
| 0.1498        | 15.0  | 343980 | 0.4124          | 0.2902 | 0.0653 |
| 0.1351        | 16.0  | 366912 | 0.4565          | 0.2852 | 0.0649 |
| 0.1216        | 17.0  | 389844 | 0.4517          | 0.2967 | 0.0671 |
| 0.1102        | 18.0  | 412776 | 0.4959          | 0.2912 | 0.0659 |
| 0.0999        | 19.0  | 435708 | 0.5536          | 0.2909 | 0.0652 |
| 0.091         | 20.0  | 458640 | 0.5782          | 0.2932 | 0.0667 |
| 0.0828        | 21.0  | 481572 | 0.6136          | 0.2949 | 0.0663 |
| 0.0752        | 22.0  | 504504 | 0.6310          | 0.2900 | 0.0662 |
| 0.0679        | 23.0  | 527436 | 0.6588          | 0.2925 | 0.0659 |
| 0.0614        | 24.0  | 550368 | 0.6938          | 0.2945 | 0.0671 |
| 0.0559        | 25.0  | 573300 | 0.7247          | 0.2959 | 0.0667 |
| 0.0499        | 26.0  | 596232 | 0.7278          | 0.2927 | 0.0663 |
| 0.045         | 27.0  | 619164 | 0.7377          | 0.2954 | 0.0681 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.1.0+cu118
- Datasets 3.0.1
- Tokenizers 0.20.0