Note: this repository is publicly accessible, but it is gated; you must log in to the Hugging Face Hub and accept its conditions (sharing your contact information) before you can access the files and content.

w2v-bert-2.0-lg-CV-Fleurs-100hrs-v10

This model is a fine-tuned version of facebook/w2v-bert-2.0. The auto-generated card lists the training dataset as "None"; per the model name, it appears to have been trained on roughly 100 hours of Luganda (lg) speech drawn from Common Voice and FLEURS. It achieves the following results on the evaluation set:

  • Loss: 0.4800
  • WER (word error rate): 0.2754
  • CER (character error rate): 0.0597
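
Since the card omits a usage snippet, here is a minimal inference sketch using the transformers library. It assumes the repository ships a CTC head and processor loadable via AutoModelForCTC/AutoProcessor (standard for Trainer-based fine-tunes of w2v-bert-2.0); sample.wav is a placeholder path, not a file provided with the model.

```python
import librosa
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "asr-africa/w2v-bert-2.0-lg-CV-Fleurs-100hrs-v10"

# The repo is gated: run `huggingface-cli login` after accepting
# its access conditions, or pass a token to from_pretrained.
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# w2v-bert-2.0 models expect 16 kHz mono audio.
# "sample.wav" is a placeholder for your own Luganda recording.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over each frame; batch_decode collapses
# repeated tokens and removes blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```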

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
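
The listed hyperparameters map onto transformers TrainingArguments as sketched below. This is a hedged reconstruction, not the authors' actual training script: output_dir is an assumed name, and the evaluation/save cadence is not stated on the card, so it is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-lg-CV-Fleurs-100hrs-v10",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=4,    # eval_batch_size: 4
    seed=42,
    gradient_accumulation_steps=2,   # total train batch size: 8 * 2 = 16
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed precision
)
```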

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 0.2998 | 0.9999 | 4598 | 0.2779 | 0.3505 | 0.0725 |
| 0.1902 | 2.0 | 9197 | 0.2643 | 0.3263 | 0.0678 |
| 0.1601 | 2.9999 | 13795 | 0.2585 | 0.3423 | 0.0700 |
| 0.1419 | 4.0 | 18394 | 0.2475 | 0.3051 | 0.0646 |
| 0.1267 | 4.9999 | 22992 | 0.2369 | 0.3006 | 0.0623 |
| 0.1148 | 6.0 | 27591 | 0.2636 | 0.3207 | 0.0652 |
| 0.1043 | 6.9999 | 32189 | 0.2497 | 0.2922 | 0.0619 |
| 0.0954 | 8.0 | 36788 | 0.2418 | 0.2913 | 0.0616 |
| 0.0875 | 8.9999 | 41386 | 0.2486 | 0.2867 | 0.0612 |
| 0.0785 | 10.0 | 45985 | 0.2383 | 0.2877 | 0.0605 |
| 0.0703 | 10.9999 | 50583 | 0.2641 | 0.2882 | 0.0611 |
| 0.0627 | 12.0 | 55182 | 0.2756 | 0.2923 | 0.0620 |
| 0.0549 | 12.9999 | 59780 | 0.2818 | 0.2829 | 0.0605 |
| 0.0478 | 14.0 | 64379 | 0.2831 | 0.2798 | 0.0596 |
| 0.0417 | 14.9999 | 68977 | 0.2906 | 0.2969 | 0.0620 |
| 0.0362 | 16.0 | 73576 | 0.2961 | 0.2837 | 0.0599 |
| 0.0315 | 16.9999 | 78174 | 0.3248 | 0.2955 | 0.0633 |
| 0.0277 | 18.0 | 82773 | 0.3073 | 0.2977 | 0.0622 |
| 0.0246 | 18.9999 | 87371 | 0.3547 | 0.2825 | 0.0604 |
| 0.0225 | 20.0 | 91970 | 0.3551 | 0.2784 | 0.0599 |
| 0.02 | 20.9999 | 96568 | 0.3572 | 0.2856 | 0.0603 |
| 0.0178 | 22.0 | 101167 | 0.3783 | 0.2835 | 0.0609 |
| 0.0166 | 22.9999 | 105765 | 0.3610 | 0.2929 | 0.0612 |
| 0.0149 | 24.0 | 110364 | 0.3656 | 0.3045 | 0.0631 |
| 0.0139 | 24.9999 | 114962 | 0.3641 | 0.2902 | 0.0610 |
| 0.0126 | 26.0 | 119561 | 0.3587 | 0.2909 | 0.0626 |
| 0.0116 | 26.9999 | 124159 | 0.3998 | 0.2974 | 0.0613 |
| 0.011 | 28.0 | 128758 | 0.4153 | 0.2870 | 0.0602 |
| 0.0105 | 28.9999 | 133356 | 0.3980 | 0.2862 | 0.0628 |
| 0.0094 | 30.0 | 137955 | 0.4220 | 0.2810 | 0.0605 |
| 0.0088 | 30.9999 | 142553 | 0.4333 | 0.2770 | 0.0601 |
| 0.0081 | 32.0 | 147152 | 0.4179 | 0.2855 | 0.0615 |
| 0.0075 | 32.9999 | 151750 | 0.4235 | 0.2850 | 0.0606 |
| 0.0071 | 34.0 | 156349 | 0.4131 | 0.2808 | 0.0606 |
| 0.0066 | 34.9999 | 160947 | 0.4215 | 0.2731 | 0.0588 |
| 0.0064 | 36.0 | 165546 | 0.4150 | 0.2798 | 0.0595 |
| 0.0058 | 36.9999 | 170144 | 0.4405 | 0.2816 | 0.0607 |
| 0.0057 | 38.0 | 174743 | 0.3887 | 0.2827 | 0.0595 |
| 0.0052 | 38.9999 | 179341 | 0.4458 | 0.2786 | 0.0603 |
| 0.0048 | 40.0 | 183940 | 0.4544 | 0.2781 | 0.0591 |
| 0.005 | 40.9999 | 188538 | 0.4546 | 0.2763 | 0.0603 |
| 0.0044 | 42.0 | 193137 | 0.4522 | 0.2706 | 0.0589 |
| 0.0043 | 42.9999 | 197735 | 0.4669 | 0.2776 | 0.0601 |
| 0.0042 | 44.0 | 202334 | 0.4486 | 0.2766 | 0.0600 |
| 0.0037 | 44.9999 | 206932 | 0.4678 | 0.2778 | 0.0606 |
| 0.0036 | 46.0 | 211531 | 0.4310 | 0.2675 | 0.0583 |
| 0.0035 | 46.9999 | 216129 | 0.4902 | 0.2777 | 0.0597 |
| 0.0035 | 48.0 | 220728 | 0.4687 | 0.2726 | 0.0597 |
| 0.0033 | 48.9999 | 225326 | 0.4355 | 0.2796 | 0.0605 |
| 0.0028 | 50.0 | 229925 | 0.4475 | 0.2776 | 0.0597 |
| 0.0029 | 50.9999 | 234523 | 0.4452 | 0.2730 | 0.0583 |
| 0.0029 | 52.0 | 239122 | 0.4574 | 0.2713 | 0.0590 |
| 0.0026 | 52.9999 | 243720 | 0.4653 | 0.2776 | 0.0601 |
| 0.0023 | 54.0 | 248319 | 0.4505 | 0.2741 | 0.0591 |
| 0.0025 | 54.9999 | 252917 | 0.4754 | 0.2739 | 0.0592 |
| 0.0024 | 56.0 | 257516 | 0.4578 | 0.2687 | 0.0568 |
| 0.0023 | 56.9999 | 262114 | 0.4597 | 0.2704 | 0.0576 |
| 0.002 | 58.0 | 266713 | 0.4800 | 0.2754 | 0.0597 |
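
The card does not say which tooling produced the WER/CER numbers; the Hugging Face evaluate library is one common choice, sketched below with purely hypothetical transcript strings for illustration.

```python
import evaluate

# Illustrative only: the card does not state which implementation
# computed the reported WER/CER.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical model output and reference transcript.
predictions = ["webale nyo ssebo"]
references = ["webale nnyo ssebo"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```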

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
Model size: 606M parameters (F32 tensors, safetensors format)