whisper-large-v2-ec

This model is a fine-tuned version of openai/whisper-large-v2 on the default configuration of the wanasash/enwaucymraeg dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

  • Loss: 0.5119
  • WER: 0.2167
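
Since the repository ships standard Transformers weights, inference should work through the usual automatic-speech-recognition pipeline. The snippet below is a minimal sketch, not an official example from the authors: the audio file name is a placeholder, the half-precision and device choices are assumptions, and the language is presumed to be Welsh from the dataset name.

```python
import torch
from transformers import pipeline

# Load the fine-tuned checkpoint; dtype and device choices are assumptions.
asr = pipeline(
    "automatic-speech-recognition",
    model="wanasash/whisper-large-v2-ec",
    torch_dtype=torch.float16,
    device="cuda:0" if torch.cuda.is_available() else "cpu",
)

# "sample.wav" is a placeholder for your own audio file
# (presumably Welsh-language speech, given the dataset name).
result = asr("sample.wav", chunk_length_s=30)
print(result["text"])
```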

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5000
  • mixed_precision_training: Native AMP
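
As a rough illustration, these values map onto transformers' Seq2SeqTrainingArguments as sketched below. This is a hypothetical reconstruction, not the authors' actual script; output_dir is a placeholder and the surrounding Trainer setup is omitted.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical mapping of the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-8 are the library defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-ec",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # effective batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    seed=42,
    fp16=True,                       # native AMP mixed-precision training
)
```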

Training results

Training Loss | Epoch   | Step | Validation Loss | WER
------------- | ------- | ---- | --------------- | ------
0.0112        | 13.6054 | 1000 | 0.3912          | 0.2395
0.0004        | 27.2109 | 2000 | 0.4532          | 0.2245
0.0002        | 40.8163 | 3000 | 0.4882          | 0.2175
0.0001        | 54.4218 | 4000 | 0.5051          | 0.2148
0.0001        | 68.0272 | 5000 | 0.5119          | 0.2167
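
The WER column is presumably word error rate computed in the standard way. A minimal sketch using the evaluate library (an assumption about the metric implementation; the example strings are hypothetical, not taken from the evaluation split):

```python
import evaluate

# Hypothetical reference/prediction pair; the real references come from
# the wanasash/enwaucymraeg evaluation split.
references = ["mae'r tywydd yn braf heddiw"]
predictions = ["mae'r tywydd yn braf heddiw"]

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 0.0000 for a perfect match
```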

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1