---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-tiny.en
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: dysarthria-tiny-en
    results: []
---

# dysarthria-tiny-en

This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2870
- Wer: 52.8131
- Cer: 36.9112
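
As a quick sanity check, the checkpoint can be loaded with the standard `transformers` automatic-speech-recognition pipeline. The sketch below is illustrative only: the hub id `hiwden00/dysarthria-tiny-en` and the audio path `sample.wav` are assumptions, not part of this card.

```python
# Minimal inference sketch (hub id and audio path are assumptions).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="hiwden00/dysarthria-tiny-en",  # assumed repository id
)

# Whisper models expect 16 kHz mono audio; the pipeline resamples file inputs.
result = asr("sample.wav")  # placeholder audio path
print(result["text"])
```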

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the illustrative sketch after this list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
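
For orientation, the values above map onto the Hugging Face `Seq2SeqTrainingArguments` API roughly as sketched below. This is not the exact training script: the `output_dir` and the `fp16` flag (standing in for "Native AMP") are assumptions, and the listed Adam betas and epsilon are the Trainer defaults.

```python
# Hedged sketch: expressing the hyperparameters above with the Trainer API.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="dysarthria-tiny-en",   # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,                         # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```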

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Wer      | Cer      |
|:-------------:|:--------:|:----:|:---------------:|:--------:|:--------:|
| 0.0255        | 13.5135  | 500  | 0.4034          | 199.4555 | 178.1629 |
| 0.0005        | 27.0270  | 1000 | 0.2994          | 17.9673  | 11.4738  |
| 0.0002        | 40.5405  | 1500 | 0.2941          | 17.2414  | 11.1373  |
| 0.0002        | 54.0541  | 2000 | 0.2917          | 16.6969  | 10.7672  |
| 0.0001        | 67.5676  | 2500 | 0.2892          | 15.9710  | 10.3634  |
| 0.0001        | 81.0811  | 3000 | 0.2886          | 53.7205  | 37.3486  |
| 0.0001        | 94.5946  | 3500 | 0.2882          | 53.1760  | 37.2813  |
| 0.0001        | 108.1081 | 4000 | 0.2881          | 53.1760  | 37.3486  |
| 0.0001        | 121.6216 | 4500 | 0.2870          | 52.8131  | 36.9112  |
| 0.0           | 135.1351 | 5000 | 0.2870          | 52.8131  | 36.9112  |
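
The Wer and Cer columns appear to be percentages (error rates scaled by 100). A minimal sketch of computing such scores with the `evaluate` library is shown below; the prediction and reference strings are placeholders, and the scaling by 100 is an assumption about how the numbers above were produced.

```python
# Hedged sketch: word and character error rate via the `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["the quick brown fox"]        # placeholder model outputs
references = ["the quick brown fox jumps"]   # placeholder ground truth

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```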

### Framework versions

- Transformers 4.45.1
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0