
Whisper Small NSC part 1,2,3 (1000 steps) - Jarrett Er

This model is a fine-tuned version of openai/whisper-small (the training dataset is not specified in the card metadata). It achieves the following results on the evaluation set:

  • Loss: 0.2060
  • WER (word error rate): 9.5998
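The WER figure reported above is the word error rate: the word-level edit distance between the model's transcription and the reference, divided by the number of reference words. A minimal self-contained sketch of the computation (evaluation pipelines typically use a library such as `jiwer` or `evaluate` instead; the example strings are illustrative, not from this model's eval set):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate as a percentage: word-level edit distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = minimum edits to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # substitution or match
    return 100.0 * dp[len(ref)][len(hyp)] / len(ref)

# One dropped word out of six reference words -> 1/6 ~ 16.67% WER
print(round(wer("the cat sat on the mat", "the cat sat on mat"), 2))
```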

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • training_steps: 1000
  • mixed_precision_training: Native AMP

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER     |
|---------------|--------|------|-----------------|---------|
| 1.3689        | 0.2008 | 50   | 1.0720          | 62.3310 |
| 0.6001        | 0.4016 | 100  | 0.5734          | 27.9340 |
| 0.5761        | 0.6024 | 150  | 0.5917          | 19.3889 |
| 0.5189        | 0.8032 | 200  | 0.5905          | 20.2271 |
| 0.5282        | 1.0040 | 250  | 0.5710          | 19.7134 |
| 0.3914        | 1.2048 | 300  | 0.5133          | 18.0097 |
| 0.3690        | 1.4056 | 350  | 0.4591          | 18.4694 |
| 0.3881        | 1.6064 | 400  | 0.4335          | 16.4684 |
| 0.3504        | 1.8072 | 450  | 0.3882          | 16.5765 |
| 0.3390        | 2.0080 | 500  | 0.3516          | 14.6566 |
| 0.2437        | 2.2088 | 550  | 0.3359          | 13.7642 |
| 0.2482        | 2.4096 | 600  | 0.3126          | 13.1963 |
| 0.2327        | 2.6104 | 650  | 0.2898          | 12.7907 |
| 0.2041        | 2.8112 | 700  | 0.2685          | 11.7631 |
| 0.1824        | 3.0120 | 750  | 0.2538          | 10.9789 |
| 0.1457        | 3.2129 | 800  | 0.2479          | 10.5733 |
| 0.1617        | 3.4137 | 850  | 0.2304          | 10.9248 |
| 0.1571        | 3.6145 | 900  | 0.2167          | 9.5457  |
| 0.1564        | 3.8153 | 950  | 0.2087          | 9.3835  |
| 0.1503        | 4.0161 | 1000 | 0.2060          | 9.5998  |

Framework versions

  • PEFT 0.14.0
  • Transformers 4.45.2
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.1.dev0
  • Tokenizers 0.20.3
Model tree for Thecoder3281f/whisper-small-hi-nscpart123-1000

This model is a PEFT adapter for the base model openai/whisper-small.