metadata
library_name: transformers
language:
  - ja
license: apache-2.0
base_model: rinna/japanese-hubert-base
tags:
  - automatic-speech-recognition
  - mozilla-foundation/common_voice_13_0
  - generated_from_trainer
datasets:
  - common_voice_13_0
metrics:
  - wer
model-index:
  - name: Hubert-common_voice-ja-demo-phonemes-cosine-3e-5
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: MOZILLA-FOUNDATION/COMMON_VOICE_13_0 - JA
          type: common_voice_13_0
          config: ja
          split: test
          args: 'Config: ja, Training split: train+validation, Eval split: test'
        metrics:
          - name: Wer
            type: wer
            value: 1

Hubert-common_voice-ja-demo-phonemes-cosine-3e-5

This model is a fine-tuned version of rinna/japanese-hubert-base on the MOZILLA-FOUNDATION/COMMON_VOICE_13_0 - JA dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

  • Loss: inf
  • Wer: 1.0
  • Cer: 0.2359
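
The sketch below shows one way to run inference with this checkpoint. It is a minimal example under assumptions, not the author's recipe: the repository id `utakumi/Hubert-common_voice-ja-demo-phonemes-cosine-3e-5`, loading the model with `HubertForCTC` and its processor with `Wav2Vec2Processor`, and the sample file name are all assumptions; since the model was trained on phonemes, the decoded output is expected to be a phoneme string rather than orthographic text, which is consistent with the WER of 1.0 alongside a much lower CER.

```python
import torch
import librosa
from transformers import HubertForCTC, Wav2Vec2Processor

# Assumed repository id, built from the model name above; adjust as needed.
MODEL_ID = "utakumi/Hubert-common_voice-ja-demo-phonemes-cosine-3e-5"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = HubertForCTC.from_pretrained(MODEL_ID)
model.eval()

# HuBERT expects 16 kHz mono audio; "sample.wav" is a placeholder path.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame, then decode.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```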

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 12500
  • num_epochs: 20.0
  • mixed_precision_training: Native AMP
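
As a rough reconstruction, the hyperparameters above map onto a `TrainingArguments` configuration along the following lines. This is a sketch under assumptions: the output directory is a placeholder, the evaluation and logging cadence is inferred from the results table below, and the surrounding CTC training script is not reproduced.

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the list above; values not
# listed there (output_dir, eval/logging cadence) are assumptions.
training_args = TrainingArguments(
    output_dir="Hubert-common_voice-ja-demo-phonemes-cosine-3e-5",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size 16 * 2 = 32
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=12500,
    num_train_epochs=20.0,
    fp16=True,                       # "Native AMP" mixed-precision training
    eval_strategy="steps",
    eval_steps=100,                  # inferred from the results table below
    logging_steps=500,               # training loss appears every 500 steps
)
```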

Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.2660  | 100  | inf             | 1.8204 | 4.9067 |
| No log        | 0.5319  | 200  | inf             | 1.5926 | 4.6323 |
| No log        | 0.7979  | 300  | inf             | 1.1770 | 1.9637 |
| No log        | 1.0638  | 400  | inf             | 1.0    | 0.9817 |
| 14.493        | 1.3298  | 500  | inf             | 1.0    | 0.9817 |
| 14.493        | 1.5957  | 600  | inf             | 1.0    | 0.9817 |
| 14.493        | 1.8617  | 700  | inf             | 1.0    | 0.9817 |
| 14.493        | 2.1277  | 800  | inf             | 1.0    | 0.9817 |
| 14.493        | 2.3936  | 900  | inf             | 1.0    | 0.9817 |
| 6.5744        | 2.6596  | 1000 | 6.8080          | 1.0    | 0.9817 |
| 6.5744        | 2.9255  | 1100 | 6.5972          | 1.0    | 0.9817 |
| 6.5744        | 3.1915  | 1200 | inf             | 1.0    | 0.9817 |
| 6.5744        | 3.4574  | 1300 | inf             | 1.0    | 0.9817 |
| 6.5744        | 3.7234  | 1400 | inf             | 1.0    | 0.9817 |
| 5.5193        | 3.9894  | 1500 | inf             | 1.0    | 0.9817 |
| 5.5193        | 4.2553  | 1600 | inf             | 1.0    | 0.9817 |
| 5.5193        | 4.5213  | 1700 | inf             | 1.0    | 0.9817 |
| 5.5193        | 4.7872  | 1800 | inf             | 1.0    | 0.9817 |
| 5.5193        | 5.0532  | 1900 | inf             | 1.0    | 0.9817 |
| 4.5578        | 5.3191  | 2000 | inf             | 1.0    | 0.9817 |
| 4.5578        | 5.5851  | 2100 | inf             | 1.0    | 0.9817 |
| 4.5578        | 5.8511  | 2200 | inf             | 1.0    | 0.9817 |
| 4.5578        | 6.1170  | 2300 | inf             | 1.0    | 0.9817 |
| 4.5578        | 6.3830  | 2400 | inf             | 1.0    | 0.9817 |
| 3.6943        | 6.6489  | 2500 | inf             | 1.0    | 0.9817 |
| 3.6943        | 6.9149  | 2600 | inf             | 1.0    | 0.9817 |
| 3.6943        | 7.1809  | 2700 | inf             | 1.0    | 0.9817 |
| 3.6943        | 7.4468  | 2800 | inf             | 1.0    | 0.9817 |
| 3.6943        | 7.7128  | 2900 | 3.1572          | 1.0    | 0.9817 |
| 3.1932        | 7.9787  | 3000 | inf             | 1.0    | 0.9817 |
| 3.1932        | 8.2447  | 3100 | inf             | 1.0    | 0.9817 |
| 3.1932        | 8.5106  | 3200 | inf             | 1.0    | 0.9817 |
| 3.1932        | 8.7766  | 3300 | inf             | 1.0    | 0.9817 |
| 3.1932        | 9.0426  | 3400 | inf             | 1.0    | 0.9817 |
| 3.0309        | 9.3085  | 3500 | inf             | 1.0    | 0.9817 |
| 3.0309        | 9.5745  | 3600 | inf             | 1.0    | 0.9817 |
| 3.0309        | 9.8404  | 3700 | inf             | 1.0    | 0.9817 |
| 3.0309        | 10.1064 | 3800 | inf             | 1.0    | 0.9817 |
| 3.0309        | 10.3723 | 3900 | inf             | 1.0    | 0.9817 |
| 2.9704        | 10.6383 | 4000 | inf             | 1.0    | 0.9817 |
| 2.9704        | 10.9043 | 4100 | inf             | 1.0    | 0.9817 |
| 2.9704        | 11.1702 | 4200 | inf             | 1.0    | 0.9049 |
| 2.9704        | 11.4362 | 4300 | inf             | 1.0    | 0.7254 |
| 2.9704        | 11.7021 | 4400 | inf             | 1.0    | 0.4365 |
| 2.2767        | 11.9681 | 4500 | 1.5675          | 1.0    | 0.3732 |
| 2.2767        | 12.2340 | 4600 | inf             | 1.0    | 0.3455 |
| 2.2767        | 12.5    | 4700 | inf             | 1.0    | 0.3277 |
| 2.2767        | 12.7660 | 4800 | inf             | 1.0    | 0.3053 |
| 2.2767        | 13.0319 | 4900 | inf             | 1.0    | 0.2935 |
| 1.2873        | 13.2979 | 5000 | inf             | 1.0    | 0.2784 |
| 1.2873        | 13.5638 | 5100 | inf             | 1.0    | 0.2684 |
| 1.2873        | 13.8298 | 5200 | inf             | 1.0    | 0.2678 |
| 1.2873        | 14.0957 | 5300 | inf             | 1.0    | 0.2616 |
| 1.2873        | 14.3617 | 5400 | 0.8214          | 1.0    | 0.2608 |
| 0.9318        | 14.6277 | 5500 | inf             | 1.0    | 0.2564 |
| 0.9318        | 14.8936 | 5600 | inf             | 1.0    | 0.2544 |
| 0.9318        | 15.1596 | 5700 | inf             | 1.0    | 0.2525 |
| 0.9318        | 15.4255 | 5800 | inf             | 1.0    | 0.2510 |
| 0.9318        | 15.6915 | 5900 | inf             | 1.0    | 0.2527 |
| 0.754         | 15.9574 | 6000 | inf             | 1.0    | 0.2499 |
| 0.754         | 16.2234 | 6100 | 0.6672          | 1.0    | 0.2485 |
| 0.754         | 16.4894 | 6200 | inf             | 1.0    | 0.2464 |
| 0.754         | 16.7553 | 6300 | inf             | 1.0    | 0.2467 |
| 0.754         | 17.0213 | 6400 | inf             | 1.0    | 0.2411 |
| 0.6421        | 17.2872 | 6500 | inf             | 1.0    | 0.2411 |
| 0.6421        | 17.5532 | 6600 | inf             | 1.0    | 0.2418 |
| 0.6421        | 17.8191 | 6700 | inf             | 1.0    | 0.2386 |
| 0.6421        | 18.0851 | 6800 | inf             | 0.9996 | 0.2387 |
| 0.6421        | 18.3511 | 6900 | inf             | 1.0    | 0.2381 |
| 0.568         | 18.6170 | 7000 | inf             | 1.0    | 0.2391 |
| 0.568         | 18.8830 | 7100 | inf             | 1.0    | 0.2370 |
| 0.568         | 19.1489 | 7200 | inf             | 1.0    | 0.2344 |
| 0.568         | 19.4149 | 7300 | inf             | 1.0    | 0.2364 |
| 0.568         | 19.6809 | 7400 | inf             | 1.0    | 0.2347 |
| 0.5259        | 19.9468 | 7500 | inf             | 1.0    | 0.2334 |

Framework versions

  • Transformers 4.47.0.dev0
  • Pytorch 2.5.1+cu124
  • Datasets 3.1.0
  • Tokenizers 0.20.3