---
license: mit
base_model: arslanarjumand/wav2vec-reptiles
tags:
  - generated_from_trainer
model-index:
  - name: wav2vec-repeat
    results: []
---

# wav2vec-repeat

This model is a fine-tuned version of [arslanarjumand/wav2vec-reptiles](https://huggingface.co/arslanarjumand/wav2vec-reptiles) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 205.9549
- Pcc Accuracy: 0.8004
- Pcc Fluency: 0.7759
- Pcc Total Score: 0.8207
- Pcc Content: 0.7220
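
The `Pcc` metrics are presumably Pearson correlation coefficients between predicted and reference scores for each scoring dimension. Since the task head of this checkpoint is not documented, the sketch below only shows loading and a forward pass via the `Auto*` classes; the repo id is real, but the expected input format (16 kHz mono audio) and the resolved architecture are assumptions to verify against the repository files.

```python
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModel

# Hypothetical usage sketch: the exact task head is undocumented, so this
# runs a forward pass through whatever architecture the config resolves to.
repo_id = "arslanarjumand/wav2vec-repeat"
feature_extractor = AutoFeatureExtractor.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)
model.eval()

# wav2vec 2.0 models conventionally expect 16 kHz mono audio (assumption).
waveform = np.zeros(16_000, dtype=np.float32)  # 1 s of silence as a placeholder
inputs = feature_extractor(waveform, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```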

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` configuration is sketched after this list):

- learning_rate: 2.5e-05
- train_batch_size: 4
- eval_batch_size: 6
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 50
- mixed_precision_training: Native AMP
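
These values map directly onto the Hugging Face `TrainingArguments` API. A minimal reconstruction is sketched below; `output_dir` and the evaluation cadence are assumptions, and the Adam settings listed above already match the Trainer defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above (sketch, not the
# author's original script). Defaults already give Adam betas=(0.9, 0.999)
# and epsilon=1e-08.
training_args = TrainingArguments(
    output_dir="wav2vec-repeat",    # assumed
    learning_rate=2.5e-5,
    per_device_train_batch_size=4,  # train_batch_size: 4
    per_device_eval_batch_size=6,   # eval_batch_size: 6
    seed=42,
    gradient_accumulation_steps=4,  # total_train_batch_size: 16
    lr_scheduler_type="cosine",
    warmup_ratio=0.5,               # lr_scheduler_warmup_ratio: 0.5
    num_train_epochs=50,
    fp16=True,                      # mixed_precision_training: Native AMP
)
```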

### Training results

| Training Loss | Epoch | Step | Validation Loss | Pcc Accuracy | Pcc Fluency | Pcc Total Score | Pcc Content |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:-----------:|:---------------:|:-----------:|
| 507.295       | 3.54  | 500  | 538.7184        | 0.2592       | 0.2368      | 0.2807          | 0.3206      |
| 267.4833      | 7.08  | 1000 | 374.0983        | 0.5787       | 0.5582      | 0.5900          | 0.5040      |
| 246.7156      | 10.62 | 1500 | 483.3237        | 0.6618       | 0.6387      | 0.6761          | 0.5837      |
| 269.7238      | 14.16 | 2000 | 446.4642        | 0.6964       | 0.6691      | 0.7131          | 0.6288      |
| 289.3261      | 17.7  | 2500 | 244.4726        | 0.7201       | 0.6928      | 0.7371          | 0.6482      |
| 249.89        | 21.24 | 3000 | 413.8036        | 0.7340       | 0.7052      | 0.7548          | 0.6796      |
| 235.8593      | 24.78 | 3500 | 251.3629        | 0.7472       | 0.7217      | 0.7676          | 0.6808      |
| 217.7143      | 28.32 | 4000 | 212.4162        | 0.7779       | 0.7547      | 0.7973          | 0.6948      |
| 123.7326      | 31.86 | 4500 | 362.4697        | 0.7782       | 0.7528      | 0.7987          | 0.7062      |
| 132.7905      | 35.4  | 5000 | 228.9714        | 0.7826       | 0.7603      | 0.8021          | 0.6987      |
| 111.7989      | 38.94 | 5500 | 189.2367        | 0.7985       | 0.7754      | 0.8188          | 0.7169      |
| 104.5979      | 42.48 | 6000 | 271.8181        | 0.7929       | 0.7692      | 0.8143          | 0.7192      |
| 115.256       | 46.02 | 6500 | 220.4324        | 0.8008       | 0.7753      | 0.8209          | 0.7230      |
| 86.3804       | 49.56 | 7000 | 205.9549        | 0.8004       | 0.7759      | 0.8207          | 0.7220      |
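
For reference, the `Pcc` columns can be reproduced with a standard Pearson correlation between predicted and reference scores. The sketch below uses `scipy.stats.pearsonr` with purely illustrative arrays; it shows the metric, not this model's evaluation pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative values only: predictions vs. references for one dimension.
predicted = np.array([72.0, 85.5, 64.0, 90.0])
reference = np.array([70.0, 88.0, 60.0, 93.0])

pcc, p_value = pearsonr(predicted, reference)
print(f"PCC: {pcc:.4f}")  # linear correlation between predictions and references
```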

### Framework versions

- Transformers 4.38.1
- Pytorch 2.1.2
- Datasets 2.17.1
- Tokenizers 0.15.2