---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: my_awesome_speach_model
  results: []
---

# my_awesome_speach_model

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.2361
- Accuracy: 0.6610
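
The card does not state the downstream task, but the accuracy metric and the wav2vec2 backbone point to audio classification. A minimal inference sketch, assuming the checkpoint is published under the hypothetical repo id `shevek/my_awesome_speach_model`:

```python
# A minimal inference sketch. Assumptions: the model was trained for
# audio classification (suggested by the accuracy metric), and the
# repo id below is hypothetical -- substitute the actual checkpoint.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="shevek/my_awesome_speach_model",  # hypothetical repo id
)

# wav2vec2-base expects 16 kHz mono audio; the pipeline resamples
# local files automatically when ffmpeg is available.
predictions = classifier("example.wav")  # placeholder audio file
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```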

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
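
The total train batch size of 128 is the per-device batch size of 32 times 4 gradient-accumulation steps. As a hedged sketch, these settings map onto `transformers` `TrainingArguments` roughly as follows; `output_dir` and the per-epoch evaluation cadence are assumptions, and the model/data setup is not documented on this card:

```python
# A sketch of TrainingArguments matching the hyperparameters above.
# output_dir and eval_strategy are assumptions; dataset, feature
# extractor, and model setup are omitted because the card omits them.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_speach_model",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective batch size
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    eval_strategy="epoch",  # the results table logs one eval per epoch
)
```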

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:---:|:---:|:---:|:---:|:---:|
| No log | 0.8 | 3 | 2.1052 | 0.0847 |
| No log | 1.7333 | 6 | 2.0712 | 0.1610 |
| No log | 2.6667 | 9 | 2.0195 | 0.6356 |
| 2.0648 | 3.8667 | 13 | 1.9087 | 0.6864 |
| 2.0648 | 4.8 | 16 | 1.8101 | 0.6864 |
| 2.0648 | 5.7333 | 19 | 1.7046 | 0.6864 |
| 1.8256 | 6.6667 | 22 | 1.5664 | 0.6864 |
| 1.8256 | 7.8667 | 26 | 1.4010 | 0.6864 |
| 1.8256 | 8.8 | 29 | 1.3322 | 0.6864 |
| 1.4713 | 9.7333 | 32 | 1.2896 | 0.6864 |
| 1.4713 | 10.6667 | 35 | 1.2617 | 0.6864 |
| 1.4713 | 11.8667 | 39 | 1.2342 | 0.6864 |
| 1.307 | 12.8 | 42 | 1.2206 | 0.6864 |
| 1.307 | 13.7333 | 45 | 1.2094 | 0.6864 |
| 1.307 | 14.6667 | 48 | 1.1998 | 0.6864 |
| 1.2241 | 15.8667 | 52 | 1.1920 | 0.6864 |
| 1.2241 | 16.8 | 55 | 1.1868 | 0.6864 |
| 1.2241 | 17.7333 | 58 | 1.1850 | 0.6864 |
| 1.2053 | 18.6667 | 61 | 1.2018 | 0.6864 |
| 1.2053 | 19.8667 | 65 | 1.1801 | 0.6864 |
| 1.2053 | 20.8 | 68 | 1.1851 | 0.6864 |
| 1.1815 | 21.7333 | 71 | 1.1699 | 0.6864 |
| 1.1815 | 22.6667 | 74 | 1.1746 | 0.6864 |
| 1.1815 | 23.8667 | 78 | 1.2902 | 0.6864 |
| 1.1471 | 24.8 | 81 | 1.1601 | 0.6864 |
| 1.1471 | 25.7333 | 84 | 1.1527 | 0.6864 |
| 1.1471 | 26.6667 | 87 | 1.1841 | 0.6864 |
| 1.1109 | 27.8667 | 91 | 1.1406 | 0.6864 |
| 1.1109 | 28.8 | 94 | 1.1454 | 0.6949 |
| 1.1109 | 29.7333 | 97 | 1.2087 | 0.6525 |
| 1.0994 | 30.6667 | 100 | 1.1712 | 0.6949 |
| 1.0994 | 31.8667 | 104 | 1.1769 | 0.7034 |
| 1.0994 | 32.8 | 107 | 1.1852 | 0.6949 |
| 1.0516 | 33.7333 | 110 | 1.2119 | 0.6780 |
| 1.0516 | 34.6667 | 113 | 1.1934 | 0.6949 |
| 1.0516 | 35.8667 | 117 | 1.2235 | 0.6610 |
| 1.0547 | 36.8 | 120 | 1.1929 | 0.6780 |
| 1.0547 | 37.7333 | 123 | 1.1711 | 0.6780 |
| 1.0547 | 38.6667 | 126 | 1.1893 | 0.6864 |
| 0.9975 | 39.8667 | 130 | 1.1604 | 0.6864 |
| 0.9975 | 40.8 | 133 | 1.1802 | 0.6864 |
| 0.9975 | 41.7333 | 136 | 1.1613 | 0.6864 |
| 0.9975 | 42.6667 | 139 | 1.1852 | 0.6780 |
| 0.9829 | 43.8667 | 143 | 1.1511 | 0.7119 |
| 0.9829 | 44.8 | 146 | 1.2872 | 0.6356 |
| 0.9829 | 45.7333 | 149 | 1.1891 | 0.6864 |
| 1.0212 | 46.6667 | 152 | 1.1853 | 0.6780 |
| 1.0212 | 47.8667 | 156 | 1.3700 | 0.6017 |
| 1.0212 | 48.8 | 159 | 1.2899 | 0.6271 |
| 1.012 | 49.7333 | 162 | 1.2226 | 0.6695 |
| 1.012 | 50.6667 | 165 | 1.2168 | 0.6695 |
| 1.012 | 51.8667 | 169 | 1.2985 | 0.6356 |
| 1.0166 | 52.8 | 172 | 1.2924 | 0.6441 |
| 1.0166 | 53.7333 | 175 | 1.2145 | 0.6525 |
| 1.0166 | 54.6667 | 178 | 1.2080 | 0.6695 |
| 0.9709 | 55.8667 | 182 | 1.3386 | 0.6356 |
| 0.9709 | 56.8 | 185 | 1.2637 | 0.6610 |
| 0.9709 | 57.7333 | 188 | 1.1988 | 0.6949 |
| 0.9882 | 58.6667 | 191 | 1.2233 | 0.6610 |
| 0.9882 | 59.8667 | 195 | 1.3560 | 0.6441 |
| 0.9882 | 60.8 | 198 | 1.3280 | 0.6441 |
| 0.9324 | 61.7333 | 201 | 1.2938 | 0.6271 |
| 0.9324 | 62.6667 | 204 | 1.2439 | 0.6610 |
| 0.9324 | 63.8667 | 208 | 1.3100 | 0.6271 |
| 0.9331 | 64.8 | 211 | 1.3142 | 0.6356 |
| 0.9331 | 65.7333 | 214 | 1.2808 | 0.6525 |
| 0.9331 | 66.6667 | 217 | 1.2599 | 0.6525 |
| 0.9155 | 67.8667 | 221 | 1.2801 | 0.6525 |
| 0.9155 | 68.8 | 224 | 1.2173 | 0.6864 |
| 0.9155 | 69.7333 | 227 | 1.2677 | 0.6525 |
| 0.88 | 70.6667 | 230 | 1.2324 | 0.6780 |
| 0.88 | 71.8667 | 234 | 1.1966 | 0.6780 |
| 0.88 | 72.8 | 237 | 1.2495 | 0.6695 |
| 0.9119 | 73.7333 | 240 | 1.2212 | 0.6695 |
| 0.9119 | 74.6667 | 243 | 1.2157 | 0.6695 |
| 0.9119 | 75.8667 | 247 | 1.2324 | 0.6610 |
| 0.8721 | 76.8 | 250 | 1.2343 | 0.6695 |
| 0.8721 | 77.7333 | 253 | 1.2306 | 0.6610 |
| 0.8721 | 78.6667 | 256 | 1.2322 | 0.6610 |
| 0.8741 | 79.8667 | 260 | 1.2413 | 0.6695 |
| 0.8741 | 80.8 | 263 | 1.2184 | 0.6949 |
| 0.8741 | 81.7333 | 266 | 1.2102 | 0.6864 |
| 0.8741 | 82.6667 | 269 | 1.2311 | 0.6780 |
| 0.8509 | 83.8667 | 273 | 1.2596 | 0.6525 |
| 0.8509 | 84.8 | 276 | 1.2589 | 0.6525 |
| 0.8509 | 85.7333 | 279 | 1.2425 | 0.6695 |
| 0.8614 | 86.6667 | 282 | 1.2361 | 0.6695 |
| 0.8614 | 87.8667 | 286 | 1.2317 | 0.6695 |
| 0.8614 | 88.8 | 289 | 1.2295 | 0.6695 |
| 0.8762 | 89.7333 | 292 | 1.2310 | 0.6695 |
| 0.8762 | 90.6667 | 295 | 1.2352 | 0.6695 |
| 0.8762 | 91.8667 | 299 | 1.2362 | 0.6610 |
| 0.8594 | 92.2667 | 300 | 1.2361 | 0.6610 |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
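
To check that a local environment matches these pins, a quick sketch (import names are the standard ones for these libraries):

```python
# Print installed versions against the pins listed on this card; the
# "+cu121" suffix on the torch version depends on the installed wheel.
import datasets, tokenizers, torch, transformers

print("Transformers:", transformers.__version__)  # card: 4.46.2
print("PyTorch:", torch.__version__)              # card: 2.5.1+cu121
print("Datasets:", datasets.__version__)          # card: 3.1.0
print("Tokenizers:", tokenizers.__version__)      # card: 0.20.3
```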