---
license: cc-by-nc-4.0
base_model: facebook/mms-1b-all
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-large-mms-1b-kazakh-speech2ner-kscsyn-8b-4ep
  results: []
---

# wav2vec2-large-mms-1b-kazakh-speech2ner-kscsyn-8b-4ep

This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all); the training dataset was not recorded in the run configuration (it is logged as `None`).
It achieves the following results on the evaluation set:
- Loss: nan
- WER: 1.0

Note that the evaluation loss is NaN and the WER is 100%: as the training results below show, the loss collapsed to NaN around step 6000 and never recovered, so this checkpoint reflects a diverged run rather than a usable model.

## Model description

More information needed

## Intended uses & limitations

More information needed
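
Given the diverged run noted above, this checkpoint is likely to produce empty or degenerate output. For completeness, a minimal inference sketch is shown below, assuming the checkpoint and its processor live at the hypothetical local path `./wav2vec2-large-mms-1b-kazakh-speech2ner-kscsyn-8b-4ep`:

```python
# Minimal inference sketch (hypothetical checkpoint path; adjust to your setup).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "./wav2vec2-large-mms-1b-kazakh-speech2ner-kscsyn-8b-4ep"  # assumed path
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# MMS/wav2vec2 models expect 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```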

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 4
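
A hedged sketch of how these logged values map onto `TrainingArguments` (Transformers 4.33 field names; the data pipeline, collator, and multi-GPU launch are omitted, and `output_dir` is an assumption):

```python
# Sketch only: reconstructs the logged hyperparameters. Adam betas/epsilon match
# the Transformers defaults, so they need no explicit arguments here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-mms-1b-kazakh-speech2ner-kscsyn-8b-4ep",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=4,
    evaluation_strategy="steps",
    eval_steps=2000,  # matches the 2000-step evaluation cadence in the table below
)
```

The `distributed_type: multi-GPU` entry is handled by the launcher (e.g. `torchrun` or `accelerate launch`) rather than by `TrainingArguments` itself.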

### Training results

| Training Loss | Epoch | Step   | Validation Loss | WER    |
|:-------------:|:-----:|:------:|:---------------:|:------:|
| 6.6358        | 0.07  | 2000   | 6.5080          | 1.0000 |
| 6.6338        | 0.15  | 4000   | 6.5080          | 1.0000 |
| 0.0           | 0.22  | 6000   | nan             | 1.0    |
| 0.0           | 0.3   | 8000   | nan             | 1.0    |
| 0.0           | 0.37  | 10000  | nan             | 1.0    |
| 0.0           | 0.44  | 12000  | nan             | 1.0    |
| 0.0           | 0.52  | 14000  | nan             | 1.0    |
| 0.0           | 0.59  | 16000  | nan             | 1.0    |
| 0.0           | 0.66  | 18000  | nan             | 1.0    |
| 0.0           | 0.74  | 20000  | nan             | 1.0    |
| 0.0           | 0.81  | 22000  | nan             | 1.0    |
| 0.0           | 0.89  | 24000  | nan             | 1.0    |
| 0.0           | 0.96  | 26000  | nan             | 1.0    |
| 0.0           | 1.03  | 28000  | nan             | 1.0    |
| 0.0           | 1.11  | 30000  | nan             | 1.0    |
| 0.0           | 1.18  | 32000  | nan             | 1.0    |
| 0.0           | 1.25  | 34000  | nan             | 1.0    |
| 0.0           | 1.33  | 36000  | nan             | 1.0    |
| 0.0           | 1.4   | 38000  | nan             | 1.0    |
| 0.0           | 1.48  | 40000  | nan             | 1.0    |
| 0.0           | 1.55  | 42000  | nan             | 1.0    |
| 0.0           | 1.62  | 44000  | nan             | 1.0    |
| 0.0           | 1.7   | 46000  | nan             | 1.0    |
| 0.0           | 1.77  | 48000  | nan             | 1.0    |
| 0.0           | 1.84  | 50000  | nan             | 1.0    |
| 0.0           | 1.92  | 52000  | nan             | 1.0    |
| 0.0           | 1.99  | 54000  | nan             | 1.0    |
| 0.0           | 2.07  | 56000  | nan             | 1.0    |
| 0.0           | 2.14  | 58000  | nan             | 1.0    |
| 0.0           | 2.21  | 60000  | nan             | 1.0    |
| 0.0           | 2.29  | 62000  | nan             | 1.0    |
| 0.0           | 2.36  | 64000  | nan             | 1.0    |
| 0.0           | 2.43  | 66000  | nan             | 1.0    |
| 0.0           | 2.51  | 68000  | nan             | 1.0    |
| 0.0           | 2.58  | 70000  | nan             | 1.0    |
| 0.0           | 2.66  | 72000  | nan             | 1.0    |
| 0.0           | 2.73  | 74000  | nan             | 1.0    |
| 0.0           | 2.8   | 76000  | nan             | 1.0    |
| 0.0           | 2.88  | 78000  | nan             | 1.0    |
| 0.0           | 2.95  | 80000  | nan             | 1.0    |
| 0.0           | 3.02  | 82000  | nan             | 1.0    |
| 0.0           | 3.1   | 84000  | nan             | 1.0    |
| 0.0           | 3.17  | 86000  | nan             | 1.0    |
| 0.0           | 3.25  | 88000  | nan             | 1.0    |
| 0.0           | 3.32  | 90000  | nan             | 1.0    |
| 0.0           | 3.39  | 92000  | nan             | 1.0    |
| 0.0           | 3.47  | 94000  | nan             | 1.0    |
| 0.0           | 3.54  | 96000  | nan             | 1.0    |
| 0.0           | 3.61  | 98000  | nan             | 1.0    |
| 0.0           | 3.69  | 100000 | nan             | 1.0    |
| 0.0           | 3.76  | 102000 | nan             | 1.0    |
| 0.0           | 3.84  | 104000 | nan             | 1.0    |
| 0.0           | 3.91  | 106000 | nan             | 1.0    |
| 0.0           | 3.98  | 108000 | nan             | 1.0    |
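
A WER of 1.0 means essentially every reference word was wrong or missing. For reference, a minimal sketch of how this metric is typically computed with the `evaluate` library (the strings are illustrative placeholders, not real model output):

```python
# Minimal WER computation sketch using the `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["бұл мысал"]      # hypothetical model transcription
references = ["бұл бір мысал"]   # hypothetical reference transcript
print(wer_metric.compute(predictions=predictions, references=references))  # 0.333...
```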


### Framework versions

- Transformers 4.33.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3