utakumi committed
Commit ea90dce
1 Parent(s): f9deaf6

Model save

Files changed (2):
  1. README.md +268 -0
  2. model.safetensors +1 -1
README.md ADDED
---
library_name: transformers
license: apache-2.0
base_model: rinna/japanese-hubert-base
tags:
- generated_from_trainer
datasets:
- common_voice_13_0
metrics:
- wer
model-index:
- name: Hubert-common_voice-ja-demo-roma-cosine-3e-4
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: common_voice_13_0
      type: common_voice_13_0
      config: ja
      split: test
      args: ja
    metrics:
    - name: Wer
      type: wer
      value: 0.9977818108489614
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Hubert-common_voice-ja-demo-roma-cosine-3e-4

This model is a fine-tuned version of [rinna/japanese-hubert-base](https://huggingface.co/rinna/japanese-hubert-base) on the common_voice_13_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4271
- Wer: 0.9978
- Cer: 0.1642

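Both metrics above reduce to a normalized Levenshtein (edit) distance: WER over whitespace-separated words, CER over characters. A minimal pure-Python sketch for illustration only; the actual scoring during training would typically go through the `evaluate`/`jiwer` implementations rather than this code:

```python
# Illustrative re-implementation of WER/CER as normalized edit distance.
# Not the exact scorer used by the Trainer run above.

def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (single-row DP)."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, start=1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,        # deletion
                dp[j - 1] + 1,    # insertion
                prev + (r != h),  # substitution (or free match)
            )
    return dp[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: edit distance over whitespace-split words."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: edit distance over characters."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

On romanized Japanese (the "roma" in the model name), word-level scoring is very sensitive to segmentation: `wer("kyou wa ii tenki", "kyouha ii tenki")` is already 0.5 although only one space differs, which is consistent with the near-1.0 WER alongside the much lower CER reported above.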
## Model description

More information needed

## Intended uses & limitations

More information needed

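A HuBERT encoder fine-tuned for speech recognition in this way presumably carries a CTC head (the usual recipe for ASR fine-tunes of `rinna/japanese-hubert-base`; an assumption, since the card does not say). At inference, the common first step is greedy CTC decoding: take the argmax label per audio frame, merge consecutive repeats, and drop blank tokens. A toy sketch with a hypothetical three-character vocabulary:

```python
# Greedy CTC collapse: argmax frames -> merge repeats -> drop blanks.
# Illustrative sketch of what a CTC processor's batch_decode does;
# a real tokenizer also handles word delimiters, casing, etc.

BLANK = 0  # CTC blank token id (id 0 is an assumption, not from the card)

def ctc_greedy_decode(frame_ids, id_to_char):
    out = []
    prev = None
    for t in frame_ids:  # one predicted id per audio frame
        if t != prev and t != BLANK:
            out.append(id_to_char[t])
        prev = t
    return "".join(out)

vocab = {1: "a", 2: "b", 3: "c"}
# frames: a a <blank> a b b <blank> c
print(ctc_greedy_decode([1, 1, 0, 1, 2, 2, 0, 3], vocab))  # -> "aabc"
```

Note how the blank between the two runs of `a` keeps both copies: repeats are only merged when they are adjacent, which is what lets CTC emit doubled characters.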
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 12500
- num_epochs: 50.0
- mixed_precision_training: Native AMP

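The effective batch size of 32 is the per-device batch of 16 times 2 gradient-accumulation steps. Note also that the 12,500 warmup steps cover roughly two-thirds of the run (about 18,800 optimizer steps over 50 epochs, judging by the final step count in the training results). The resulting learning-rate curve can be sketched as follows; this matches the shape of linear-warmup-plus-cosine-decay schedules such as `get_cosine_schedule_with_warmup` in transformers, but it is a sketch, not the library code:

```python
import math

BASE_LR = 3e-4   # learning_rate above
WARMUP = 12_500  # lr_scheduler_warmup_steps above
TOTAL = 18_800   # optimizer steps in this run (50 epochs)

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step."""
    if step < WARMUP:  # linear ramp-up to the peak rate
        return BASE_LR * step / WARMUP
    # cosine decay from the peak down to 0 over the remaining steps
    progress = (step - WARMUP) / (TOTAL - WARMUP)
    return BASE_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at(0))       # 0.0
print(lr_at(12_500))  # peak: 3e-4
print(lr_at(18_800))  # 0.0 at the end of training
```

With warmup this long, the model trains at or near the peak rate only briefly before the cosine decay begins.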
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| No log | 0.2660 | 100 | 15.8236 | 1.7925 | 1.6163 |
| No log | 0.5319 | 200 | 6.4207 | 1.0 | 0.9276 |
| No log | 0.7979 | 300 | 5.4193 | 1.0 | 0.9276 |
| No log | 1.0638 | 400 | 4.9073 | 1.0 | 0.9276 |
| 7.456 | 1.3298 | 500 | 4.3586 | 1.0 | 0.9276 |
| 7.456 | 1.5957 | 600 | 3.8341 | 1.0 | 0.9276 |
| 7.456 | 1.8617 | 700 | 3.4148 | 1.0 | 0.9276 |
| 7.456 | 2.1277 | 800 | 3.1047 | 1.0 | 0.9276 |
| 7.456 | 2.3936 | 900 | 2.9612 | 1.0 | 0.9276 |
| 3.2563 | 2.6596 | 1000 | 2.9027 | 1.0 | 0.9276 |
| 3.2563 | 2.9255 | 1100 | 2.8767 | 1.0 | 0.9276 |
| 3.2563 | 3.1915 | 1200 | 2.8588 | 1.0 | 0.9276 |
| 3.2563 | 3.4574 | 1300 | 2.8038 | 1.0 | 0.9276 |
| 3.2563 | 3.7234 | 1400 | 1.9933 | 1.0 | 0.8359 |
| 2.5032 | 3.9894 | 1500 | 1.1381 | 0.9998 | 0.3811 |
| 2.5032 | 4.2553 | 1600 | 0.8252 | 0.9984 | 0.2746 |
| 2.5032 | 4.5213 | 1700 | 0.6763 | 0.9974 | 0.2507 |
| 2.5032 | 4.7872 | 1800 | 0.6011 | 0.9980 | 0.2417 |
| 2.5032 | 5.0532 | 1900 | 0.5514 | 0.9976 | 0.2308 |
| 0.678 | 5.3191 | 2000 | 0.5129 | 0.9986 | 0.2249 |
| 0.678 | 5.5851 | 2100 | 0.4958 | 0.9944 | 0.2325 |
| 0.678 | 5.8511 | 2200 | 0.4717 | 0.9978 | 0.2195 |
| 0.678 | 6.1170 | 2300 | 0.4652 | 0.9992 | 0.2173 |
| 0.678 | 6.3830 | 2400 | 0.4555 | 0.9974 | 0.2122 |
| 0.4238 | 6.6489 | 2500 | 0.4361 | 0.9988 | 0.2116 |
| 0.4238 | 6.9149 | 2600 | 0.4181 | 0.9990 | 0.2063 |
| 0.4238 | 7.1809 | 2700 | 0.4186 | 0.9958 | 0.2012 |
| 0.4238 | 7.4468 | 2800 | 0.4254 | 0.9994 | 0.2035 |
| 0.4238 | 7.7128 | 2900 | 0.4014 | 0.9984 | 0.1981 |
| 0.3375 | 7.9787 | 3000 | 0.3877 | 0.9976 | 0.1980 |
| 0.3375 | 8.2447 | 3100 | 0.3868 | 0.9982 | 0.1926 |
| 0.3375 | 8.5106 | 3200 | 0.3740 | 0.9978 | 0.1903 |
| 0.3375 | 8.7766 | 3300 | 0.3645 | 0.9982 | 0.1855 |
| 0.3375 | 9.0426 | 3400 | 0.3586 | 0.9992 | 0.1824 |
| 0.2553 | 9.3085 | 3500 | 0.3328 | 0.9980 | 0.1775 |
| 0.2553 | 9.5745 | 3600 | 0.3401 | 0.9978 | 0.1744 |
| 0.2553 | 9.8404 | 3700 | 0.3124 | 0.9976 | 0.1727 |
| 0.2553 | 10.1064 | 3800 | 0.3225 | 0.9988 | 0.1709 |
| 0.2553 | 10.3723 | 3900 | 0.3311 | 0.9974 | 0.1760 |
| 0.2035 | 10.6383 | 4000 | 0.3098 | 0.9980 | 0.1705 |
| 0.2035 | 10.9043 | 4100 | 0.3244 | 0.9980 | 0.1714 |
| 0.2035 | 11.1702 | 4200 | 0.3280 | 0.9925 | 0.1686 |
| 0.2035 | 11.4362 | 4300 | 0.3134 | 0.9984 | 0.1705 |
| 0.2035 | 11.7021 | 4400 | 0.3025 | 0.9988 | 0.1667 |
| 0.1772 | 11.9681 | 4500 | 0.3156 | 0.9980 | 0.1690 |
| 0.1772 | 12.2340 | 4600 | 0.3213 | 0.9968 | 0.1657 |
| 0.1772 | 12.5 | 4700 | 0.3184 | 0.9976 | 0.1702 |
| 0.1772 | 12.7660 | 4800 | 0.3348 | 0.9990 | 0.1659 |
| 0.1772 | 13.0319 | 4900 | 0.3175 | 0.9978 | 0.1655 |
| 0.1542 | 13.2979 | 5000 | 0.3414 | 0.9998 | 0.1680 |
| 0.1542 | 13.5638 | 5100 | 0.3143 | 0.9994 | 0.1701 |
| 0.1542 | 13.8298 | 5200 | 0.3204 | 0.9986 | 0.1688 |
| 0.1542 | 14.0957 | 5300 | 0.3549 | 0.9990 | 0.1662 |
| 0.1542 | 14.3617 | 5400 | 0.4091 | 0.9974 | 0.1666 |
| 0.1449 | 14.6277 | 5500 | 0.3908 | 0.9986 | 0.1676 |
| 0.1449 | 14.8936 | 5600 | 0.3706 | 0.9984 | 0.1662 |
| 0.1449 | 15.1596 | 5700 | 0.3972 | 0.9972 | 0.1641 |
| 0.1449 | 15.4255 | 5800 | 0.3462 | 0.9984 | 0.1653 |
| 0.1449 | 15.6915 | 5900 | 0.3544 | 0.9984 | 0.1699 |
| 0.1396 | 15.9574 | 6000 | 0.3397 | 0.9988 | 0.1682 |
| 0.1396 | 16.2234 | 6100 | 0.3452 | 0.9984 | 0.1680 |
| 0.1396 | 16.4894 | 6200 | 0.3534 | 0.9982 | 0.1665 |
| 0.1396 | 16.7553 | 6300 | 0.3502 | 0.9986 | 0.1703 |
| 0.1396 | 17.0213 | 6400 | 0.3475 | 0.9978 | 0.1701 |
| 0.1293 | 17.2872 | 6500 | 0.3350 | 0.9988 | 0.1681 |
| 0.1293 | 17.5532 | 6600 | 0.3442 | 0.9978 | 0.1694 |
| 0.1293 | 17.8191 | 6700 | 0.3342 | 0.9988 | 0.1687 |
| 0.1293 | 18.0851 | 6800 | 0.3669 | 0.9986 | 0.1696 |
| 0.1293 | 18.3511 | 6900 | 0.3404 | 0.9970 | 0.1691 |
| 0.1276 | 18.6170 | 7000 | 0.3464 | 0.9990 | 0.1679 |
| 0.1276 | 18.8830 | 7100 | 0.3496 | 0.9984 | 0.1695 |
| 0.1276 | 19.1489 | 7200 | 0.3436 | 0.9968 | 0.1698 |
| 0.1276 | 19.4149 | 7300 | 0.3605 | 0.9954 | 0.1690 |
| 0.1276 | 19.6809 | 7400 | 0.3582 | 0.9974 | 0.1687 |
| 0.1264 | 19.9468 | 7500 | 0.3576 | 0.9982 | 0.1740 |
| 0.1264 | 20.2128 | 7600 | 0.3669 | 0.9986 | 0.1726 |
| 0.1264 | 20.4787 | 7700 | 0.3618 | 0.9980 | 0.1706 |
| 0.1264 | 20.7447 | 7800 | 0.3475 | 0.9990 | 0.1746 |
| 0.1264 | 21.0106 | 7900 | 0.3425 | 0.9972 | 0.1715 |
| 0.1219 | 21.2766 | 8000 | 0.3685 | 0.9984 | 0.1716 |
| 0.1219 | 21.5426 | 8100 | 0.3803 | 0.9990 | 0.1756 |
| 0.1219 | 21.8085 | 8200 | 0.3663 | 0.9984 | 0.1797 |
| 0.1219 | 22.0745 | 8300 | 0.3642 | 0.9978 | 0.1710 |
| 0.1219 | 22.3404 | 8400 | 0.3805 | 0.9988 | 0.1737 |
| 0.1177 | 22.6064 | 8500 | 0.3630 | 0.9986 | 0.1747 |
| 0.1177 | 22.8723 | 8600 | 0.4001 | 0.9974 | 0.1753 |
| 0.1177 | 23.1383 | 8700 | 0.3758 | 0.9978 | 0.1758 |
| 0.1177 | 23.4043 | 8800 | 0.3771 | 0.9984 | 0.1747 |
| 0.1177 | 23.6702 | 8900 | 0.4001 | 0.9984 | 0.1794 |
| 0.1241 | 23.9362 | 9000 | 0.3929 | 0.9998 | 0.1769 |
| 0.1241 | 24.2021 | 9100 | 0.3732 | 0.9992 | 0.1752 |
| 0.1241 | 24.4681 | 9200 | 0.3813 | 0.9984 | 0.1738 |
| 0.1241 | 24.7340 | 9300 | 0.4128 | 0.9990 | 0.1794 |
| 0.1241 | 25.0 | 9400 | 0.3756 | 0.9990 | 0.1751 |
| 0.121 | 25.2660 | 9500 | 0.3916 | 0.9990 | 0.1797 |
| 0.121 | 25.5319 | 9600 | 0.3882 | 0.9984 | 0.1824 |
| 0.121 | 25.7979 | 9700 | 0.3917 | 0.9976 | 0.1838 |
| 0.121 | 26.0638 | 9800 | 0.3928 | 0.9984 | 0.1770 |
| 0.121 | 26.3298 | 9900 | 0.3929 | 0.9996 | 0.1801 |
| 0.1206 | 26.5957 | 10000 | 0.3985 | 0.9988 | 0.1781 |
| 0.1206 | 26.8617 | 10100 | 0.3799 | 0.9994 | 0.1771 |
| 0.1206 | 27.1277 | 10200 | 0.4023 | 0.9994 | 0.1786 |
| 0.1206 | 27.3936 | 10300 | 0.4000 | 0.9992 | 0.1784 |
| 0.1206 | 27.6596 | 10400 | 0.3756 | 0.9976 | 0.1825 |
| 0.124 | 27.9255 | 10500 | 0.3971 | 0.9986 | 0.1779 |
| 0.124 | 28.1915 | 10600 | 0.4240 | 0.9996 | 0.1789 |
| 0.124 | 28.4574 | 10700 | 0.3718 | 0.9980 | 0.1792 |
| 0.124 | 28.7234 | 10800 | 0.4114 | 0.9986 | 0.1800 |
| 0.124 | 28.9894 | 10900 | 0.4174 | 0.9978 | 0.1800 |
| 0.122 | 29.2553 | 11000 | 0.4062 | 0.9988 | 0.1853 |
| 0.122 | 29.5213 | 11100 | 0.4203 | 0.9978 | 0.1861 |
| 0.122 | 29.7872 | 11200 | 0.4376 | 0.9986 | 0.1861 |
| 0.122 | 30.0532 | 11300 | 0.4094 | 0.9992 | 0.1812 |
| 0.122 | 30.3191 | 11400 | 0.4100 | 0.9988 | 0.1819 |
| 0.125 | 30.5851 | 11500 | 0.3997 | 0.9982 | 0.1869 |
| 0.125 | 30.8511 | 11600 | 0.4437 | 0.9990 | 0.1820 |
| 0.125 | 31.1170 | 11700 | 0.4423 | 0.9990 | 0.1858 |
| 0.125 | 31.3830 | 11800 | 0.4217 | 0.9988 | 0.1895 |
| 0.125 | 31.6489 | 11900 | 0.4612 | 0.9992 | 0.1966 |
| 0.1294 | 31.9149 | 12000 | 0.4386 | 0.9974 | 0.1862 |
| 0.1294 | 32.1809 | 12100 | 0.4278 | 0.9984 | 0.1892 |
| 0.1294 | 32.4468 | 12200 | 0.4187 | 0.9984 | 0.1856 |
| 0.1294 | 32.7128 | 12300 | 0.4047 | 0.9986 | 0.1829 |
| 0.1294 | 32.9787 | 12400 | 0.4231 | 0.9980 | 0.1852 |
| 0.1275 | 33.2447 | 12500 | 0.4124 | 0.9994 | 0.1843 |
| 0.1275 | 33.5106 | 12600 | 0.4191 | 0.9994 | 0.1870 |
| 0.1275 | 33.7766 | 12700 | 0.4846 | 0.9980 | 0.1927 |
| 0.1275 | 34.0426 | 12800 | 0.4212 | 0.9984 | 0.1845 |
| 0.1275 | 34.3085 | 12900 | 0.4326 | 0.9984 | 0.1837 |
| 0.134 | 34.5745 | 13000 | 0.4104 | 0.9992 | 0.1880 |
| 0.134 | 34.8404 | 13100 | 0.3965 | 0.9978 | 0.1877 |
| 0.134 | 35.1064 | 13200 | 0.4147 | 0.9994 | 0.1844 |
| 0.134 | 35.3723 | 13300 | 0.4251 | 0.9978 | 0.1859 |
| 0.134 | 35.6383 | 13400 | 0.4458 | 0.9994 | 0.1902 |
| 0.1293 | 35.9043 | 13500 | 0.4354 | 0.9992 | 0.1942 |
| 0.1293 | 36.1702 | 13600 | 0.4198 | 0.9996 | 0.1863 |
| 0.1293 | 36.4362 | 13700 | 0.4279 | 0.9986 | 0.1891 |
| 0.1293 | 36.7021 | 13800 | 0.4115 | 0.9976 | 0.1846 |
| 0.1293 | 36.9681 | 13900 | 0.4359 | 0.9986 | 0.1868 |
| 0.1193 | 37.2340 | 14000 | 0.4316 | 0.9992 | 0.1905 |
| 0.1193 | 37.5 | 14100 | 0.4389 | 0.9994 | 0.1899 |
| 0.1193 | 37.7660 | 14200 | 0.4215 | 0.9992 | 0.1825 |
| 0.1193 | 38.0319 | 14300 | 0.4793 | 0.9992 | 0.1890 |
| 0.1193 | 38.2979 | 14400 | 0.4382 | 0.9984 | 0.1854 |
| 0.1132 | 38.5638 | 14500 | 0.4011 | 0.9976 | 0.1824 |
| 0.1132 | 38.8298 | 14600 | 0.4283 | 0.9950 | 0.1798 |
| 0.1132 | 39.0957 | 14700 | 0.4304 | 0.9980 | 0.1803 |
| 0.1132 | 39.3617 | 14800 | 0.4049 | 0.9984 | 0.1811 |
| 0.1132 | 39.6277 | 14900 | 0.4146 | 0.9988 | 0.1785 |
| 0.0949 | 39.8936 | 15000 | 0.4499 | 0.9998 | 0.1817 |
| 0.0949 | 40.1596 | 15100 | 0.4205 | 0.9978 | 0.1791 |
| 0.0949 | 40.4255 | 15200 | 0.4419 | 0.9986 | 0.1807 |
| 0.0949 | 40.6915 | 15300 | 0.4283 | 0.9982 | 0.1801 |
| 0.0949 | 40.9574 | 15400 | 0.4327 | 0.9996 | 0.1769 |
| 0.0876 | 41.2234 | 15500 | 0.4488 | 0.9996 | 0.1781 |
| 0.0876 | 41.4894 | 15600 | 0.4194 | 0.9990 | 0.1736 |
| 0.0876 | 41.7553 | 15700 | 0.4320 | 0.9992 | 0.1754 |
| 0.0876 | 42.0213 | 15800 | 0.4347 | 0.9990 | 0.1729 |
| 0.0876 | 42.2872 | 15900 | 0.4819 | 0.9994 | 0.1744 |
| 0.0725 | 42.5532 | 16000 | 0.4491 | 0.9990 | 0.1752 |
| 0.0725 | 42.8191 | 16100 | 0.4537 | 0.9986 | 0.1741 |
| 0.0725 | 43.0851 | 16200 | 0.4588 | 0.9984 | 0.1717 |
| 0.0725 | 43.3511 | 16300 | 0.4417 | 0.9982 | 0.1715 |
| 0.0725 | 43.6170 | 16400 | 0.4554 | 0.9984 | 0.1729 |
| 0.0615 | 43.8830 | 16500 | 0.4464 | 0.9996 | 0.1719 |
| 0.0615 | 44.1489 | 16600 | 0.4726 | 0.9982 | 0.1718 |
| 0.0615 | 44.4149 | 16700 | 0.4456 | 0.9980 | 0.1704 |
| 0.0615 | 44.6809 | 16800 | 0.4247 | 0.9980 | 0.1693 |
| 0.0615 | 44.9468 | 16900 | 0.4499 | 0.9986 | 0.1684 |
| 0.0524 | 45.2128 | 17000 | 0.4610 | 0.9988 | 0.1667 |
| 0.0524 | 45.4787 | 17100 | 0.4252 | 0.9982 | 0.1675 |
| 0.0524 | 45.7447 | 17200 | 0.4185 | 0.9980 | 0.1670 |
| 0.0524 | 46.0106 | 17300 | 0.4377 | 0.9980 | 0.1665 |
| 0.0524 | 46.2766 | 17400 | 0.4387 | 0.9992 | 0.1666 |
| 0.0466 | 46.5426 | 17500 | 0.4388 | 0.9986 | 0.1659 |
| 0.0466 | 46.8085 | 17600 | 0.4408 | 0.9986 | 0.1654 |
| 0.0466 | 47.0745 | 17700 | 0.4277 | 0.9984 | 0.1651 |
| 0.0466 | 47.3404 | 17800 | 0.4244 | 0.9986 | 0.1650 |
| 0.0466 | 47.6064 | 17900 | 0.4296 | 0.9978 | 0.1644 |
| 0.0393 | 47.8723 | 18000 | 0.4341 | 0.9984 | 0.1648 |
| 0.0393 | 48.1383 | 18100 | 0.4337 | 0.9982 | 0.1646 |
| 0.0393 | 48.4043 | 18200 | 0.4331 | 0.9978 | 0.1642 |
| 0.0393 | 48.6702 | 18300 | 0.4281 | 0.9980 | 0.1641 |
| 0.0393 | 48.9362 | 18400 | 0.4268 | 0.9982 | 0.1641 |
| 0.0373 | 49.2021 | 18500 | 0.4275 | 0.9982 | 0.1641 |
| 0.0373 | 49.4681 | 18600 | 0.4269 | 0.9982 | 0.1641 |
| 0.0373 | 49.7340 | 18700 | 0.4268 | 0.9978 | 0.1640 |
| 0.0373 | 50.0 | 18800 | 0.4271 | 0.9978 | 0.1642 |

### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d50c9f8852b6bdfec8290087fa0e894d660f098362ff935a06e9c76f1199481b
+ oid sha256:bdff64db6176a414c79fa8188077685d282e2ddacc2d2d8830c56628984e02de
  size 377607636