Hikiyo committed
Commit 3e3232e
1 Parent(s): 6434340

update model card README.md

Files changed (1)
  1. README.md +408 -13
README.md CHANGED
@@ -1,20 +1,415 @@
  ---
- library_name: peft
  ---
  ## Training procedure

- The following `bitsandbytes` quantization config was used during training:
- - load_in_8bit: False
- - load_in_4bit: True
- - llm_int8_threshold: 6.0
- - llm_int8_skip_modules: None
- - llm_int8_enable_fp32_cpu_offload: False
- - llm_int8_has_fp16_weight: False
- - bnb_4bit_quant_type: nf4
- - bnb_4bit_use_double_quant: False
- - bnb_4bit_compute_dtype: float16
- ### Framework versions

- - PEFT 0.4.0

  ---
+ license: mit
+ base_model: vicgalle/gpt2-alpaca-gpt4
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: gpt2-alpaca-pandalm
+   results: []
  ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # gpt2-alpaca-pandalm
+
+ This model is a fine-tuned version of [vicgalle/gpt2-alpaca-gpt4](https://huggingface.co/vicgalle/gpt2-alpaca-gpt4) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8219
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
  ## Training procedure

+ ### Training hyperparameters

+ The following hyperparameters were used during training:
+ - learning_rate: 0.0002
+ - train_batch_size: 4
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: cosine
+ - lr_scheduler_warmup_ratio: 0.03
+ - num_epochs: 1
+
+ ### Training results

+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:-----:|:---------------:|
+ | No log | 0.0 | 200 | 2.2796 |
+ | No log | 0.01 | 400 | 1.7930 |
+ | No log | 0.01 | 600 | 1.2870 |
+ | No log | 0.01 | 800 | 1.1460 |
+ | No log | 0.01 | 1000 | 1.0742 |
+ | No log | 0.02 | 1200 | 1.0431 |
+ | No log | 0.02 | 1400 | 1.0250 |
+ | No log | 0.02 | 1600 | 1.0108 |
+ | No log | 0.03 | 1800 | 0.9998 |
+ | No log | 0.03 | 2000 | 0.9937 |
+ | No log | 0.03 | 2200 | 0.9791 |
+ | No log | 0.03 | 2400 | 0.9817 |
+ | No log | 0.04 | 2600 | 0.9617 |
+ | No log | 0.04 | 2800 | 0.9580 |
+ | 1.2199 | 0.04 | 3000 | 0.9757 |
+ | 1.2199 | 0.04 | 3200 | 0.9541 |
+ | 1.2199 | 0.05 | 3400 | 0.9548 |
+ | 1.2199 | 0.05 | 3600 | 0.9485 |
+ | 1.2199 | 0.05 | 3800 | 0.9395 |
+ | 1.2199 | 0.06 | 4000 | 0.9413 |
+ | 1.2199 | 0.06 | 4200 | 0.9336 |
+ | 1.2199 | 0.06 | 4400 | 0.9369 |
+ | 1.2199 | 0.06 | 4600 | 0.9346 |
+ | 1.2199 | 0.07 | 4800 | 0.9277 |
+ | 1.2199 | 0.07 | 5000 | 0.9255 |
+ | 1.2199 | 0.07 | 5200 | 0.9253 |
+ | 1.2199 | 0.08 | 5400 | 0.9152 |
+ | 1.2199 | 0.08 | 5600 | 0.9203 |
+ | 1.2199 | 0.08 | 5800 | 0.9244 |
+ | 0.9222 | 0.08 | 6000 | 0.9178 |
+ | 0.9222 | 0.09 | 6200 | 0.9230 |
+ | 0.9222 | 0.09 | 6400 | 0.9109 |
+ | 0.9222 | 0.09 | 6600 | 0.9132 |
+ | 0.9222 | 0.09 | 6800 | 0.9159 |
+ | 0.9222 | 0.1 | 7000 | 0.9090 |
+ | 0.9222 | 0.1 | 7200 | 0.9073 |
+ | 0.9222 | 0.1 | 7400 | 0.9115 |
+ | 0.9222 | 0.11 | 7600 | 0.9125 |
+ | 0.9222 | 0.11 | 7800 | 0.9087 |
+ | 0.9222 | 0.11 | 8000 | 0.9103 |
+ | 0.9222 | 0.11 | 8200 | 0.9061 |
+ | 0.9222 | 0.12 | 8400 | 0.9047 |
+ | 0.9222 | 0.12 | 8600 | 0.9025 |
+ | 0.9222 | 0.12 | 8800 | 0.9023 |
+ | 0.8883 | 0.13 | 9000 | 0.8949 |
+ | 0.8883 | 0.13 | 9200 | 0.8939 |
+ | 0.8883 | 0.13 | 9400 | 0.8942 |
+ | 0.8883 | 0.13 | 9600 | 0.8993 |
+ | 0.8883 | 0.14 | 9800 | 0.8925 |
+ | 0.8883 | 0.14 | 10000 | 0.8891 |
+ | 0.8883 | 0.14 | 10200 | 0.8874 |
+ | 0.8883 | 0.15 | 10400 | 0.8941 |
+ | 0.8883 | 0.15 | 10600 | 0.8905 |
+ | 0.8883 | 0.15 | 10800 | 0.8863 |
+ | 0.8883 | 0.15 | 11000 | 0.8916 |
+ | 0.8883 | 0.16 | 11200 | 0.8902 |
+ | 0.8883 | 0.16 | 11400 | 0.8851 |
+ | 0.8883 | 0.16 | 11600 | 0.8832 |
+ | 0.8883 | 0.16 | 11800 | 0.8824 |
+ | 0.8719 | 0.17 | 12000 | 0.8793 |
+ | 0.8719 | 0.17 | 12200 | 0.8797 |
+ | 0.8719 | 0.17 | 12400 | 0.8810 |
+ | 0.8719 | 0.18 | 12600 | 0.8796 |
+ | 0.8719 | 0.18 | 12800 | 0.8749 |
+ | 0.8719 | 0.18 | 13000 | 0.8740 |
+ | 0.8719 | 0.18 | 13200 | 0.8757 |
+ | 0.8719 | 0.19 | 13400 | 0.8767 |
+ | 0.8719 | 0.19 | 13600 | 0.8778 |
+ | 0.8719 | 0.19 | 13800 | 0.8793 |
+ | 0.8719 | 0.2 | 14000 | 0.8776 |
+ | 0.8719 | 0.2 | 14200 | 0.8740 |
+ | 0.8719 | 0.2 | 14400 | 0.8731 |
+ | 0.8719 | 0.2 | 14600 | 0.8729 |
+ | 0.8719 | 0.21 | 14800 | 0.8733 |
+ | 0.8605 | 0.21 | 15000 | 0.8739 |
+ | 0.8605 | 0.21 | 15200 | 0.8669 |
+ | 0.8605 | 0.21 | 15400 | 0.8629 |
+ | 0.8605 | 0.22 | 15600 | 0.8673 |
+ | 0.8605 | 0.22 | 15800 | 0.8653 |
+ | 0.8605 | 0.22 | 16000 | 0.8703 |
+ | 0.8605 | 0.23 | 16200 | 0.8685 |
+ | 0.8605 | 0.23 | 16400 | 0.8693 |
+ | 0.8605 | 0.23 | 16600 | 0.8684 |
+ | 0.8605 | 0.23 | 16800 | 0.8629 |
+ | 0.8605 | 0.24 | 17000 | 0.8643 |
+ | 0.8605 | 0.24 | 17200 | 0.8625 |
+ | 0.8605 | 0.24 | 17400 | 0.8604 |
+ | 0.8605 | 0.25 | 17600 | 0.8599 |
+ | 0.8605 | 0.25 | 17800 | 0.8617 |
+ | 0.8537 | 0.25 | 18000 | 0.8618 |
+ | 0.8537 | 0.25 | 18200 | 0.8608 |
+ | 0.8537 | 0.26 | 18400 | 0.8626 |
+ | 0.8537 | 0.26 | 18600 | 0.8607 |
+ | 0.8537 | 0.26 | 18800 | 0.8577 |
+ | 0.8537 | 0.26 | 19000 | 0.8584 |
+ | 0.8537 | 0.27 | 19200 | 0.8597 |
+ | 0.8537 | 0.27 | 19400 | 0.8561 |
+ | 0.8537 | 0.27 | 19600 | 0.8578 |
+ | 0.8537 | 0.28 | 19800 | 0.8545 |
+ | 0.8537 | 0.28 | 20000 | 0.8539 |
+ | 0.8537 | 0.28 | 20200 | 0.8578 |
+ | 0.8537 | 0.28 | 20400 | 0.8536 |
+ | 0.8537 | 0.29 | 20600 | 0.8527 |
+ | 0.8537 | 0.29 | 20800 | 0.8551 |
+ | 0.8472 | 0.29 | 21000 | 0.8542 |
+ | 0.8472 | 0.3 | 21200 | 0.8547 |
+ | 0.8472 | 0.3 | 21400 | 0.8528 |
+ | 0.8472 | 0.3 | 21600 | 0.8540 |
+ | 0.8472 | 0.3 | 21800 | 0.8503 |
+ | 0.8472 | 0.31 | 22000 | 0.8498 |
+ | 0.8472 | 0.31 | 22200 | 0.8502 |
+ | 0.8472 | 0.31 | 22400 | 0.8522 |
+ | 0.8472 | 0.32 | 22600 | 0.8499 |
+ | 0.8472 | 0.32 | 22800 | 0.8511 |
+ | 0.8472 | 0.32 | 23000 | 0.8503 |
+ | 0.8472 | 0.32 | 23200 | 0.8498 |
+ | 0.8472 | 0.33 | 23400 | 0.8463 |
+ | 0.8472 | 0.33 | 23600 | 0.8488 |
+ | 0.8472 | 0.33 | 23800 | 0.8510 |
+ | 0.8421 | 0.33 | 24000 | 0.8479 |
+ | 0.8421 | 0.34 | 24200 | 0.8486 |
+ | 0.8421 | 0.34 | 24400 | 0.8485 |
+ | 0.8421 | 0.34 | 24600 | 0.8484 |
+ | 0.8421 | 0.35 | 24800 | 0.8495 |
+ | 0.8421 | 0.35 | 25000 | 0.8475 |
+ | 0.8421 | 0.35 | 25200 | 0.8484 |
+ | 0.8421 | 0.35 | 25400 | 0.8479 |
+ | 0.8421 | 0.36 | 25600 | 0.8479 |
+ | 0.8421 | 0.36 | 25800 | 0.8452 |
+ | 0.8421 | 0.36 | 26000 | 0.8481 |
+ | 0.8421 | 0.37 | 26200 | 0.8479 |
+ | 0.8421 | 0.37 | 26400 | 0.8442 |
+ | 0.8421 | 0.37 | 26600 | 0.8441 |
+ | 0.8421 | 0.37 | 26800 | 0.8440 |
+ | 0.8377 | 0.38 | 27000 | 0.8412 |
+ | 0.8377 | 0.38 | 27200 | 0.8421 |
+ | 0.8377 | 0.38 | 27400 | 0.8432 |
+ | 0.8377 | 0.38 | 27600 | 0.8425 |
+ | 0.8377 | 0.39 | 27800 | 0.8432 |
+ | 0.8377 | 0.39 | 28000 | 0.8432 |
+ | 0.8377 | 0.39 | 28200 | 0.8421 |
+ | 0.8377 | 0.4 | 28400 | 0.8435 |
+ | 0.8377 | 0.4 | 28600 | 0.8432 |
+ | 0.8377 | 0.4 | 28800 | 0.8417 |
+ | 0.8377 | 0.4 | 29000 | 0.8403 |
+ | 0.8377 | 0.41 | 29200 | 0.8444 |
+ | 0.8377 | 0.41 | 29400 | 0.8425 |
+ | 0.8377 | 0.41 | 29600 | 0.8422 |
+ | 0.8377 | 0.42 | 29800 | 0.8438 |
+ | 0.8291 | 0.42 | 30000 | 0.8399 |
+ | 0.8291 | 0.42 | 30200 | 0.8450 |
+ | 0.8291 | 0.42 | 30400 | 0.8421 |
+ | 0.8291 | 0.43 | 30600 | 0.8402 |
+ | 0.8291 | 0.43 | 30800 | 0.8441 |
+ | 0.8291 | 0.43 | 31000 | 0.8418 |
+ | 0.8291 | 0.44 | 31200 | 0.8422 |
+ | 0.8291 | 0.44 | 31400 | 0.8376 |
+ | 0.8291 | 0.44 | 31600 | 0.8386 |
+ | 0.8291 | 0.44 | 31800 | 0.8412 |
+ | 0.8291 | 0.45 | 32000 | 0.8447 |
+ | 0.8291 | 0.45 | 32200 | 0.8428 |
+ | 0.8291 | 0.45 | 32400 | 0.8409 |
+ | 0.8291 | 0.45 | 32600 | 0.8375 |
+ | 0.8291 | 0.46 | 32800 | 0.8354 |
+ | 0.8279 | 0.46 | 33000 | 0.8360 |
+ | 0.8279 | 0.46 | 33200 | 0.8373 |
+ | 0.8279 | 0.47 | 33400 | 0.8372 |
+ | 0.8279 | 0.47 | 33600 | 0.8393 |
+ | 0.8279 | 0.47 | 33800 | 0.8363 |
+ | 0.8279 | 0.47 | 34000 | 0.8370 |
+ | 0.8279 | 0.48 | 34200 | 0.8359 |
+ | 0.8279 | 0.48 | 34400 | 0.8336 |
+ | 0.8279 | 0.48 | 34600 | 0.8334 |
+ | 0.8279 | 0.49 | 34800 | 0.8322 |
+ | 0.8279 | 0.49 | 35000 | 0.8326 |
+ | 0.8279 | 0.49 | 35200 | 0.8315 |
+ | 0.8279 | 0.49 | 35400 | 0.8354 |
+ | 0.8279 | 0.5 | 35600 | 0.8360 |
+ | 0.8279 | 0.5 | 35800 | 0.8321 |
+ | 0.8254 | 0.5 | 36000 | 0.8341 |
+ | 0.8254 | 0.5 | 36200 | 0.8350 |
+ | 0.8254 | 0.51 | 36400 | 0.8344 |
+ | 0.8254 | 0.51 | 36600 | 0.8335 |
+ | 0.8254 | 0.51 | 36800 | 0.8337 |
+ | 0.8254 | 0.52 | 37000 | 0.8305 |
+ | 0.8254 | 0.52 | 37200 | 0.8308 |
+ | 0.8254 | 0.52 | 37400 | 0.8319 |
+ | 0.8254 | 0.52 | 37600 | 0.8320 |
+ | 0.8254 | 0.53 | 37800 | 0.8292 |
+ | 0.8254 | 0.53 | 38000 | 0.8316 |
+ | 0.8254 | 0.53 | 38200 | 0.8329 |
+ | 0.8254 | 0.54 | 38400 | 0.8314 |
+ | 0.8254 | 0.54 | 38600 | 0.8301 |
+ | 0.8254 | 0.54 | 38800 | 0.8319 |
+ | 0.822 | 0.54 | 39000 | 0.8325 |
+ | 0.822 | 0.55 | 39200 | 0.8313 |
+ | 0.822 | 0.55 | 39400 | 0.8305 |
+ | 0.822 | 0.55 | 39600 | 0.8303 |
+ | 0.822 | 0.55 | 39800 | 0.8283 |
+ | 0.822 | 0.56 | 40000 | 0.8315 |
+ | 0.822 | 0.56 | 40200 | 0.8280 |
+ | 0.822 | 0.56 | 40400 | 0.8316 |
+ | 0.822 | 0.57 | 40600 | 0.8303 |
+ | 0.822 | 0.57 | 40800 | 0.8317 |
+ | 0.822 | 0.57 | 41000 | 0.8302 |
+ | 0.822 | 0.57 | 41200 | 0.8298 |
+ | 0.822 | 0.58 | 41400 | 0.8313 |
+ | 0.822 | 0.58 | 41600 | 0.8304 |
+ | 0.822 | 0.58 | 41800 | 0.8289 |
+ | 0.819 | 0.59 | 42000 | 0.8293 |
+ | 0.819 | 0.59 | 42200 | 0.8315 |
+ | 0.819 | 0.59 | 42400 | 0.8250 |
+ | 0.819 | 0.59 | 42600 | 0.8264 |
+ | 0.819 | 0.6 | 42800 | 0.8282 |
+ | 0.819 | 0.6 | 43000 | 0.8290 |
+ | 0.819 | 0.6 | 43200 | 0.8283 |
+ | 0.819 | 0.61 | 43400 | 0.8291 |
+ | 0.819 | 0.61 | 43600 | 0.8271 |
+ | 0.819 | 0.61 | 43800 | 0.8261 |
+ | 0.819 | 0.61 | 44000 | 0.8276 |
+ | 0.819 | 0.62 | 44200 | 0.8274 |
+ | 0.819 | 0.62 | 44400 | 0.8279 |
+ | 0.819 | 0.62 | 44600 | 0.8264 |
+ | 0.819 | 0.62 | 44800 | 0.8275 |
+ | 0.8203 | 0.63 | 45000 | 0.8266 |
+ | 0.8203 | 0.63 | 45200 | 0.8259 |
+ | 0.8203 | 0.63 | 45400 | 0.8277 |
+ | 0.8203 | 0.64 | 45600 | 0.8269 |
+ | 0.8203 | 0.64 | 45800 | 0.8261 |
+ | 0.8203 | 0.64 | 46000 | 0.8245 |
+ | 0.8203 | 0.64 | 46200 | 0.8243 |
+ | 0.8203 | 0.65 | 46400 | 0.8242 |
+ | 0.8203 | 0.65 | 46600 | 0.8244 |
+ | 0.8203 | 0.65 | 46800 | 0.8237 |
+ | 0.8203 | 0.66 | 47000 | 0.8247 |
+ | 0.8203 | 0.66 | 47200 | 0.8238 |
+ | 0.8203 | 0.66 | 47400 | 0.8239 |
+ | 0.8203 | 0.66 | 47600 | 0.8252 |
+ | 0.8203 | 0.67 | 47800 | 0.8267 |
+ | 0.8169 | 0.67 | 48000 | 0.8245 |
+ | 0.8169 | 0.67 | 48200 | 0.8251 |
+ | 0.8169 | 0.67 | 48400 | 0.8247 |
+ | 0.8169 | 0.68 | 48600 | 0.8252 |
+ | 0.8169 | 0.68 | 48800 | 0.8259 |
+ | 0.8169 | 0.68 | 49000 | 0.8244 |
+ | 0.8169 | 0.69 | 49200 | 0.8245 |
+ | 0.8169 | 0.69 | 49400 | 0.8260 |
+ | 0.8169 | 0.69 | 49600 | 0.8265 |
+ | 0.8169 | 0.69 | 49800 | 0.8258 |
+ | 0.8169 | 0.7 | 50000 | 0.8274 |
+ | 0.8169 | 0.7 | 50200 | 0.8287 |
+ | 0.8169 | 0.7 | 50400 | 0.8280 |
+ | 0.8169 | 0.71 | 50600 | 0.8266 |
+ | 0.8169 | 0.71 | 50800 | 0.8259 |
+ | 0.8153 | 0.71 | 51000 | 0.8263 |
+ | 0.8153 | 0.71 | 51200 | 0.8260 |
+ | 0.8153 | 0.72 | 51400 | 0.8258 |
+ | 0.8153 | 0.72 | 51600 | 0.8251 |
+ | 0.8153 | 0.72 | 51800 | 0.8250 |
+ | 0.8153 | 0.73 | 52000 | 0.8254 |
+ | 0.8153 | 0.73 | 52200 | 0.8244 |
+ | 0.8153 | 0.73 | 52400 | 0.8236 |
+ | 0.8153 | 0.73 | 52600 | 0.8234 |
+ | 0.8153 | 0.74 | 52800 | 0.8251 |
+ | 0.8153 | 0.74 | 53000 | 0.8246 |
+ | 0.8153 | 0.74 | 53200 | 0.8248 |
+ | 0.8153 | 0.74 | 53400 | 0.8236 |
+ | 0.8153 | 0.75 | 53600 | 0.8243 |
+ | 0.8153 | 0.75 | 53800 | 0.8255 |
+ | 0.8123 | 0.75 | 54000 | 0.8246 |
+ | 0.8123 | 0.76 | 54200 | 0.8235 |
+ | 0.8123 | 0.76 | 54400 | 0.8235 |
+ | 0.8123 | 0.76 | 54600 | 0.8235 |
+ | 0.8123 | 0.76 | 54800 | 0.8238 |
+ | 0.8123 | 0.77 | 55000 | 0.8242 |
+ | 0.8123 | 0.77 | 55200 | 0.8233 |
+ | 0.8123 | 0.77 | 55400 | 0.8236 |
+ | 0.8123 | 0.78 | 55600 | 0.8226 |
+ | 0.8123 | 0.78 | 55800 | 0.8225 |
+ | 0.8123 | 0.78 | 56000 | 0.8220 |
+ | 0.8123 | 0.78 | 56200 | 0.8228 |
+ | 0.8123 | 0.79 | 56400 | 0.8230 |
+ | 0.8123 | 0.79 | 56600 | 0.8226 |
+ | 0.8123 | 0.79 | 56800 | 0.8223 |
+ | 0.8106 | 0.79 | 57000 | 0.8229 |
+ | 0.8106 | 0.8 | 57200 | 0.8225 |
+ | 0.8106 | 0.8 | 57400 | 0.8229 |
+ | 0.8106 | 0.8 | 57600 | 0.8230 |
+ | 0.8106 | 0.81 | 57800 | 0.8234 |
+ | 0.8106 | 0.81 | 58000 | 0.8230 |
+ | 0.8106 | 0.81 | 58200 | 0.8231 |
+ | 0.8106 | 0.81 | 58400 | 0.8227 |
+ | 0.8106 | 0.82 | 58600 | 0.8227 |
+ | 0.8106 | 0.82 | 58800 | 0.8213 |
+ | 0.8106 | 0.82 | 59000 | 0.8209 |
+ | 0.8106 | 0.83 | 59200 | 0.8213 |
+ | 0.8106 | 0.83 | 59400 | 0.8214 |
+ | 0.8106 | 0.83 | 59600 | 0.8219 |
+ | 0.8106 | 0.83 | 59800 | 0.8220 |
+ | 0.813 | 0.84 | 60000 | 0.8214 |
+ | 0.813 | 0.84 | 60200 | 0.8217 |
+ | 0.813 | 0.84 | 60400 | 0.8217 |
+ | 0.813 | 0.85 | 60600 | 0.8221 |
+ | 0.813 | 0.85 | 60800 | 0.8227 |
+ | 0.813 | 0.85 | 61000 | 0.8225 |
+ | 0.813 | 0.85 | 61200 | 0.8226 |
+ | 0.813 | 0.86 | 61400 | 0.8218 |
+ | 0.813 | 0.86 | 61600 | 0.8223 |
+ | 0.813 | 0.86 | 61800 | 0.8229 |
+ | 0.813 | 0.86 | 62000 | 0.8224 |
+ | 0.813 | 0.87 | 62200 | 0.8222 |
+ | 0.813 | 0.87 | 62400 | 0.8222 |
+ | 0.813 | 0.87 | 62600 | 0.8224 |
+ | 0.813 | 0.88 | 62800 | 0.8224 |
+ | 0.8073 | 0.88 | 63000 | 0.8226 |
+ | 0.8073 | 0.88 | 63200 | 0.8222 |
+ | 0.8073 | 0.88 | 63400 | 0.8219 |
+ | 0.8073 | 0.89 | 63600 | 0.8216 |
+ | 0.8073 | 0.89 | 63800 | 0.8214 |
+ | 0.8073 | 0.89 | 64000 | 0.8212 |
+ | 0.8073 | 0.9 | 64200 | 0.8214 |
+ | 0.8073 | 0.9 | 64400 | 0.8216 |
+ | 0.8073 | 0.9 | 64600 | 0.8217 |
+ | 0.8073 | 0.9 | 64800 | 0.8219 |
+ | 0.8073 | 0.91 | 65000 | 0.8217 |
+ | 0.8073 | 0.91 | 65200 | 0.8217 |
+ | 0.8073 | 0.91 | 65400 | 0.8217 |
+ | 0.8073 | 0.91 | 65600 | 0.8219 |
+ | 0.8073 | 0.92 | 65800 | 0.8219 |
+ | 0.8095 | 0.92 | 66000 | 0.8217 |
+ | 0.8095 | 0.92 | 66200 | 0.8218 |
+ | 0.8095 | 0.93 | 66400 | 0.8218 |
+ | 0.8095 | 0.93 | 66600 | 0.8217 |
+ | 0.8095 | 0.93 | 66800 | 0.8217 |
+ | 0.8095 | 0.93 | 67000 | 0.8216 |
+ | 0.8095 | 0.94 | 67200 | 0.8217 |
+ | 0.8095 | 0.94 | 67400 | 0.8218 |
+ | 0.8095 | 0.94 | 67600 | 0.8218 |
+ | 0.8095 | 0.95 | 67800 | 0.8218 |
+ | 0.8095 | 0.95 | 68000 | 0.8217 |
+ | 0.8095 | 0.95 | 68200 | 0.8218 |
+ | 0.8095 | 0.95 | 68400 | 0.8218 |
+ | 0.8095 | 0.96 | 68600 | 0.8219 |
+ | 0.8095 | 0.96 | 68800 | 0.8219 |
+ | 0.8086 | 0.96 | 69000 | 0.8218 |
+ | 0.8086 | 0.96 | 69200 | 0.8218 |
+ | 0.8086 | 0.97 | 69400 | 0.8219 |
+ | 0.8086 | 0.97 | 69600 | 0.8218 |
+ | 0.8086 | 0.97 | 69800 | 0.8219 |
+ | 0.8086 | 0.98 | 70000 | 0.8219 |
+ | 0.8086 | 0.98 | 70200 | 0.8219 |
+ | 0.8086 | 0.98 | 70400 | 0.8219 |
+ | 0.8086 | 0.98 | 70600 | 0.8219 |
+ | 0.8086 | 0.99 | 70800 | 0.8219 |
+ | 0.8086 | 0.99 | 71000 | 0.8219 |
+ | 0.8086 | 0.99 | 71200 | 0.8219 |
+ | 0.8086 | 1.0 | 71400 | 0.8219 |
+ | 0.8086 | 1.0 | 71600 | 0.8219 |
+
+
+ ### Framework versions

+ - Transformers 4.31.0
+ - Pytorch 2.1.2
+ - Datasets 2.18.0
+ - Tokenizers 0.13.3
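
For reference, the quantization settings listed in the removed version of the card map one-to-one onto a `transformers` `BitsAndBytesConfig`. A minimal sketch, assuming the `bitsandbytes` package and a CUDA device are available; the base-model repo id below is only a placeholder, since the old card did not name the base model being quantized:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Mirrors the bitsandbytes fields from the removed PEFT card (4-bit nf4, fp16 compute).
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)

# Placeholder repo id; substitute the model the adapter was actually trained on.
model = AutoModelForCausalLM.from_pretrained(
    "vicgalle/gpt2-alpaca-gpt4",
    quantization_config=bnb_config,
    device_map="auto",
)
```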
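
The hyperparameter list in the updated card corresponds to a plain `transformers` `Trainer` run. A minimal sketch of the equivalent `TrainingArguments` (Transformers 4.31), assuming single-device training: `output_dir` is a placeholder, the 200-step evaluation cadence is read off the results table rather than stated in the card, and the Adam betas/epsilon shown in the card are the library defaults, so they are not passed explicitly:

```python
from transformers import TrainingArguments

# Sketch of the configuration described in the card; output_dir and the eval
# cadence are assumptions, not taken from the commit itself.
training_args = TrainingArguments(
    output_dir="gpt2-alpaca-pandalm",   # placeholder
    learning_rate=2e-4,                 # card: learning_rate 0.0002
    per_device_train_batch_size=4,      # card: train_batch_size 4
    per_device_eval_batch_size=8,       # card: eval_batch_size 8
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,                  # card: lr_scheduler_warmup_ratio 0.03
    num_train_epochs=1,
    evaluation_strategy="steps",        # results table shows an eval every 200 steps
    eval_steps=200,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the TrainingArguments default.
)
```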
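
Since the card's "Intended uses & limitations" section is still a stub, here is a minimal generation sketch. The repo id `Hikiyo/gpt2-alpaca-pandalm` and the Alpaca-style prompt template (inherited from the `vicgalle/gpt2-alpaca-gpt4` base model) are assumptions rather than facts stated in the card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; adjust to wherever this checkpoint is actually hosted.
model_id = "Hikiyo/gpt2-alpaca-pandalm"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Alpaca-style prompt, assumed to match the base model's instruction format.
prompt = (
    "### Instruction:\n"
    "Explain what a learning-rate warmup ratio does.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    top_p=0.9,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The sampling parameters here are arbitrary; greedy decoding also works for quick checks.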