End of training
README.md
CHANGED

@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [makhataei/qa-persian-albert-fa-zwnj-base-v2](https://huggingface.co/makhataei/qa-persian-albert-fa-zwnj-base-v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 6.0325

## Model description

@@ -34,7 +34,7 @@ More information needed

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6.25e-09
- train_batch_size: 14
- eval_batch_size: 14
- seed: 42

@@ -46,106 +46,106 @@ The following hyperparameters were used during training:

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 6.1973 | 1.0 | 9 | 6.1322 |
| 6.3268 | 2.0 | 18 | 6.1300 |
| 6.2845 | 3.0 | 27 | 6.1279 |
| 6.3965 | 4.0 | 36 | 6.1258 |
| 6.2567 | 5.0 | 45 | 6.1237 |
| 6.1612 | 6.0 | 54 | 6.1216 |
| 6.3347 | 7.0 | 63 | 6.1196 |
| 6.2435 | 8.0 | 72 | 6.1176 |
| 6.0989 | 9.0 | 81 | 6.1156 |
| 6.2013 | 10.0 | 90 | 6.1136 |
| 6.1531 | 11.0 | 99 | 6.1117 |
| 6.1447 | 12.0 | 108 | 6.1097 |
| 6.2328 | 13.0 | 117 | 6.1079 |
| 6.306 | 14.0 | 126 | 6.1060 |
| 6.2222 | 15.0 | 135 | 6.1042 |
| 6.2336 | 16.0 | 144 | 6.1024 |
| 6.2497 | 17.0 | 153 | 6.1006 |
| 6.1464 | 18.0 | 162 | 6.0988 |
| 6.1856 | 19.0 | 171 | 6.0971 |
| 6.1388 | 20.0 | 180 | 6.0955 |
| 6.2087 | 21.0 | 189 | 6.0938 |
| 6.1726 | 22.0 | 198 | 6.0922 |
| 6.2043 | 23.0 | 207 | 6.0905 |
| 6.2182 | 24.0 | 216 | 6.0889 |
| 6.1942 | 25.0 | 225 | 6.0873 |
| 6.2171 | 26.0 | 234 | 6.0857 |
| 6.199 | 27.0 | 243 | 6.0842 |
| 6.1719 | 28.0 | 252 | 6.0826 |
| 6.2139 | 29.0 | 261 | 6.0811 |
| 6.1818 | 30.0 | 270 | 6.0796 |
| 6.2768 | 31.0 | 279 | 6.0782 |
| 6.2758 | 32.0 | 288 | 6.0767 |
| 6.1451 | 33.0 | 297 | 6.0752 |
| 6.1826 | 34.0 | 306 | 6.0738 |
| 6.4356 | 35.0 | 315 | 6.0724 |
| 6.6278 | 36.0 | 324 | 6.0710 |
| 6.2543 | 37.0 | 333 | 6.0697 |
| 6.1346 | 38.0 | 342 | 6.0683 |
| 6.104 | 39.0 | 351 | 6.0670 |
| 6.3697 | 40.0 | 360 | 6.0657 |
| 6.27 | 41.0 | 369 | 6.0644 |
| 6.1705 | 42.0 | 378 | 6.0632 |
| 6.6694 | 43.0 | 387 | 6.0619 |
| 6.095 | 44.0 | 396 | 6.0607 |
| 6.1732 | 45.0 | 405 | 6.0596 |
| 6.205 | 46.0 | 414 | 6.0584 |
| 6.1608 | 47.0 | 423 | 6.0573 |
| 6.4836 | 48.0 | 432 | 6.0561 |
| 6.2704 | 49.0 | 441 | 6.0551 |
| 6.2792 | 50.0 | 450 | 6.0540 |
| 6.4469 | 51.0 | 459 | 6.0530 |
| 6.1758 | 52.0 | 468 | 6.0520 |
| 6.1465 | 53.0 | 477 | 6.0510 |
| 6.1876 | 54.0 | 486 | 6.0501 |
| 6.1449 | 55.0 | 495 | 6.0492 |
| 6.4543 | 56.0 | 504 | 6.0483 |
| 6.1557 | 57.0 | 513 | 6.0474 |
| 6.0813 | 58.0 | 522 | 6.0465 |
| 6.2087 | 59.0 | 531 | 6.0457 |
| 6.1544 | 60.0 | 540 | 6.0449 |
| 6.1211 | 61.0 | 549 | 6.0441 |
| 6.227 | 62.0 | 558 | 6.0434 |
| 6.1766 | 63.0 | 567 | 6.0427 |
| 6.1367 | 64.0 | 576 | 6.0420 |
| 6.0628 | 65.0 | 585 | 6.0413 |
| 6.2588 | 66.0 | 594 | 6.0407 |
| 6.2124 | 67.0 | 603 | 6.0401 |
| 6.1432 | 68.0 | 612 | 6.0396 |
| 6.5006 | 69.0 | 621 | 6.0390 |
| 6.1496 | 70.0 | 630 | 6.0385 |
| 6.6739 | 71.0 | 639 | 6.0380 |
| 6.4409 | 72.0 | 648 | 6.0376 |
| 6.2847 | 73.0 | 657 | 6.0371 |
| 6.2227 | 74.0 | 666 | 6.0367 |
| 6.8125 | 75.0 | 675 | 6.0362 |
| 6.0813 | 76.0 | 684 | 6.0358 |
| 6.1578 | 77.0 | 693 | 6.0355 |
| 6.0566 | 78.0 | 702 | 6.0351 |
| 6.1073 | 79.0 | 711 | 6.0348 |
| 6.4278 | 80.0 | 720 | 6.0345 |
| 6.4848 | 81.0 | 729 | 6.0342 |
| 6.1529 | 82.0 | 738 | 6.0339 |
| 6.0452 | 83.0 | 747 | 6.0337 |
| 6.2759 | 84.0 | 756 | 6.0335 |
| 6.1353 | 85.0 | 765 | 6.0333 |
| 6.0707 | 86.0 | 774 | 6.0332 |
| 6.3354 | 87.0 | 783 | 6.0331 |
| 6.1905 | 88.0 | 792 | 6.0329 |
| 6.1461 | 89.0 | 801 | 6.0328 |
| 6.1297 | 90.0 | 810 | 6.0327 |
| 6.0712 | 91.0 | 819 | 6.0327 |
| 6.0914 | 92.0 | 828 | 6.0326 |
| 6.1984 | 93.0 | 837 | 6.0326 |
| 6.2803 | 94.0 | 846 | 6.0325 |
| 6.1384 | 95.0 | 855 | 6.0325 |
| 6.1453 | 96.0 | 864 | 6.0325 |
| 6.4472 | 97.0 | 873 | 6.0325 |
| 6.1691 | 98.0 | 882 | 6.0325 |
| 6.0688 | 99.0 | 891 | 6.0325 |
| 6.2621 | 100.0 | 900 | 6.0325 |

### Framework versions
model.safetensors
CHANGED

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2124d3b6f79a9e8b1df55627eaf9abfc184211d28be332c2b162544feadac11e
size 44381360

runs/Mar06_09-07-05_Software-AI/events.out.tfevents.1709703425.Software-AI.118212.4
ADDED

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:579a4cd682d81e787feb5349bf68cdcaca5cfc73b98411347745abc82009fdba
size 47649

training_args.bin
CHANGED

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b2f8ee214f5b473c2d14511cd0879885299abde787c6c074e2a1f2f989b7e0c1
size 4219
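The files above are stored as Git LFS pointer files in the three-line `version` / `oid` / `size` format shown. A minimal sketch of parsing that pointer format; the helper name is illustrative, not part of any library:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its version, oid, and size fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>" separated by a single space.
        key, _, value = line.partition(" ")
        fields[key] = value
    return {
        "version": fields["version"],
        "oid": fields["oid"].removeprefix("sha256:"),  # strip the hash-method prefix
        "size": int(fields["size"]),                   # size in bytes
    }

# Example using the training_args.bin pointer from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:b2f8ee214f5b473c2d14511cd0879885299abde787c6c074e2a1f2f989b7e0c1
size 4219"""
print(parse_lfs_pointer(pointer)["size"])  # -> 4219
```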