jfrish committed
Commit 1ff7a12 · verified · 1 parent: ff08aec

End of training
README.md ADDED
@@ -0,0 +1,106 @@
+ ---
+ library_name: transformers
+ license: cc-by-nc-sa-4.0
+ base_model: microsoft/layoutlmv3-base
+ tags:
+ - generated_from_trainer
+ datasets:
+ - layoutlmv3
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
+ model-index:
+ - name: layoutlm-CC-7
+   results:
+   - task:
+       name: Token Classification
+       type: token-classification
+     dataset:
+       name: layoutlmv3
+       type: layoutlmv3
+       config: FormsDataset
+       split: test
+       args: FormsDataset
+     metrics:
+     - name: Precision
+       type: precision
+       value: 0.12529002320185614
+     - name: Recall
+       type: recall
+       value: 0.20224719101123595
+     - name: F1
+       type: f1
+       value: 0.15472779369627507
+     - name: Accuracy
+       type: accuracy
+       value: 0.19654427645788336
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # layoutlm-CC-7
+
+ This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the layoutlmv3 dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 4.1612
+ - Precision: 0.1253
+ - Recall: 0.2022
+ - F1: 0.1547
+ - Accuracy: 0.1965
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 3e-05
+ - train_batch_size: 16
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
+ - lr_scheduler_type: linear
+ - num_epochs: 15
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+ | 4.8141        | 1.0   | 1    | 4.7205          | 0.0921    | 0.1311 | 0.1082 | 0.0821   |
+ | 4.7028        | 2.0   | 2    | 4.6365          | 0.1414    | 0.2022 | 0.1664 | 0.1425   |
+ | 4.6011        | 3.0   | 3    | 4.5617          | 0.1230    | 0.2022 | 0.1530 | 0.1274   |
+ | 4.5126        | 4.0   | 4    | 4.4931          | 0.1174    | 0.2022 | 0.1486 | 0.1231   |
+ | 4.4376        | 5.0   | 5    | 4.4390          | 0.1166    | 0.2022 | 0.1479 | 0.1166   |
+ | 4.3778        | 6.0   | 6    | 4.3926          | 0.1166    | 0.2022 | 0.1479 | 0.1188   |
+ | 4.3224        | 7.0   | 7    | 4.3454          | 0.1166    | 0.2022 | 0.1479 | 0.1210   |
+ | 4.2658        | 8.0   | 8    | 4.3058          | 0.1166    | 0.2022 | 0.1479 | 0.1253   |
+ | 4.2182        | 9.0   | 9    | 4.2708          | 0.1179    | 0.2022 | 0.1490 | 0.1425   |
+ | 4.1796        | 10.0  | 10   | 4.2415          | 0.1208    | 0.2022 | 0.1513 | 0.1641   |
+ | 4.1423        | 11.0  | 11   | 4.2165          | 0.1222    | 0.2022 | 0.1523 | 0.1728   |
+ | 4.1197        | 12.0  | 12   | 4.1951          | 0.1230    | 0.2022 | 0.1530 | 0.1793   |
+ | 4.0976        | 13.0  | 13   | 4.1782          | 0.1241    | 0.2022 | 0.1538 | 0.1922   |
+ | 4.0801        | 14.0  | 14   | 4.1669          | 0.1253    | 0.2022 | 0.1547 | 0.1965   |
+ | 4.0627        | 15.0  | 15   | 4.1612          | 0.1253    | 0.2022 | 0.1547 | 0.1965   |
+
+
+ ### Framework versions
+
+ - Transformers 4.47.0.dev0
+ - Pytorch 2.5.1+cu121
+ - Datasets 3.1.0
+ - Tokenizers 0.20.3
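
The card above carries no usage snippet. Here is a minimal inference sketch; the checkpoint id `jfrish/layoutlm-CC-7` is an assumption (this commit does not state where the model is hosted), and note that LayoutLMv3 expects word bounding boxes normalized to a 0–1000 grid:

```python
def normalize_bbox(box, width, height):
    """Scale pixel coordinates (x0, y0, x1, y1) to LayoutLMv3's 0-1000 grid."""
    x0, y0, x1, y1 = box
    return [
        int(1000 * x0 / width),
        int(1000 * y0 / height),
        int(1000 * x1 / width),
        int(1000 * y1 / height),
    ]

# Hypothetical repo id -- adjust to wherever this checkpoint actually lives.
# from transformers import AutoProcessor, AutoModelForTokenClassification
# processor = AutoProcessor.from_pretrained("jfrish/layoutlm-CC-7", apply_ocr=True)
# model = AutoModelForTokenClassification.from_pretrained("jfrish/layoutlm-CC-7")
# encoding = processor(image, return_tensors="pt")   # OCR runs inside the processor
# predicted_ids = model(**encoding).logits.argmax(-1)
```

With `apply_ocr=True` (as in this repo's preprocessor config) the processor extracts words and boxes itself via Tesseract; `normalize_bbox` is only needed when you supply your own OCR results.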
config.json ADDED
@@ -0,0 +1,294 @@
+ {
+   "_name_or_path": "microsoft/layoutlmv3-base",
+   "architectures": [
+     "LayoutLMv3ForTokenClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "coordinate_size": 128,
+   "eos_token_id": 2,
+   "has_relative_attention_bias": true,
+   "has_spatial_attention_bias": true,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "0",
+     "1": "B-AMOUNT_FINANCED",
+     "2": "B-AVERAGE_BANK_BALANCE",
+     "3": "B-COMPANY_ADDRESS",
+     "4": "B-COMPANY_CITY",
+     "5": "B-COMPANY_NAME",
+     "6": "B-COMPANY_PHONE",
+     "7": "B-COMPANY_STATE",
+     "8": "B-COMPANY_ZIP",
+     "9": "B-CONTACT_EMAIL",
+     "10": "B-CONTACT_NAME",
+     "11": "B-CONTACT_PHONE_NUM",
+     "12": "B-EQUIPMENT",
+     "13": "B-EQUIPMENT_ADDRESS",
+     "14": "B-EQUIPMENT_CITY",
+     "15": "B-EQUIPMENT_STATE",
+     "16": "B-EQUIPMENT_ZIP",
+     "17": "B-FEDERAL_TAX_ID",
+     "18": "B-Header",
+     "19": "B-NUMBER_EMPLOYEES",
+     "20": "B-NUM_YEARS_IN_BUSINESS",
+     "21": "B-PRINCIPAL_1_ADDRESS",
+     "22": "B-PRINCIPAL_1_CITY",
+     "23": "B-PRINCIPAL_1_EMAIL",
+     "24": "B-PRINCIPAL_1_NAME",
+     "25": "B-PRINCIPAL_1_OWNERSHIP",
+     "26": "B-PRINCIPAL_1_SSN",
+     "27": "B-PRINCIPAL_1_STATE",
+     "28": "B-PRINCIPAL_1_TITLE",
+     "29": "B-PRINCIPAL_1_ZIP",
+     "30": "B-PRINCIPAL_2_ADDRESS",
+     "31": "B-PRINCIPAL_2_CITY",
+     "32": "B-PRINCIPAL_2_EMAIL",
+     "33": "B-PRINCIPAL_2_NAME",
+     "34": "B-PRINCIPAL_2_OWNERSHIP",
+     "35": "B-PRINCIPAL_2_SSN",
+     "36": "B-PRINCIPAL_2_STATE",
+     "37": "B-PRINCIPAL_2_TITLE",
+     "38": "B-PRINCIPAL_2_ZIP",
+     "39": "B-QUESTION_AMOUNT_FINANCED",
+     "40": "B-QUESTION_AVERAGE_BANK_BALANCE",
+     "41": "B-QUESTION_COMPANY_ADDRESS",
+     "42": "B-QUESTION_COMPANY_CITY",
+     "43": "B-QUESTION_COMPANY_NAME",
+     "44": "B-QUESTION_COMPANY_PHONE",
+     "45": "B-QUESTION_COMPANY_STATE",
+     "46": "B-QUESTION_COMPANY_ZIP",
+     "47": "B-QUESTION_CONTACT_EMAIL",
+     "48": "B-QUESTION_CONTACT_NAME",
+     "49": "B-QUESTION_CONTACT_PHONE_NUM",
+     "50": "B-QUESTION_EQUIPMENT",
+     "51": "B-QUESTION_EQUIPMENT_ADDRESS",
+     "52": "B-QUESTION_EQUIPMENT_CITY",
+     "53": "B-QUESTION_EQUIPMENT_STATE",
+     "54": "B-QUESTION_EQUIPMENT_ZIP",
+     "55": "B-QUESTION_FEDERAL_TAX_ID",
+     "56": "B-QUESTION_NUMBER_EMPLOYEES",
+     "57": "B-QUESTION_NUM_YEARS_IN_BUSINESS",
+     "58": "B-QUESTION_PRINCIPAL_1_ADDRESS",
+     "59": "B-QUESTION_PRINCIPAL_1_CITY",
+     "60": "B-QUESTION_PRINCIPAL_1_EMAIL",
+     "61": "B-QUESTION_PRINCIPAL_1_NAME",
+     "62": "B-QUESTION_PRINCIPAL_1_OWNERSHIP",
+     "63": "B-QUESTION_PRINCIPAL_1_SSN",
+     "64": "B-QUESTION_PRINCIPAL_1_STATE",
+     "65": "B-QUESTION_PRINCIPAL_1_TITLE",
+     "66": "B-QUESTION_PRINCIPAL_1_ZIP",
+     "67": "B-QUESTION_PRINCIPAL_2_ADDRESS",
+     "68": "B-QUESTION_PRINCIPAL_2_CITY",
+     "69": "B-QUESTION_PRINCIPAL_2_EMAIL",
+     "70": "B-QUESTION_PRINCIPAL_2_NAME",
+     "71": "B-QUESTION_PRINCIPAL_2_OWNERSHIP",
+     "72": "B-QUESTION_PRINCIPAL_2_SSN",
+     "73": "B-QUESTION_PRINCIPAL_2_STATE",
+     "74": "B-QUESTION_PRINCIPAL_2_TITLE",
+     "75": "B-QUESTION_PRINCIPAL_2_ZIP",
+     "76": "B-QUESTION_SALES_REP_NAME",
+     "77": "B-QUESTION_YEARS_CUSTOMER",
+     "78": "B-SALES_REP_NAME",
+     "79": "B-YEARS_CUSTOMER",
+     "80": "I-AVERAGE_BANK_BALANCE",
+     "81": "I-COMPANY_ADDRESS",
+     "82": "I-COMPANY_NAME",
+     "83": "I-COMPANY_PHONE",
+     "84": "I-CONTACT_EMAIL",
+     "85": "I-CONTACT_NAME",
+     "86": "I-CONTACT_PHONE_NUM",
+     "87": "I-EQUIPMENT",
+     "88": "I-EQUIPMENT_ADDRESS",
+     "89": "I-FEDERAL_TAX_ID",
+     "90": "I-PRINCIPAL_1_ADDRESS",
+     "91": "I-PRINCIPAL_1_EMAIL",
+     "92": "I-PRINCIPAL_1_NAME",
+     "93": "I-PRINCIPAL_1_SSN",
+     "94": "I-PRINCIPAL_1_TITLE",
+     "95": "I-PRINCIPAL_2_ADDRESS",
+     "96": "I-PRINCIPAL_2_EMAIL",
+     "97": "I-PRINCIPAL_2_NAME",
+     "98": "I-PRINCIPAL_2_SSN",
+     "99": "I-PRINCIPAL_2_TITLE",
+     "100": "I-QUESTION_AMOUNT_FINANCED",
+     "101": "I-QUESTION_AVERAGE_BANK_BALANCE",
+     "102": "I-QUESTION_COMPANY_ADDRESS",
+     "103": "I-QUESTION_COMPANY_NAME",
+     "104": "I-QUESTION_COMPANY_PHONE",
+     "105": "I-QUESTION_CONTACT_EMAIL",
+     "106": "I-QUESTION_CONTACT_NAME",
+     "107": "I-QUESTION_CONTACT_PHONE_NUM",
+     "108": "I-QUESTION_EQUIPMENT",
+     "109": "I-QUESTION_EQUIPMENT_ADDRESS",
+     "110": "I-QUESTION_FEDERAL_TAX_ID",
+     "111": "I-QUESTION_NUMBER_EMPLOYEES",
+     "112": "I-QUESTION_NUM_YEARS_IN_BUSINESS",
+     "113": "I-QUESTION_PRINCIPAL_1_ADDRESS",
+     "114": "I-QUESTION_PRINCIPAL_1_EMAIL",
+     "115": "I-QUESTION_PRINCIPAL_1_OWNERSHIP",
+     "116": "I-QUESTION_PRINCIPAL_1_SSN",
+     "117": "I-QUESTION_PRINCIPAL_2_ADDRESS",
+     "118": "I-QUESTION_PRINCIPAL_2_EMAIL",
+     "119": "I-QUESTION_PRINCIPAL_2_OWNERSHIP",
+     "120": "I-QUESTION_PRINCIPAL_2_SSN",
+     "121": "I-QUESTION_SALES_REP_NAME",
+     "122": "I-QUESTION_YEARS_CUSTOMER",
+     "123": "I-SALES_REP_NAME",
+     "124": "I-YEARS_CUSTOMER"
+   },
+   "initializer_range": 0.02,
+   "input_size": 224,
+   "intermediate_size": 3072,
+   "label2id": {
+     "0": 0,
+     "B-AMOUNT_FINANCED": 1,
+     "B-AVERAGE_BANK_BALANCE": 2,
+     "B-COMPANY_ADDRESS": 3,
+     "B-COMPANY_CITY": 4,
+     "B-COMPANY_NAME": 5,
+     "B-COMPANY_PHONE": 6,
+     "B-COMPANY_STATE": 7,
+     "B-COMPANY_ZIP": 8,
+     "B-CONTACT_EMAIL": 9,
+     "B-CONTACT_NAME": 10,
+     "B-CONTACT_PHONE_NUM": 11,
+     "B-EQUIPMENT": 12,
+     "B-EQUIPMENT_ADDRESS": 13,
+     "B-EQUIPMENT_CITY": 14,
+     "B-EQUIPMENT_STATE": 15,
+     "B-EQUIPMENT_ZIP": 16,
+     "B-FEDERAL_TAX_ID": 17,
+     "B-Header": 18,
+     "B-NUMBER_EMPLOYEES": 19,
+     "B-NUM_YEARS_IN_BUSINESS": 20,
+     "B-PRINCIPAL_1_ADDRESS": 21,
+     "B-PRINCIPAL_1_CITY": 22,
+     "B-PRINCIPAL_1_EMAIL": 23,
+     "B-PRINCIPAL_1_NAME": 24,
+     "B-PRINCIPAL_1_OWNERSHIP": 25,
+     "B-PRINCIPAL_1_SSN": 26,
+     "B-PRINCIPAL_1_STATE": 27,
+     "B-PRINCIPAL_1_TITLE": 28,
+     "B-PRINCIPAL_1_ZIP": 29,
+     "B-PRINCIPAL_2_ADDRESS": 30,
+     "B-PRINCIPAL_2_CITY": 31,
+     "B-PRINCIPAL_2_EMAIL": 32,
+     "B-PRINCIPAL_2_NAME": 33,
+     "B-PRINCIPAL_2_OWNERSHIP": 34,
+     "B-PRINCIPAL_2_SSN": 35,
+     "B-PRINCIPAL_2_STATE": 36,
+     "B-PRINCIPAL_2_TITLE": 37,
+     "B-PRINCIPAL_2_ZIP": 38,
+     "B-QUESTION_AMOUNT_FINANCED": 39,
+     "B-QUESTION_AVERAGE_BANK_BALANCE": 40,
+     "B-QUESTION_COMPANY_ADDRESS": 41,
+     "B-QUESTION_COMPANY_CITY": 42,
+     "B-QUESTION_COMPANY_NAME": 43,
+     "B-QUESTION_COMPANY_PHONE": 44,
+     "B-QUESTION_COMPANY_STATE": 45,
+     "B-QUESTION_COMPANY_ZIP": 46,
+     "B-QUESTION_CONTACT_EMAIL": 47,
+     "B-QUESTION_CONTACT_NAME": 48,
+     "B-QUESTION_CONTACT_PHONE_NUM": 49,
+     "B-QUESTION_EQUIPMENT": 50,
+     "B-QUESTION_EQUIPMENT_ADDRESS": 51,
+     "B-QUESTION_EQUIPMENT_CITY": 52,
+     "B-QUESTION_EQUIPMENT_STATE": 53,
+     "B-QUESTION_EQUIPMENT_ZIP": 54,
+     "B-QUESTION_FEDERAL_TAX_ID": 55,
+     "B-QUESTION_NUMBER_EMPLOYEES": 56,
+     "B-QUESTION_NUM_YEARS_IN_BUSINESS": 57,
+     "B-QUESTION_PRINCIPAL_1_ADDRESS": 58,
+     "B-QUESTION_PRINCIPAL_1_CITY": 59,
+     "B-QUESTION_PRINCIPAL_1_EMAIL": 60,
+     "B-QUESTION_PRINCIPAL_1_NAME": 61,
+     "B-QUESTION_PRINCIPAL_1_OWNERSHIP": 62,
+     "B-QUESTION_PRINCIPAL_1_SSN": 63,
+     "B-QUESTION_PRINCIPAL_1_STATE": 64,
+     "B-QUESTION_PRINCIPAL_1_TITLE": 65,
+     "B-QUESTION_PRINCIPAL_1_ZIP": 66,
+     "B-QUESTION_PRINCIPAL_2_ADDRESS": 67,
+     "B-QUESTION_PRINCIPAL_2_CITY": 68,
+     "B-QUESTION_PRINCIPAL_2_EMAIL": 69,
+     "B-QUESTION_PRINCIPAL_2_NAME": 70,
+     "B-QUESTION_PRINCIPAL_2_OWNERSHIP": 71,
+     "B-QUESTION_PRINCIPAL_2_SSN": 72,
+     "B-QUESTION_PRINCIPAL_2_STATE": 73,
+     "B-QUESTION_PRINCIPAL_2_TITLE": 74,
+     "B-QUESTION_PRINCIPAL_2_ZIP": 75,
+     "B-QUESTION_SALES_REP_NAME": 76,
+     "B-QUESTION_YEARS_CUSTOMER": 77,
+     "B-SALES_REP_NAME": 78,
+     "B-YEARS_CUSTOMER": 79,
+     "I-AVERAGE_BANK_BALANCE": 80,
+     "I-COMPANY_ADDRESS": 81,
+     "I-COMPANY_NAME": 82,
+     "I-COMPANY_PHONE": 83,
+     "I-CONTACT_EMAIL": 84,
+     "I-CONTACT_NAME": 85,
+     "I-CONTACT_PHONE_NUM": 86,
+     "I-EQUIPMENT": 87,
+     "I-EQUIPMENT_ADDRESS": 88,
+     "I-FEDERAL_TAX_ID": 89,
+     "I-PRINCIPAL_1_ADDRESS": 90,
+     "I-PRINCIPAL_1_EMAIL": 91,
+     "I-PRINCIPAL_1_NAME": 92,
+     "I-PRINCIPAL_1_SSN": 93,
+     "I-PRINCIPAL_1_TITLE": 94,
+     "I-PRINCIPAL_2_ADDRESS": 95,
+     "I-PRINCIPAL_2_EMAIL": 96,
+     "I-PRINCIPAL_2_NAME": 97,
+     "I-PRINCIPAL_2_SSN": 98,
+     "I-PRINCIPAL_2_TITLE": 99,
+     "I-QUESTION_AMOUNT_FINANCED": 100,
+     "I-QUESTION_AVERAGE_BANK_BALANCE": 101,
+     "I-QUESTION_COMPANY_ADDRESS": 102,
+     "I-QUESTION_COMPANY_NAME": 103,
+     "I-QUESTION_COMPANY_PHONE": 104,
+     "I-QUESTION_CONTACT_EMAIL": 105,
+     "I-QUESTION_CONTACT_NAME": 106,
+     "I-QUESTION_CONTACT_PHONE_NUM": 107,
+     "I-QUESTION_EQUIPMENT": 108,
+     "I-QUESTION_EQUIPMENT_ADDRESS": 109,
+     "I-QUESTION_FEDERAL_TAX_ID": 110,
+     "I-QUESTION_NUMBER_EMPLOYEES": 111,
+     "I-QUESTION_NUM_YEARS_IN_BUSINESS": 112,
+     "I-QUESTION_PRINCIPAL_1_ADDRESS": 113,
+     "I-QUESTION_PRINCIPAL_1_EMAIL": 114,
+     "I-QUESTION_PRINCIPAL_1_OWNERSHIP": 115,
+     "I-QUESTION_PRINCIPAL_1_SSN": 116,
+     "I-QUESTION_PRINCIPAL_2_ADDRESS": 117,
+     "I-QUESTION_PRINCIPAL_2_EMAIL": 118,
+     "I-QUESTION_PRINCIPAL_2_OWNERSHIP": 119,
+     "I-QUESTION_PRINCIPAL_2_SSN": 120,
+     "I-QUESTION_SALES_REP_NAME": 121,
+     "I-QUESTION_YEARS_CUSTOMER": 122,
+     "I-SALES_REP_NAME": 123,
+     "I-YEARS_CUSTOMER": 124
+   },
+   "layer_norm_eps": 1e-05,
+   "max_2d_position_embeddings": 1024,
+   "max_position_embeddings": 514,
+   "max_rel_2d_pos": 256,
+   "max_rel_pos": 128,
+   "model_type": "layoutlmv3",
+   "num_attention_heads": 12,
+   "num_channels": 3,
+   "num_hidden_layers": 12,
+   "pad_token_id": 1,
+   "patch_size": 16,
+   "rel_2d_pos_bins": 64,
+   "rel_pos_bins": 32,
+   "second_input_size": 112,
+   "shape_size": 128,
+   "text_embed": true,
+   "torch_dtype": "float32",
+   "transformers_version": "4.47.0.dev0",
+   "type_vocab_size": 1,
+   "visual_embed": true,
+   "vocab_size": 50265
+ }
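
The `id2label` map above uses BIO tags (with `"0"` rather than the conventional `"O"` as the outside tag). Turning per-token predictions into field values therefore needs a BIO-decoding step. A hedged sketch, using label names from this config (the function itself is illustrative, not part of this repo):

```python
def bio_to_spans(tags):
    """Collapse a BIO tag sequence into (entity_type, start, end) spans, end exclusive."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:          # close any open span
                spans.append((etype, start, i))
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is not None and tag[2:] == etype:
            continue                       # still inside the current entity
        else:                              # "0" (outside) or a non-continuing I- tag
            if start is not None:
                spans.append((etype, start, i))
            start, etype = None, None
    if start is not None:
        spans.append((etype, start, len(tags)))
    return spans

tags = ["B-COMPANY_NAME", "I-COMPANY_NAME", "0", "B-COMPANY_CITY"]
print(bio_to_spans(tags))  # [('COMPANY_NAME', 0, 2), ('COMPANY_CITY', 3, 4)]
```

A stray `I-` tag with no matching open `B-` span is simply dropped here; stricter schemes (e.g. seqeval's IOB2 handling) may treat it differently.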
logs/events.out.tfevents.1732922524.2cc3b3a5bf49.638.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a780e7b37892df0eb0d64f2ebf4d20bbe0407059afccddf3331238806035d185
+ size 24516
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e62c0be87bfc2984c28d1513a632e294c6bda464c91c129ecfa464105166024c
+ size 504081100
preprocessor_config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "apply_ocr": true,
+   "do_normalize": true,
+   "do_rescale": true,
+   "do_resize": true,
+   "image_mean": [
+     0.5,
+     0.5,
+     0.5
+   ],
+   "image_processor_type": "LayoutLMv3ImageProcessor",
+   "image_std": [
+     0.5,
+     0.5,
+     0.5
+   ],
+   "ocr_lang": null,
+   "processor_class": "LayoutLMv3Processor",
+   "resample": 2,
+   "rescale_factor": 0.00392156862745098,
+   "size": {
+     "height": 224,
+     "width": 224
+   },
+   "tesseract_config": ""
+ }
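
With `rescale_factor` = 1/255 and per-channel mean/std of 0.5, the image processor maps raw 0–255 pixel values into [-1, 1] before resizing to 224×224. A small sketch of that per-pixel arithmetic (illustrative only; the real work happens inside `LayoutLMv3ImageProcessor`):

```python
def preprocess_pixel(v, mean=0.5, std=0.5):
    """Mirror the processor's per-pixel math: rescale to [0, 1], then normalize."""
    return (v / 255 - mean) / std  # rescale_factor 0.00392156... is 1/255

print(preprocess_pixel(0), preprocess_pixel(255))  # -1.0 1.0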
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "cls_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<pad>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,80 @@
+ {
+   "add_prefix_space": true,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50264": {
+       "content": "<mask>",
+       "lstrip": true,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "apply_ocr": false,
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": false,
+   "cls_token": "<s>",
+   "cls_token_box": [
+     0,
+     0,
+     0,
+     0
+   ],
+   "eos_token": "</s>",
+   "errors": "replace",
+   "extra_special_tokens": {},
+   "mask_token": "<mask>",
+   "model_max_length": 512,
+   "only_label_first_subword": true,
+   "pad_token": "<pad>",
+   "pad_token_box": [
+     0,
+     0,
+     0,
+     0
+   ],
+   "pad_token_label": -100,
+   "processor_class": "LayoutLMv3Processor",
+   "sep_token": "</s>",
+   "sep_token_box": [
+     0,
+     0,
+     0,
+     0
+   ],
+   "tokenizer_class": "LayoutLMv3Tokenizer",
+   "trim_offsets": true,
+   "unk_token": "<unk>"
+ }
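
`only_label_first_subword: true` together with `pad_token_label: -100` means that during training only the first subword of each OCR word carries that word's label; continuation subwords and special tokens get -100, which PyTorch's cross-entropy loss ignores. A hedged sketch of that alignment (the `word_ids` list mimics a tokenizer's `word_ids()` output; the values are hypothetical):

```python
def align_labels(word_ids, word_labels, pad_token_label=-100):
    """Give each word's label to its first subword; special tokens and later subwords get -100."""
    aligned, prev = [], None
    for wid in word_ids:
        if wid is None:                 # special or padding token
            aligned.append(pad_token_label)
        elif wid != prev:               # first subword of a new word
            aligned.append(word_labels[wid])
        else:                           # continuation subword
            aligned.append(pad_token_label)
        prev = wid
    return aligned

# Two words labeled 5 and 7; word 0 splits into two subwords, <s>/</s> bracket the sequence.
print(align_labels([None, 0, 0, 1, None], [5, 7]))  # [-100, 5, -100, 7, -100]
```

Evaluation code must apply the same -100 mask so that ignored positions are excluded from precision/recall/F1.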
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5a08f24aad176d47d80fa10a485df4a8a14a4d1c2216f9a6c0e12cf54660df59
+ size 5304
vocab.json ADDED
The diff for this file is too large to render. See raw diff