# layoutlm-sroie-dacn_v1
This model is a fine-tuned version of microsoft/layoutlm-base-uncased. The card does not identify the training data, though the model name and the label set (Address, Company, Date, Total) suggest the SROIE receipt dataset. It achieves the following results on the evaluation set:
- Loss: 0.1261
| Entity | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Address | 0.9916 | 0.9939 | 0.9927 | 3907 |
| Company | 0.9750 | 0.9926 | 0.9837 | 1491 |
| Date | 0.9976 | 0.9907 | 0.9941 | 428 |
| Total | 0.8800 | 0.8895 | 0.8847 | 371 |
- Overall Precision: 0.9812
- Overall Recall: 0.9871
- Overall F1: 0.9842
- Overall Accuracy: 0.9952
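The per-entity scores above are entity-level: a predicted span only counts as correct if both its boundaries and its label match the gold span exactly, which is why Total (free-form amounts) lags the other fields. A minimal sketch of that computation, assuming BIO-tagged sequences (the tagging scheme is an assumption; the card does not state it):

```python
def extract_entities(tags):
    """Collect (label, start, end) spans from a BIO tag sequence."""
    entities, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        if tag == "O" or tag.startswith("B-"):
            if label is not None:
                entities.append((label, start, i))
                label = None
            if tag.startswith("B-"):
                start, label = i, tag[2:]
    return set(entities)


def entity_scores(true_tags, pred_tags):
    """Entity-level precision, recall, and F1 for one tagged sequence."""
    gold, pred = extract_entities(true_tags), extract_entities(pred_tags)
    tp = len(gold & pred)  # exact span-and-label matches only
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if tp else 0.0
    return precision, recall, f1
```

Under this scoring, a prediction that clips one token off a gold span earns no partial credit, it is counted as both a false positive and a false negative.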
## Model description
More information needed
## Intended uses & limitations
More information needed
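The label set implies key-field extraction from receipt OCR: the model tags tokens, and a post-processing step reduces the tagged spans to one value per field. A minimal sketch of that reduction, assuming one (label, text, score) tuple per predicted span; the keep-highest-score rule is an assumption, not from the card:

```python
def fields_from_spans(spans):
    """Reduce predicted (label, text, score) spans to one value per field.

    A receipt has a single company, date, and total, so when the model
    emits several spans with the same label we keep the highest-scoring
    one. This selection rule is an assumed heuristic, not from the card.
    """
    best = {}
    for label, text, score in spans:
        if label not in best or score > best[label][1]:
            best[label] = (text, score)
    return {label: text for label, (text, _) in best.items()}
```

Downstream consumers should still validate the extracted values (e.g. parse the date, check the total is numeric), since the model offers no guarantee of well-formed fields.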
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- label_smoothing_factor: 0.02
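With a linear scheduler and no warmup recorded, the learning rate decays from 3e-05 to zero over the full run; the Step column in the results below shows 40 optimizer steps per epoch, i.e. 800 steps over 20 epochs. A sketch of that schedule (the zero-warmup assumption follows from the absence of a warmup entry above):

```python
def linear_lr(step, base_lr=3e-5, total_steps=800, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule.

    Linear warmup from 0 to base_lr over warmup_steps (none assumed here),
    then linear decay from base_lr down to 0 at total_steps.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, the learning rate is 3e-05 at step 0, half that at step 400 (epoch 10), and 0 at step 800.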
### Training results
Entity columns show precision / recall / F1, rounded to four decimals (support: Address 3907, Company 1491, Date 428, Total 371).

| Training Loss | Epoch | Step | Validation Loss | Address | Company | Date | Total | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.3889 | 1.0 | 40 | 0.1566 | 0.9804 / 0.9962 / 0.9882 | 0.8889 / 0.9873 / 0.9355 | 0.8219 / 0.9813 / 0.8946 | 0.7297 / 0.2183 / 0.3361 | 0.9387 | 0.9464 | 0.9425 | 0.9826 |
| 0.1424 | 2.0 | 80 | 0.1317 | 0.9943 / 0.9898 / 0.9920 | 0.9666 / 0.9913 / 0.9788 | 0.9767 / 0.9790 / 0.9778 | 0.6500 / 0.8410 / 0.7333 | 0.9603 | 0.9805 | 0.9703 | 0.9909 |
| 0.1270 | 3.0 | 120 | 0.1261 | 0.9916 / 0.9931 / 0.9923 | 0.9711 / 0.9913 / 0.9811 | 0.9906 / 0.9813 / 0.9859 | 0.8130 / 0.8086 / 0.8108 | 0.9759 | 0.9808 | 0.9784 | 0.9935 |
| 0.1204 | 4.0 | 160 | 0.1278 | 0.9866 / 0.9954 / 0.9910 | 0.9737 / 0.9933 / 0.9834 | 0.9976 / 0.9813 / 0.9894 | 0.7990 / 0.8248 / 0.8117 | 0.9727 | 0.9837 | 0.9782 | 0.9934 |
| 0.1176 | 5.0 | 200 | 0.1246 | 0.9916 / 0.9936 / 0.9926 | 0.9692 / 0.9913 / 0.9801 | 0.9953 / 0.9836 / 0.9894 | 0.8051 / 0.8571 / 0.8303 | 0.9746 | 0.9842 | 0.9794 | 0.9939 |
| 0.1151 | 6.0 | 240 | 0.1246 | 0.9916 / 0.9949 / 0.9932 | 0.9737 / 0.9920 / 0.9827 | 0.9976 / 0.9813 / 0.9894 | 0.8381 / 0.8652 / 0.8515 | 0.9782 | 0.9855 | 0.9818 | 0.9945 |
| 0.1133 | 7.0 | 280 | 0.1246 | 0.9918 / 0.9928 / 0.9923 | 0.9763 / 0.9926 / 0.9844 | 1.0000 / 0.9860 / 0.9929 | 0.8515 / 0.8652 / 0.8583 | 0.9801 | 0.9847 | 0.9824 | 0.9946 |
| 0.1126 | 8.0 | 320 | 0.1244 | 0.9916 / 0.9944 / 0.9930 | 0.9737 / 0.9926 / 0.9831 | 1.0000 / 0.9907 / 0.9953 | 0.8424 / 0.8787 / 0.8602 | 0.9786 | 0.9868 | 0.9826 | 0.9947 |
| 0.1106 | 9.0 | 360 | 0.1245 | 0.9906 / 0.9949 / 0.9927 | 0.9833 / 0.9873 / 0.9853 | 1.0000 / 0.9907 / 0.9953 | 0.9107 / 0.8518 / 0.8802 | 0.9850 | 0.9842 | 0.9846 | 0.9953 |
| 0.1104 | 10.0 | 400 | 0.1252 | 0.9918 / 0.9936 / 0.9927 | 0.9736 / 0.9899 / 0.9817 | 0.9976 / 0.9907 / 0.9941 | 0.8886 / 0.8598 / 0.8740 | 0.9818 | 0.9845 | 0.9832 | 0.9949 |
| 0.1094 | 11.0 | 440 | 0.1257 | 0.9918 / 0.9939 / 0.9928 | 0.9754 / 0.9852 / 0.9803 | 1.0000 / 0.9907 / 0.9953 | 0.8849 / 0.8706 / 0.8777 | 0.9821 | 0.9842 | 0.9832 | 0.9949 |
| 0.1100 | 12.0 | 480 | 0.1261 | 0.9908 / 0.9949 / 0.9928 | 0.9743 / 0.9906 / 0.9824 | 1.0000 / 0.9907 / 0.9953 | 0.8660 / 0.8706 / 0.8683 | 0.9800 | 0.9861 | 0.9830 | 0.9948 |
| 0.1088 | 13.0 | 520 | 0.1256 | 0.9916 / 0.9939 / 0.9927 | 0.9756 / 0.9906 / 0.9830 | 1.0000 / 0.9907 / 0.9953 | 0.8704 / 0.8868 / 0.8785 | 0.9809 | 0.9864 | 0.9837 | 0.9950 |
| 0.1084 | 14.0 | 560 | 0.1255 | 0.9911 / 0.9939 / 0.9925 | 0.9756 / 0.9913 / 0.9834 | 1.0000 / 0.9907 / 0.9953 | 0.8661 / 0.8895 / 0.8777 | 0.9803 | 0.9868 | 0.9835 | 0.9950 |
| 0.1079 | 15.0 | 600 | 0.1252 | 0.9908 / 0.9936 / 0.9922 | 0.9762 / 0.9906 / 0.9834 | 1.0000 / 0.9883 / 0.9941 | 0.9030 / 0.8787 / 0.8907 | 0.9828 | 0.9856 | 0.9842 | 0.9952 |
| 0.1077 | 16.0 | 640 | 0.1258 | 0.9900 / 0.9931 / 0.9916 | 0.9756 / 0.9913 / 0.9834 | 0.9976 / 0.9907 / 0.9941 | 0.8886 / 0.8814 / 0.8850 | 0.9811 | 0.9858 | 0.9834 | 0.9950 |
| 0.1077 | 17.0 | 680 | 0.1265 | 0.9908 / 0.9931 / 0.9919 | 0.9756 / 0.9906 / 0.9830 | 0.9976 / 0.9907 / 0.9941 | 0.8797 / 0.8868 / 0.8832 | 0.9809 | 0.9860 | 0.9834 | 0.9950 |
| 0.1075 | 18.0 | 720 | 0.1262 | 0.9916 / 0.9933 / 0.9925 | 0.9750 / 0.9933 / 0.9841 | 0.9976 / 0.9907 / 0.9941 | 0.8859 / 0.8787 / 0.8823 | 0.9817 | 0.9863 | 0.9840 | 0.9951 |
| 0.1075 | 19.0 | 760 | 0.1262 | 0.9916 / 0.9939 / 0.9927 | 0.9750 / 0.9926 / 0.9837 | 0.9976 / 0.9907 / 0.9941 | 0.8824 / 0.8895 / 0.8859 | 0.9814 | 0.9871 | 0.9842 | 0.9952 |
| 0.1074 | 20.0 | 800 | 0.1261 | 0.9916 / 0.9939 / 0.9927 | 0.9750 / 0.9926 / 0.9837 | 0.9976 / 0.9907 / 0.9941 | 0.8800 / 0.8895 / 0.8847 | 0.9812 | 0.9871 | 0.9842 | 0.9952 |
## Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3