# layoutlm-sroie-v2
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased). The auto-generated card does not name the training dataset ("the None dataset"), but the model name and the per-entity support of 347 point to the SROIE receipt key-information-extraction benchmark. It achieves the following results on the evaluation set:
- Loss: 0.0338
- Overall Precision: 0.9125
- Overall Recall: 0.9395
- Overall F1: 0.9258
- Overall Accuracy: 0.9944

Per-entity results (span-level):

| Entity | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Address | 0.8966 | 0.9251 | 0.9106 | 347 |
| Company | 0.9098 | 0.9597 | 0.9341 | 347 |
| Date | 0.9885 | 0.9914 | 0.9899 | 347 |
| Total | 0.8571 | 0.8818 | 0.8693 | 347 |
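The overall figures are the micro-average of the per-entity counts. As a sanity check (not code from this card), they can be reproduced from the per-entity precision, recall, and support alone:

```python
# Recover the overall micro-averaged precision/recall/F1 from the
# per-entity precision, recall, and support reported above.
per_entity = {
    # name: (precision, recall, support)
    "address": (0.8966480446927374, 0.9250720461095101, 347),
    "company": (0.9098360655737705, 0.9596541786743515, 347),
    "date":    (0.9885057471264368, 0.9913544668587896, 347),
    "total":   (0.8571428571428571, 0.8818443804034583, 347),
}

tp = pred = gold = 0
for p, r, n in per_entity.values():
    e_tp = round(r * n)       # true positives for this entity
    tp += e_tp
    pred += round(e_tp / p)   # spans the model predicted for this entity
    gold += n                 # gold spans (support)

precision = tp / pred             # ≈ 0.9125
recall = tp / gold                # ≈ 0.9395
f1 = 2 * tp / (pred + gold)       # ≈ 0.9258
```

The recomputed values match the reported Overall Precision/Recall/F1 to four decimal places.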
## Model description
More information needed
## Intended uses & limitations
More information needed
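No usage snippet ships with this card. As a sketch of how a LayoutLM token-classification checkpoint is typically called (the `repo_id` below is a placeholder for this model's hub path, and the one-box-per-wordpiece alignment is the standard LayoutLM preprocessing convention, not something stated in the card):

```python
def align_boxes(words, word_boxes, tokenize,
                cls_box=(0, 0, 0, 0), sep_box=(1000, 1000, 1000, 1000)):
    """LayoutLM expects one bounding box per wordpiece: repeat each word's
    box for every subword it splits into, plus boxes for [CLS] and [SEP].
    Boxes are (x0, y0, x1, y1), normalized to a 0-1000 page grid."""
    boxes = [list(cls_box)]
    for word, box in zip(words, word_boxes):
        boxes.extend([list(box)] * len(tokenize(word)))
    boxes.append(list(sep_box))
    return boxes


def predict(words, word_boxes, repo_id="layoutlm-sroie-v2"):
    # Hypothetical wiring, not from the card; requires `transformers` + `torch`.
    import torch
    from transformers import LayoutLMForTokenClassification, LayoutLMTokenizer

    tokenizer = LayoutLMTokenizer.from_pretrained(repo_id)
    model = LayoutLMForTokenClassification.from_pretrained(repo_id)
    enc = tokenizer(" ".join(words), return_tensors="pt")
    bbox = torch.tensor([align_boxes(words, word_boxes, tokenizer.tokenize)])
    with torch.no_grad():
        logits = model(bbox=bbox, **enc).logits
    return logits.argmax(-1).squeeze().tolist()  # one label id per wordpiece
```

Words and boxes would come from an OCR pass over the receipt image; mapping predicted label ids back to ADDRESS/COMPANY/DATE/TOTAL requires the label list this checkpoint was trained with.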
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
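Together with the 600 total optimizer steps in the results table, these values pin down the approximate training-set size. A back-of-the-envelope check (the SROIE connection is an inference, not stated in the card):

```python
# Nothing here is stated directly in the card except the inputs.
num_epochs = 15
train_batch_size = 16
total_steps = 600  # final "Step" value in the results table

steps_per_epoch = total_steps // num_epochs               # 40 steps/epoch
max_train_examples = steps_per_epoch * train_batch_size   # at most 640

# With 40 batches of size 16 per epoch, the train split holds between
# 625 and 640 examples -- consistent with SROIE's 626-receipt train set.
min_train_examples = (steps_per_epoch - 1) * train_batch_size + 1  # 625
```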
### Training results
Per-entity columns report span-level F1 (support is 347 for every field at every epoch; full precision/recall per entity for the final model are given above).

| Training Loss | Epoch | Step | Validation Loss | Address F1 | Company F1 | Date F1 | Total F1 | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.5199 | 1.0 | 40 | 0.1098 | 0.8958 | 0.7852 | 0.7578 | 0.0000 | 0.7577 | 0.6535 | 0.7017 | 0.9688 |
| 0.0730 | 2.0 | 80 | 0.0471 | 0.8945 | 0.8919 | 0.9267 | 0.6265 | 0.7756 | 0.8890 | 0.8285 | 0.9874 |
| 0.0395 | 3.0 | 120 | 0.0368 | 0.8955 | 0.9174 | 0.9741 | 0.7312 | 0.8522 | 0.9056 | 0.8781 | 0.9909 |
| 0.0278 | 4.0 | 160 | 0.0317 | 0.9117 | 0.9395 | 0.9842 | 0.7458 | 0.8621 | 0.9236 | 0.8918 | 0.9925 |
| 0.0221 | 5.0 | 200 | 0.0303 | 0.8839 | 0.9356 | 0.9884 | 0.7646 | 0.8628 | 0.9200 | 0.8905 | 0.9924 |
| 0.0178 | 6.0 | 240 | 0.0307 | 0.9145 | 0.9369 | 0.9871 | 0.8211 | 0.8967 | 0.9323 | 0.9142 | 0.9935 |
| 0.0151 | 7.0 | 280 | 0.0290 | 0.9148 | 0.9403 | 0.9885 | 0.8359 | 0.9110 | 0.9287 | 0.9197 | 0.9941 |
| 0.0128 | 8.0 | 320 | 0.0305 | 0.9145 | 0.9392 | 0.9785 | 0.8362 | 0.9041 | 0.9301 | 0.9169 | 0.9940 |
| 0.0112 | 9.0 | 360 | 0.0333 | 0.9246 | 0.9501 | 0.9855 | 0.8359 | 0.9327 | 0.9179 | 0.9252 | 0.9940 |
| 0.0111 | 10.0 | 400 | 0.0318 | 0.9068 | 0.9420 | 0.9914 | 0.8534 | 0.9163 | 0.9308 | 0.9235 | 0.9941 |
| 0.0089 | 11.0 | 440 | 0.0304 | 0.9078 | 0.9463 | 0.9914 | 0.8654 | 0.9206 | 0.9352 | 0.9278 | 0.9947 |
| 0.0081 | 12.0 | 480 | 0.0320 | 0.9202 | 0.9435 | 0.9914 | 0.8580 | 0.9176 | 0.9388 | 0.9281 | 0.9945 |
| 0.0071 | 13.0 | 520 | 0.0326 | 0.9202 | 0.9394 | 0.9914 | 0.8608 | 0.9170 | 0.9388 | 0.9277 | 0.9944 |
| 0.0071 | 14.0 | 560 | 0.0338 | 0.9106 | 0.9341 | 0.9899 | 0.8678 | 0.9148 | 0.9366 | 0.9256 | 0.9943 |
| 0.0066 | 15.0 | 600 | 0.0338 | 0.9106 | 0.9341 | 0.9899 | 0.8693 | 0.9125 | 0.9395 | 0.9258 | 0.9944 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3