layoutlmv2-finetuned-cord_500

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on the cord-layoutlmv3 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2207

Per-label results (Number = count of gold entities in the evaluation set, i.e. support):

| Label | Precision | Recall | F1 | Number |
|:---|:---|:---|:---|:---|
| Menu.cnt | 1.0 | 0.9867 | 0.9933 | 225 |
| Menu.discountprice | 0.8889 | 0.8 | 0.8421 | 10 |
| Menu.etc | 0.0 | 0.0 | 0.0 | 3 |
| Menu.itemsubtotal | 0.0 | 0.0 | 0.0 | 6 |
| Menu.nm | 0.9764 | 0.9880 | 0.9822 | 251 |
| Menu.num | 0.8462 | 1.0 | 0.9167 | 11 |
| Menu.price | 0.9723 | 0.9919 | 0.9820 | 248 |
| Menu.sub Cnt | 0.85 | 1.0 | 0.9189 | 17 |
| Menu.sub Nm | 0.8421 | 0.8649 | 0.8533 | 37 |
| Menu.sub Price | 0.95 | 0.95 | 0.9500 | 20 |
| Menu.unitprice | 0.9855 | 0.9855 | 0.9855 | 69 |
| Sub Total.discount Price | 0.8571 | 0.8571 | 0.8571 | 7 |
| Sub Total.etc | 0.9231 | 0.9231 | 0.9231 | 13 |
| Sub Total.service Price | 1.0 | 1.0 | 1.0 | 12 |
| Sub Total.subtotal Price | 0.9714 | 0.9855 | 0.9784 | 69 |
| Sub Total.tax Price | 1.0 | 1.0 | 1.0 | 47 |
| Total.cashprice | 1.0 | 0.9167 | 0.9565 | 72 |
| Total.changeprice | 0.9672 | 1.0 | 0.9833 | 59 |
| Total.creditcardprice | 1.0 | 0.9412 | 0.9697 | 17 |
| Total.emoneyprice | 0.1667 | 0.5 | 0.25 | 2 |
| Total.menuqty Cnt | 0.9667 | 1.0 | 0.9831 | 29 |
| Total.menutype Cnt | 1.0 | 0.7143 | 0.8333 | 7 |
| Total.total Etc | 0.0 | 0.0 | 0.0 | 4 |
| Total.total Price | 0.9709 | 0.9901 | 0.9804 | 101 |

  • Overall Precision: 0.9627
  • Overall Recall: 0.9671
  • Overall F1: 0.9649
  • Overall Accuracy: 0.9690
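
Per-label precision/recall/F1/Number figures like the ones above are the standard output of entity-level (seqeval-style) evaluation. The sketch below shows how such numbers are typically produced with the metric API matching the Datasets 2.4.0 version pinned at the end of this card; the actual evaluation code is not included in the card, and the tag strings are illustrative placeholders.

```python
# Hedged sketch: entity-level metrics via seqeval (datasets.load_metric matches
# the pinned Datasets 2.4.0 API). The label strings are illustrative, not taken
# from this card.
from datasets import load_metric

metric = load_metric("seqeval")

predictions = [["B-MENU.NM", "I-MENU.NM", "O", "B-MENU.PRICE"]]  # hypothetical model output
references = [["B-MENU.NM", "I-MENU.NM", "O", "B-MENU.PRICE"]]   # hypothetical gold labels

results = metric.compute(predictions=predictions, references=references)
# `results` holds one dict per entity type ("precision", "recall", "f1", "number")
# plus overall_precision, overall_recall, overall_f1, and overall_accuracy.
print(results["overall_f1"])
```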

Model description

More information is needed; the card does not describe the checkpoint beyond its header. In brief, LayoutLMv2 is a multimodal Transformer that combines text, layout (bounding boxes), and image features for document understanding, and this checkpoint adds a token-classification head fine-tuned to tag receipt entities from the CORD label set.

Intended uses & limitations

More information is needed. From the card itself, the checkpoint is set up for token classification (key information extraction) on receipt images such as those in CORD; a minimal inference sketch follows.
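
The following is a hedged sketch of basic inference, not the authors' exact pipeline. It assumes the fine-tuned checkpoint is available under "layoutlmv2-finetuned-cord_500" (local path or Hub id), and that detectron2 and pytesseract are installed, which LayoutLMv2 requires for its visual backbone and built-in OCR.

```python
# Hedged inference sketch; checkpoint path and input file are placeholders.
from PIL import Image
from transformers import LayoutLMv2ForTokenClassification, LayoutLMv2Processor

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("layoutlmv2-finetuned-cord_500")

image = Image.open("receipt.png").convert("RGB")  # hypothetical input image

# The processor runs OCR on the image (apply_ocr=True by default) and builds
# input_ids, bounding boxes, and the resized image tensor in one call.
encoding = processor(image, return_tensors="pt", truncation=True)

outputs = model(**encoding)
predicted_ids = outputs.logits.argmax(-1).squeeze().tolist()

# Map predicted class ids back to the label names stored in the model config.
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```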

Training and evaluation data

More information is needed; the card only names the dataset. CORD is the Consolidated Receipt Dataset for post-OCR parsing, consisting of photographed Indonesian receipts annotated with entity labels (the menu, sub_total, and total fields reported above). A loading sketch follows.
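
A minimal sketch for loading the data. The card gives only the short name "cord-layoutlmv3"; the Hub id below is an assumption and may need adjusting.

```python
# Hedged sketch: the Hub id "nielsr/cord-layoutlmv3" is an assumption.
from datasets import load_dataset

dataset = load_dataset("nielsr/cord-layoutlmv3")
print(dataset)                    # expected: train/validation/test splits
print(dataset["train"].features)  # words, bounding boxes, ner_tags, image
```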

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 3000
  • mixed_precision_training: Native AMP
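
Expressed as a Transformers TrainingArguments object, the list above maps roughly to the sketch below. This is an assumption about the setup, not the authors' actual training script; output_dir is a placeholder, and the Adam betas/epsilon shown are the library defaults, matching the listed values.

```python
# Hedged sketch: the listed hyperparameters as TrainingArguments (Transformers 4.21).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv2-finetuned-cord_500",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=3000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```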

Training results

Per-label columns are abbreviated: P = Precision, R = Recall, N = Number (support).

| Training Loss | Epoch | Step | Validation Loss | Menu.cnt P | Menu.cnt R | Menu.cnt F1 | Menu.cnt N | Menu.discountprice P | Menu.discountprice R | Menu.discountprice F1 | Menu.discountprice N | Menu.etc P | Menu.etc R | Menu.etc F1 | Menu.etc N | Menu.itemsubtotal P | Menu.itemsubtotal R | Menu.itemsubtotal F1 | Menu.itemsubtotal N | Menu.nm P | Menu.nm R | Menu.nm F1 | Menu.nm N | Menu.num P | Menu.num R | Menu.num F1 | Menu.num N | Menu.price P | Menu.price R | Menu.price F1 | Menu.price N | Menu.sub Cnt P | Menu.sub Cnt R | Menu.sub Cnt F1 | Menu.sub Cnt N | Menu.sub Nm P | Menu.sub Nm R | Menu.sub Nm F1 | Menu.sub Nm N | Menu.sub Price P | Menu.sub Price R | Menu.sub Price F1 | Menu.sub Price N | Menu.unitprice P | Menu.unitprice R | Menu.unitprice F1 | Menu.unitprice N | Sub Total.discount Price P | Sub Total.discount Price R | Sub Total.discount Price F1 | Sub Total.discount Price N | Sub Total.etc P | Sub Total.etc R | Sub Total.etc F1 | Sub Total.etc N | Sub Total.service Price P | Sub Total.service Price R | Sub Total.service Price F1 | Sub Total.service Price N | Sub Total.subtotal Price P | Sub Total.subtotal Price R | Sub Total.subtotal Price F1 | Sub Total.subtotal Price N | Sub Total.tax Price P | Sub Total.tax Price R | Sub Total.tax Price F1 | Sub Total.tax Price N | Total.cashprice P | Total.cashprice R | Total.cashprice F1 | Total.cashprice N | Total.changeprice P | Total.changeprice R | Total.changeprice F1 | Total.changeprice N | Total.creditcardprice P | Total.creditcardprice R | Total.creditcardprice F1 | Total.creditcardprice N | Total.emoneyprice P | Total.emoneyprice R | Total.emoneyprice F1 | Total.emoneyprice N | Total.menuqty Cnt P | Total.menuqty Cnt R | Total.menuqty Cnt F1 | Total.menuqty Cnt N | Total.menutype Cnt P | Total.menutype Cnt R | Total.menutype Cnt F1 | Total.menutype Cnt N | Total.total Etc P | Total.total Etc R | Total.total Etc F1 | Total.total Etc N | Total.total Price P | Total.total Price R | Total.total Price F1 | Total.total Price N | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| No log | 2.0 | 250 | 2.5018 | 0.85 | 0.9822 | 0.9113 | 225 | 0.0 | 0.0 | 0.0 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.8257 | 1.0 | 0.9045 | 251 | 0.0 | 0.0 | 0.0 | 11 | 0.8746 | 0.9839 | 0.9260 | 248 | 0.0 | 0.0 | 0.0 | 17 | 0.0 | 0.0 | 0.0 | 37 | 0.0 | 0.0 | 0.0 | 20 | 0.9296 | 0.9565 | 0.9429 | 69 | 0.0 | 0.0 | 0.0 | 7 | 0.0 | 0.0 | 0.0 | 13 | 0.0 | 0.0 | 0.0 | 12 | 0.8108 | 0.8696 | 0.8392 | 69 | 0.4719 | 0.8936 | 0.6176 | 47 | 0.7683 | 0.875 | 0.8182 | 72 | 0.8 | 0.8814 | 0.8387 | 59 | 0.0 | 0.0 | 0.0 | 17 | 0.0 | 0.0 | 0.0 | 2 | 0.4138 | 0.4138 | 0.4138 | 29 | 0.0 | 0.0 | 0.0 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.7323 | 0.9208 | 0.8158 | 101 | 0.7983 | 0.8263 | 0.8121 | 0.8255 |
| 2.6537 | 4.0 | 500 | 1.3952 | 0.8805 | 0.9822 | 0.9286 | 225 | 0.7273 | 0.8 | 0.7619 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.8817 | 0.9801 | 0.9283 | 251 | 1.0 | 1.0 | 1.0 | 11 | 0.8547 | 0.9960 | 0.9199 | 248 | 0.0 | 0.0 | 0.0 | 17 | 0.4444 | 0.1081 | 0.1739 | 37 | 0.0 | 0.0 | 0.0 | 20 | 0.8608 | 0.9855 | 0.9189 | 69 | 0.0 | 0.0 | 0.0 | 7 | 0.0 | 0.0 | 0.0 | 13 | 0.3438 | 0.9167 | 0.5 | 12 | 0.8919 | 0.9565 | 0.9231 | 69 | 0.88 | 0.9362 | 0.9072 | 47 | 1.0 | 0.875 | 0.9333 | 72 | 0.9483 | 0.9322 | 0.9402 | 59 | 0.6522 | 0.8824 | 0.75 | 17 | 0.0 | 0.0 | 0.0 | 2 | 0.8286 | 1.0 | 0.9062 | 29 | 0.0 | 0.0 | 0.0 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9684 | 0.9109 | 0.9388 | 101 | 0.8632 | 0.8832 | 0.8731 | 0.8947 |
| 2.6537 | 6.0 | 750 | 0.7646 | 0.9170 | 0.9822 | 0.9485 | 225 | 0.5556 | 0.5 | 0.5263 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9537 | 0.9841 | 0.9686 | 251 | 1.0 | 1.0 | 1.0 | 11 | 0.9385 | 0.9839 | 0.9606 | 248 | 0.0 | 0.0 | 0.0 | 17 | 0.8 | 0.8649 | 0.8312 | 37 | 1.0 | 0.55 | 0.7097 | 20 | 0.9306 | 0.9710 | 0.9504 | 69 | 0.75 | 0.8571 | 0.8000 | 7 | 0.6667 | 0.7692 | 0.7143 | 13 | 0.8571 | 1.0 | 0.9231 | 12 | 0.9067 | 0.9855 | 0.9444 | 69 | 0.9787 | 0.9787 | 0.9787 | 47 | 1.0 | 0.9167 | 0.9565 | 72 | 0.9516 | 1.0 | 0.9752 | 59 | 0.7619 | 0.9412 | 0.8421 | 17 | 0.0 | 0.0 | 0.0 | 2 | 0.7632 | 1.0 | 0.8657 | 29 | 0.0 | 0.0 | 0.0 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.97 | 0.9604 | 0.9652 | 101 | 0.9244 | 0.9334 | 0.9289 | 0.9435 |
| 0.8368 | 8.0 | 1000 | 0.4986 | 0.9567 | 0.9822 | 0.9693 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9764 | 0.9880 | 0.9822 | 251 | 0.7333 | 1.0 | 0.8462 | 11 | 0.9648 | 0.9960 | 0.9802 | 248 | 1.0 | 0.6471 | 0.7857 | 17 | 0.8718 | 0.9189 | 0.8947 | 37 | 1.0 | 0.85 | 0.9189 | 20 | 0.9718 | 1.0 | 0.9857 | 69 | 0.5556 | 0.7143 | 0.6250 | 7 | 0.8889 | 0.6154 | 0.7273 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.8831 | 0.9855 | 0.9315 | 69 | 1.0 | 0.9787 | 0.9892 | 47 | 1.0 | 0.8889 | 0.9412 | 72 | 0.9831 | 0.9831 | 0.9831 | 59 | 0.5333 | 0.9412 | 0.6809 | 17 | 0.0 | 0.0 | 0.0 | 2 | 0.8056 | 1.0 | 0.8923 | 29 | 0.0 | 0.0 | 0.0 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9694 | 0.9406 | 0.9548 | 101 | 0.9420 | 0.9484 | 0.9452 | 0.9520 |
| 0.8368 | 10.0 | 1250 | 0.3597 | 0.9528 | 0.9867 | 0.9694 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9688 | 0.9880 | 0.9783 | 251 | 0.7333 | 1.0 | 0.8462 | 11 | 0.9462 | 0.9919 | 0.9685 | 248 | 1.0 | 0.5294 | 0.6923 | 17 | 0.825 | 0.8919 | 0.8571 | 37 | 1.0 | 0.65 | 0.7879 | 20 | 0.9718 | 1.0 | 0.9857 | 69 | 1.0 | 1.0 | 1.0 | 7 | 0.8667 | 1.0 | 0.9286 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9324 | 1.0 | 0.9650 | 69 | 1.0 | 0.9787 | 0.9892 | 47 | 1.0 | 0.9306 | 0.9640 | 72 | 0.9516 | 1.0 | 0.9752 | 59 | 0.8889 | 0.9412 | 0.9143 | 17 | 0.25 | 0.5 | 0.3333 | 2 | 0.9667 | 1.0 | 0.9831 | 29 | 1.0 | 0.7143 | 0.8333 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9898 | 0.9604 | 0.9749 | 101 | 0.9524 | 0.9581 | 0.9552 | 0.9660 |
| 0.3287 | 12.0 | 1500 | 0.3021 | 0.9864 | 0.9644 | 0.9753 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9839 | 0.9761 | 0.98 | 251 | 0.7333 | 1.0 | 0.8462 | 11 | 0.9755 | 0.9637 | 0.9696 | 248 | 0.7727 | 1.0 | 0.8718 | 17 | 0.7556 | 0.9189 | 0.8293 | 37 | 0.7917 | 0.95 | 0.8636 | 20 | 0.9855 | 0.9855 | 0.9855 | 69 | 1.0 | 1.0 | 1.0 | 7 | 0.8667 | 1.0 | 0.9286 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.8947 | 0.9855 | 0.9379 | 69 | 1.0 | 0.9787 | 0.9892 | 47 | 1.0 | 0.9306 | 0.9640 | 72 | 0.9516 | 1.0 | 0.9752 | 59 | 0.8889 | 0.9412 | 0.9143 | 17 | 0.5 | 1.0 | 0.6667 | 2 | 0.9667 | 1.0 | 0.9831 | 29 | 1.0 | 0.7143 | 0.8333 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9802 | 0.9802 | 0.9802 | 101 | 0.9553 | 0.9588 | 0.9570 | 0.9652 |
| 0.3287 | 14.0 | 1750 | 0.2756 | 0.9825 | 0.9956 | 0.9890 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9650 | 0.9880 | 0.9764 | 251 | 0.9167 | 1.0 | 0.9565 | 11 | 0.9762 | 0.9919 | 0.9840 | 248 | 1.0 | 0.8824 | 0.9375 | 17 | 0.8889 | 0.8649 | 0.8767 | 37 | 0.95 | 0.95 | 0.9500 | 20 | 0.9855 | 0.9855 | 0.9855 | 69 | 0.875 | 1.0 | 0.9333 | 7 | 0.9091 | 0.7692 | 0.8333 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9189 | 0.9855 | 0.9510 | 69 | 1.0 | 0.9787 | 0.9892 | 47 | 1.0 | 0.9306 | 0.9640 | 72 | 0.9516 | 1.0 | 0.9752 | 59 | 0.9412 | 0.9412 | 0.9412 | 17 | 0.3333 | 0.5 | 0.4 | 2 | 0.9667 | 1.0 | 0.9831 | 29 | 1.0 | 0.7143 | 0.8333 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9612 | 0.9802 | 0.9706 | 101 | 0.9648 | 0.9656 | 0.9652 | 0.9656 |
| 0.1835 | 16.0 | 2000 | 0.2440 | 0.9955 | 0.9867 | 0.9911 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9688 | 0.9880 | 0.9783 | 251 | 0.9167 | 1.0 | 0.9565 | 11 | 0.9762 | 0.9919 | 0.9840 | 248 | 0.85 | 1.0 | 0.9189 | 17 | 0.8684 | 0.8919 | 0.88 | 37 | 1.0 | 0.95 | 0.9744 | 20 | 0.9853 | 0.9710 | 0.9781 | 69 | 1.0 | 1.0 | 1.0 | 7 | 0.9286 | 1.0 | 0.9630 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9444 | 0.9855 | 0.9645 | 69 | 1.0 | 0.9787 | 0.9892 | 47 | 0.9851 | 0.9167 | 0.9496 | 72 | 0.9672 | 1.0 | 0.9833 | 59 | 0.9412 | 0.9412 | 0.9412 | 17 | 0.4 | 1.0 | 0.5714 | 2 | 0.9667 | 1.0 | 0.9831 | 29 | 1.0 | 0.7143 | 0.8333 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9712 | 1.0 | 0.9854 | 101 | 0.9679 | 0.9693 | 0.9686 | 0.9720 |
| 0.1835 | 18.0 | 2250 | 0.2300 | 0.9912 | 0.9956 | 0.9933 | 225 | 0.8 | 0.8 | 0.8000 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9764 | 0.9880 | 0.9822 | 251 | 0.7857 | 1.0 | 0.88 | 11 | 0.9762 | 0.9919 | 0.9840 | 248 | 0.9444 | 1.0 | 0.9714 | 17 | 0.8205 | 0.8649 | 0.8421 | 37 | 0.95 | 0.95 | 0.9500 | 20 | 0.9855 | 0.9855 | 0.9855 | 69 | 0.8571 | 0.8571 | 0.8571 | 7 | 0.9231 | 0.9231 | 0.9231 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9577 | 0.9855 | 0.9714 | 69 | 1.0 | 1.0 | 1.0 | 47 | 1.0 | 0.9028 | 0.9489 | 72 | 0.9672 | 1.0 | 0.9833 | 59 | 0.9412 | 0.9412 | 0.9412 | 17 | 0.1667 | 0.5 | 0.25 | 2 | 0.9667 | 1.0 | 0.9831 | 29 | 1.0 | 0.7143 | 0.8333 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9709 | 0.9901 | 0.9804 | 101 | 0.9628 | 0.9678 | 0.9653 | 0.9690 |
| 0.1239 | 20.0 | 2500 | 0.2151 | 1.0 | 0.9867 | 0.9933 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9724 | 0.9841 | 0.9782 | 251 | 0.8462 | 1.0 | 0.9167 | 11 | 0.98 | 0.9879 | 0.9839 | 248 | 0.85 | 1.0 | 0.9189 | 17 | 0.85 | 0.9189 | 0.8831 | 37 | 0.8636 | 0.95 | 0.9048 | 20 | 0.9855 | 0.9855 | 0.9855 | 69 | 0.8571 | 0.8571 | 0.8571 | 7 | 0.9231 | 0.9231 | 0.9231 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9444 | 0.9855 | 0.9645 | 69 | 1.0 | 0.9787 | 0.9892 | 47 | 1.0 | 0.9167 | 0.9565 | 72 | 0.9672 | 1.0 | 0.9833 | 59 | 0.9412 | 0.9412 | 0.9412 | 17 | 0.1667 | 0.5 | 0.25 | 2 | 1.0 | 0.9655 | 0.9825 | 29 | 0.8571 | 0.8571 | 0.8571 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9709 | 0.9901 | 0.9804 | 101 | 0.9620 | 0.9663 | 0.9642 | 0.9690 |
| 0.1239 | 22.0 | 2750 | nan | 1.0 | 0.9778 | 0.9888 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9723 | 0.9801 | 0.9762 | 251 | 0.8462 | 1.0 | 0.9167 | 11 | 0.9721 | 0.9839 | 0.9780 | 248 | 0.85 | 1.0 | 0.9189 | 17 | 0.8611 | 0.8378 | 0.8493 | 37 | 0.95 | 0.95 | 0.9500 | 20 | 0.9851 | 0.9565 | 0.9706 | 69 | 0.8333 | 0.7143 | 0.7692 | 7 | 0.9231 | 0.9231 | 0.9231 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9710 | 0.9710 | 0.9710 | 69 | 1.0 | 1.0 | 1.0 | 47 | 1.0 | 0.9028 | 0.9489 | 72 | 0.9667 | 0.9831 | 0.9748 | 59 | 1.0 | 0.9412 | 0.9697 | 17 | 0.1667 | 0.5 | 0.25 | 2 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 0.8571 | 0.9231 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9706 | 0.9802 | 0.9754 | 101 | 0.9624 | 0.9573 | 0.9598 | 0.9575 |
| 0.1008 | 24.0 | 3000 | 0.2207 | 1.0 | 0.9867 | 0.9933 | 225 | 0.8889 | 0.8 | 0.8421 | 10 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 6 | 0.9764 | 0.9880 | 0.9822 | 251 | 0.8462 | 1.0 | 0.9167 | 11 | 0.9723 | 0.9919 | 0.9820 | 248 | 0.85 | 1.0 | 0.9189 | 17 | 0.8421 | 0.8649 | 0.8533 | 37 | 0.95 | 0.95 | 0.9500 | 20 | 0.9855 | 0.9855 | 0.9855 | 69 | 0.8571 | 0.8571 | 0.8571 | 7 | 0.9231 | 0.9231 | 0.9231 | 13 | 1.0 | 1.0 | 1.0 | 12 | 0.9714 | 0.9855 | 0.9784 | 69 | 1.0 | 1.0 | 1.0 | 47 | 1.0 | 0.9167 | 0.9565 | 72 | 0.9672 | 1.0 | 0.9833 | 59 | 1.0 | 0.9412 | 0.9697 | 17 | 0.1667 | 0.5 | 0.25 | 2 | 0.9667 | 1.0 | 0.9831 | 29 | 1.0 | 0.7143 | 0.8333 | 7 | 0.0 | 0.0 | 0.0 | 4 | 0.9709 | 0.9901 | 0.9804 | 101 | 0.9627 | 0.9671 | 0.9649 | 0.9690 |

Framework versions

  • Transformers 4.21.2
  • Pytorch 1.10.0+cu111
  • Datasets 2.4.0
  • Tokenizers 0.12.1