---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
  results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

- Loss: 0.7085
- Answer: {'precision': 0.721081081081081, 'recall': 0.8244746600741656, 'f1': 0.7693194925028833, 'number': 809}
- Header: {'precision': 0.3252032520325203, 'recall': 0.33613445378151263, 'f1': 0.3305785123966942, 'number': 119}
- Question: {'precision': 0.7871772039180766, 'recall': 0.8300469483568075, 'f1': 0.8080438756855575, 'number': 1065}
- Overall Precision: 0.7328
- Overall Recall: 0.7983
- Overall F1: 0.7642
- Overall Accuracy: 0.8112
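Since the description sections below are still stubs, here is a minimal, hedged usage sketch for the checkpoint. The Hub repo id, the example words, and the bounding boxes are illustrative assumptions, not taken from this card; LayoutLM expects OCR boxes normalized to a 0-1000 page grid.

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

# Hypothetical Hub id for this checkpoint; replace with the actual repo path.
model_id = "sachin18/layoutlm-funsd"
tokenizer = LayoutLMTokenizerFast.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# Example OCR output: words plus boxes normalized to a 0-1000 grid.
# Both are made-up values for illustration, not taken from FUNSD.
words = ["Date:", "2024-01-01"]
boxes = [[57, 82, 127, 96], [132, 82, 245, 96]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# LayoutLM needs one box per token: repeat each word's box for its subword
# pieces, and use a zero box for the special tokens ([CLS]/[SEP]).
token_boxes = [boxes[i] if i is not None else [0, 0, 0, 0]
               for i in encoding.word_ids()]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
predicted = [model.config.id2label[i] for i in logits.argmax(-1).squeeze().tolist()]
```

The box-per-token alignment mirrors the usual FUNSD preprocessing: subword pieces inherit the bounding box of their source word.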

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
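The card does not describe the data, but the frontmatter pins it to FUNSD (noisy scanned forms annotated with question/answer/header entities). A loading sketch, assuming the commonly used `nielsr/funsd` mirror on the Hub; the exact dataset id used for this run is not stated:

```python
from datasets import load_dataset

# "nielsr/funsd" is an assumed Hub id for FUNSD; the card only says "funsd".
dataset = load_dataset("nielsr/funsd")
print(dataset)                            # train/test splits
print(dataset["train"][0]["words"][:5])   # words, with bboxes and ner_tags alongside
```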

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
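Expressed as `transformers` `TrainingArguments`, the list above corresponds roughly to the sketch below. `output_dir` is an assumption, and the Adam settings are spelled out even though they match the library defaults:

```python
from transformers import TrainingArguments

# Rough reconstruction of the run's configuration (transformers 4.41-era
# argument names); output_dir and any logging/eval cadence are assumptions.
args = TrainingArguments(
    output_dir="layoutlm-funsd",   # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                     # "Native AMP" mixed-precision training
)
```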

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.792 | 1.0 | 10 | 1.5932 | {'precision': 0.03648648648648649, 'recall': 0.03337453646477132, 'f1': 0.034861200774693346, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3541114058355438, 'recall': 0.2507042253521127, 'f1': 0.29356789444749865, 'number': 1065} | 0.1968 | 0.1475 | 0.1686 | 0.3760 |
| 1.4339 | 2.0 | 20 | 1.2410 | {'precision': 0.2177121771217712, 'recall': 0.21878862793572312, 'f1': 0.218249075215783, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.43688639551192143, 'recall': 0.5849765258215962, 'f1': 0.5002007226013649, 'number': 1065} | 0.3573 | 0.4014 | 0.3781 | 0.5877 |
| 1.0937 | 3.0 | 30 | 0.9505 | {'precision': 0.45005149330587024, 'recall': 0.5401730531520396, 'f1': 0.4910112359550562, 'number': 809} | {'precision': 0.045454545454545456, 'recall': 0.008403361344537815, 'f1': 0.014184397163120567, 'number': 119} | {'precision': 0.6046141607000796, 'recall': 0.7136150234741784, 'f1': 0.6546080964685616, 'number': 1065} | 0.5324 | 0.6011 | 0.5647 | 0.7057 |
| 0.835 | 4.0 | 40 | 0.7870 | {'precision': 0.6255274261603375, 'recall': 0.7330037082818294, 'f1': 0.6750142287990893, 'number': 809} | {'precision': 0.19298245614035087, 'recall': 0.09243697478991597, 'f1': 0.125, 'number': 119} | {'precision': 0.6779220779220779, 'recall': 0.7352112676056338, 'f1': 0.7054054054054054, 'number': 1065} | 0.6421 | 0.6959 | 0.6680 | 0.7601 |
| 0.6644 | 5.0 | 50 | 0.7063 | {'precision': 0.6771739130434783, 'recall': 0.7700865265760197, 'f1': 0.7206477732793521, 'number': 809} | {'precision': 0.2857142857142857, 'recall': 0.2184873949579832, 'f1': 0.24761904761904763, 'number': 119} | {'precision': 0.6783161239078633, 'recall': 0.8018779342723005, 'f1': 0.7349397590361446, 'number': 1065} | 0.6621 | 0.7541 | 0.7051 | 0.7872 |
| 0.5612 | 6.0 | 60 | 0.6880 | {'precision': 0.6639593908629442, 'recall': 0.8084054388133498, 'f1': 0.7290969899665551, 'number': 809} | {'precision': 0.26262626262626265, 'recall': 0.2184873949579832, 'f1': 0.23853211009174313, 'number': 119} | {'precision': 0.7401229148375769, 'recall': 0.7915492957746478, 'f1': 0.76497277676951, 'number': 1065} | 0.6851 | 0.7642 | 0.7225 | 0.7937 |
| 0.4819 | 7.0 | 70 | 0.6610 | {'precision': 0.6937697993664202, 'recall': 0.8121137206427689, 'f1': 0.7482915717539863, 'number': 809} | {'precision': 0.30097087378640774, 'recall': 0.2605042016806723, 'f1': 0.27927927927927926, 'number': 119} | {'precision': 0.7568766637089619, 'recall': 0.8009389671361502, 'f1': 0.7782846715328468, 'number': 1065} | 0.7079 | 0.7732 | 0.7391 | 0.8034 |
| 0.4299 | 8.0 | 80 | 0.6725 | {'precision': 0.6850152905198776, 'recall': 0.830655129789864, 'f1': 0.7508379888268155, 'number': 809} | {'precision': 0.2803738317757009, 'recall': 0.25210084033613445, 'f1': 0.2654867256637167, 'number': 119} | {'precision': 0.7534364261168385, 'recall': 0.8234741784037559, 'f1': 0.7868999551368328, 'number': 1065} | 0.7012 | 0.7923 | 0.7439 | 0.7950 |
| 0.3801 | 9.0 | 90 | 0.6654 | {'precision': 0.7142857142857143, 'recall': 0.8158220024721878, 'f1': 0.7616849394114252, 'number': 809} | {'precision': 0.3047619047619048, 'recall': 0.2689075630252101, 'f1': 0.28571428571428575, 'number': 119} | {'precision': 0.7697715289982425, 'recall': 0.8225352112676056, 'f1': 0.7952791647753064, 'number': 1065} | 0.7236 | 0.7868 | 0.7538 | 0.8092 |
| 0.3757 | 10.0 | 100 | 0.6709 | {'precision': 0.7082452431289641, 'recall': 0.8281829419035847, 'f1': 0.7635327635327636, 'number': 809} | {'precision': 0.34, 'recall': 0.2857142857142857, 'f1': 0.31050228310502287, 'number': 119} | {'precision': 0.7769028871391076, 'recall': 0.8338028169014085, 'f1': 0.8043478260869565, 'number': 1065} | 0.7273 | 0.7988 | 0.7614 | 0.8145 |
| 0.3165 | 11.0 | 110 | 0.6781 | {'precision': 0.723726977248104, 'recall': 0.8257107540173053, 'f1': 0.7713625866050808, 'number': 809} | {'precision': 0.3046875, 'recall': 0.3277310924369748, 'f1': 0.31578947368421056, 'number': 119} | {'precision': 0.7736842105263158, 'recall': 0.828169014084507, 'f1': 0.7999999999999999, 'number': 1065} | 0.7252 | 0.7973 | 0.7596 | 0.8077 |
| 0.2993 | 12.0 | 120 | 0.6894 | {'precision': 0.71875, 'recall': 0.8244746600741656, 'f1': 0.7679907887161773, 'number': 809} | {'precision': 0.3247863247863248, 'recall': 0.31932773109243695, 'f1': 0.3220338983050848, 'number': 119} | {'precision': 0.7823008849557522, 'recall': 0.8300469483568075, 'f1': 0.8054669703872438, 'number': 1065} | 0.7306 | 0.7973 | 0.7625 | 0.8117 |
| 0.2822 | 13.0 | 130 | 0.7039 | {'precision': 0.7195652173913043, 'recall': 0.8182941903584673, 'f1': 0.7657605552342395, 'number': 809} | {'precision': 0.3125, 'recall': 0.33613445378151263, 'f1': 0.3238866396761134, 'number': 119} | {'precision': 0.7823008849557522, 'recall': 0.8300469483568075, 'f1': 0.8054669703872438, 'number': 1065} | 0.7282 | 0.7958 | 0.7605 | 0.8095 |
| 0.2595 | 14.0 | 140 | 0.7045 | {'precision': 0.72, 'recall': 0.823238566131026, 'f1': 0.7681660899653979, 'number': 809} | {'precision': 0.3418803418803419, 'recall': 0.33613445378151263, 'f1': 0.3389830508474576, 'number': 119} | {'precision': 0.7912578055307761, 'recall': 0.8328638497652582, 'f1': 0.8115279048490394, 'number': 1065} | 0.7365 | 0.7993 | 0.7666 | 0.8118 |
| 0.2617 | 15.0 | 150 | 0.7085 | {'precision': 0.721081081081081, 'recall': 0.8244746600741656, 'f1': 0.7693194925028833, 'number': 809} | {'precision': 0.3252032520325203, 'recall': 0.33613445378151263, 'f1': 0.3305785123966942, 'number': 119} | {'precision': 0.7871772039180766, 'recall': 0.8300469483568075, 'f1': 0.8080438756855575, 'number': 1065} | 0.7328 | 0.7983 | 0.7642 | 0.8112 |
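The per-entity cells in this table match the output schema of the `seqeval` metric as exposed by the `evaluate` library, which is presumably what produced them. A toy example of that schema (labels here are illustrative only):

```python
import evaluate

seqeval = evaluate.load("seqeval")
results = seqeval.compute(
    predictions=[["B-ANSWER", "I-ANSWER", "O", "B-QUESTION"]],
    references=[["B-ANSWER", "I-ANSWER", "B-HEADER", "B-QUESTION"]],
)
# Per-entity dicts ({'precision', 'recall', 'f1', 'number'}) plus
# overall_precision / overall_recall / overall_f1 / overall_accuracy,
# the same fields reported in the table above.
print(results)
```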

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1