---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:

- Loss: 0.6746
- Answer: precision 0.7058, recall 0.8183, F1 0.7579 (support: 809)
- Header: precision 0.3509, recall 0.3361, F1 0.3433 (support: 119)
- Question: precision 0.7793, recall 0.8357, F1 0.8065 (support: 1065)
- Overall precision: 0.7256
- Overall recall: 0.7988
- Overall F1: 0.7604
- Overall accuracy: 0.8085
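
As a quick sanity check (not part of the original card), the overall F1 is the harmonic mean of the overall precision and recall; recomputing it from the rounded values above reproduces the reported figure to four decimal places:

```python
# Reported (rounded) overall precision and recall from the evaluation set.
precision = 0.7256
recall = 0.7988

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 4))  # 0.7604, matching the reported Overall F1
```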

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
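
With the `linear` scheduler, the learning rate decays from its initial value to zero over the full run, which here is 150 optimizer steps (15 epochs × 10 steps per epoch, per the results table). A minimal sketch of that decay, assuming no warmup steps (the card does not mention any):

```python
def linear_lr(step, base_lr=3e-05, total_steps=150):
    """Linear decay from base_lr at step 0 to zero at total_steps (no warmup)."""
    remaining = max(0, total_steps - step)
    return base_lr * (remaining / total_steps)

print(linear_lr(0))    # base_lr at the start of training
print(linear_lr(75))   # half of base_lr midway through
print(linear_lr(150))  # zero at the final step
```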

### Training results

Per-category columns report precision / recall / F1 on supports of 809 (Answer), 119 (Header), and 1065 (Question).

| Training Loss | Epoch | Step | Validation Loss | Answer P / R / F1 | Header P / R / F1 | Question P / R / F1 | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------------:|:------------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.8024        | 1.0   | 10   | 1.6086          | 0.0099 / 0.0074 / 0.0085 | 0.0 / 0.0 / 0.0          | 0.2063 / 0.1239 / 0.1548 | 0.1108            | 0.0692         | 0.0852     | 0.3458           |
| 1.4593        | 2.0   | 20   | 1.2405          | 0.1325 / 0.1446 / 0.1383 | 0.0 / 0.0 / 0.0          | 0.4201 / 0.5502 / 0.4764 | 0.3086            | 0.3527         | 0.3292     | 0.5822           |
| 1.1064        | 3.0   | 30   | 0.9251          | 0.4621 / 0.5810 / 0.5148 | 0.0 / 0.0 / 0.0          | 0.5503 / 0.7239 / 0.6253 | 0.5111            | 0.6227         | 0.5614     | 0.7118           |
| 0.8548        | 4.0   | 40   | 0.7690          | 0.5675 / 0.7219 / 0.6355 | 0.0256 / 0.0084 / 0.0127 | 0.6505 / 0.7408 / 0.6927 | 0.6024            | 0.6894         | 0.6430     | 0.7602           |
| 0.6855        | 5.0   | 50   | 0.7230          | 0.6311 / 0.7083 / 0.6674 | 0.2055 / 0.1261 / 0.1562 | 0.6593 / 0.8103 / 0.7270 | 0.6336            | 0.7280         | 0.6776     | 0.7785           |
| 0.5838        | 6.0   | 60   | 0.6791          | 0.6316 / 0.8096 / 0.7096 | 0.2500 / 0.1513 / 0.1885 | 0.7357 / 0.7944 / 0.7639 | 0.6724            | 0.7622         | 0.7145     | 0.7886           |
| 0.499         | 7.0   | 70   | 0.6482          | 0.6723 / 0.7911 / 0.7269 | 0.3061 / 0.2521 / 0.2765 | 0.7368 / 0.8122 / 0.7727 | 0.6902            | 0.7702         | 0.7280     | 0.8001           |
| 0.4429        | 8.0   | 80   | 0.6642          | 0.6597 / 0.8146 / 0.7290 | 0.2661 / 0.2437 / 0.2544 | 0.7445 / 0.8291 / 0.7845 | 0.6848            | 0.7883         | 0.7329     | 0.7998           |
| 0.387         | 9.0   | 90   | 0.6536          | 0.6892 / 0.8059 / 0.7430 | 0.3269 / 0.2857 / 0.3049 | 0.7579 / 0.8319 / 0.7932 | 0.7084            | 0.7888         | 0.7464     | 0.8018           |
| 0.3798        | 10.0  | 100  | 0.6564          | 0.6893 / 0.8146 / 0.7467 | 0.3056 / 0.2773 / 0.2907 | 0.7617 / 0.8282 / 0.7935 | 0.7084            | 0.7898         | 0.7469     | 0.8132           |
| 0.3185        | 11.0  | 110  | 0.6684          | 0.6907 / 0.8171 / 0.7486 | 0.3231 / 0.3529 / 0.3373 | 0.7612 / 0.8319 / 0.7950 | 0.7059            | 0.7973         | 0.7488     | 0.8018           |
| 0.3035        | 12.0  | 120  | 0.6603          | 0.6999 / 0.8072 / 0.7497 | 0.3363 / 0.3193 / 0.3276 | 0.7689 / 0.8310 / 0.7987 | 0.7173            | 0.7908         | 0.7523     | 0.8129           |
| 0.2848        | 13.0  | 130  | 0.6748          | 0.6957 / 0.8195 / 0.7526 | 0.3475 / 0.3445 / 0.3460 | 0.7705 / 0.8291 / 0.7987 | 0.7158            | 0.7963         | 0.7539     | 0.8063           |
| 0.2628        | 14.0  | 140  | 0.6744          | 0.7089 / 0.8158 / 0.7586 | 0.3590 / 0.3529 / 0.3559 | 0.7740 / 0.8329 / 0.8024 | 0.7242            | 0.7973         | 0.7590     | 0.8092           |
| 0.262         | 15.0  | 150  | 0.6746          | 0.7058 / 0.8183 / 0.7579 | 0.3509 / 0.3361 / 0.3433 | 0.7793 / 0.8357 / 0.8065 | 0.7256            | 0.7988         | 0.7604     | 0.8085           |
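
As an aside (not part of the original card): the overall recall is the support-weighted (micro) average of the per-category recalls, which can be verified from the final-epoch figures. Overall precision cannot be recomputed the same way, since that would require the per-category predicted counts rather than the gold supports:

```python
# Final-epoch per-category recall and support ("number") from the table above.
categories = {
    "answer": {"recall": 0.8182941903584673, "number": 809},
    "header": {"recall": 0.33613445378151263, "number": 119},
    "question": {"recall": 0.8356807511737089, "number": 1065},
}

# Micro-averaged recall: total true positives over total gold entities.
true_positives = sum(c["recall"] * c["number"] for c in categories.values())
total_entities = sum(c["number"] for c in categories.values())

print(round(true_positives / total_entities, 4))  # 0.7988, matching Overall Recall
```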

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1