layoutlmv3-fintuned-funsd

This model is a fine-tuned version of microsoft/layoutlmv3-base on the funsd-layoutlmv3 dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7000
  • Precision: 0.9130
  • Recall: 0.9180
  • F1: 0.9155
  • Accuracy: 0.8387
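As a quick consistency check, the reported F1 is the harmonic mean of the reported precision and recall:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Precision and recall reported on the evaluation set
print(round(f1_score(0.9130, 0.9180), 4))  # 0.9155
```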

Model description

More information needed

Intended uses & limitations

More information needed
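The card does not document intended uses, but as a LayoutLMv3 token classifier fine-tuned on FUNSD, the model predicts BIO tags over OCR'd words of a form (HEADER, QUESTION, ANSWER, plus O in the FUNSD schema). A minimal sketch, assuming that label set, of grouping a predicted tag sequence into entity spans:

```python
def bio_to_spans(tags):
    """Group a BIO tag sequence into (entity_type, start, end) spans; end is exclusive.

    Assumes FUNSD-style labels such as B-QUESTION / I-QUESTION / O.
    """
    spans = []
    cur, start = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if cur is not None:          # close the previous span
                spans.append((cur, start, i))
            cur, start = tag[2:], i
        elif tag.startswith("I-") and cur == tag[2:]:
            continue                     # span continues
        else:                            # "O" or a type mismatch ends the span
            if cur is not None:
                spans.append((cur, start, i))
                cur = None
    if cur is not None:                  # flush a span that runs to the end
        spans.append((cur, start, len(tags)))
    return spans

tags = ["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER", "I-ANSWER"]
print(bio_to_spans(tags))  # [('QUESTION', 0, 2), ('ANSWER', 3, 6)]
```

In practice the tags would come from `argmax` over the model's token logits, mapped through `model.config.id2label`, and the spans mapped back to words via the processor's offset/word alignment.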

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 20000
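The hyperparameters above can be expressed as a `TrainingArguments` configuration sketch. The actual training script is not provided with the card, so the `output_dir` is a placeholder and the evaluation cadence is inferred from the results table (metrics logged every 100 steps); the Adam betas and epsilon listed above are the `TrainingArguments` defaults.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training configuration described above.
args = TrainingArguments(
    output_dir="layoutlmv3-fintuned-funsd",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=20_000,
    evaluation_strategy="steps",
    eval_steps=100,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```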

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--:|:--------:|
| No log | 1.33 | 100 | 0.6462 | 0.7683 | 0.8137 | 0.7903 | 0.7808 |
| No log | 2.67 | 200 | 0.4416 | 0.8248 | 0.8887 | 0.8556 | 0.8444 |
| No log | 4.0 | 300 | 0.5276 | 0.8563 | 0.9061 | 0.8805 | 0.8351 |
| No log | 5.33 | 400 | 0.5038 | 0.8319 | 0.8922 | 0.8610 | 0.8317 |
| 0.5539 | 6.67 | 500 | 0.5489 | 0.8864 | 0.9151 | 0.9005 | 0.8583 |
| 0.5539 | 8.0 | 600 | 0.5947 | 0.8646 | 0.9165 | 0.8898 | 0.8383 |
| 0.5539 | 9.33 | 700 | 0.7367 | 0.8796 | 0.8857 | 0.8827 | 0.8362 |
| 0.5539 | 10.67 | 800 | 0.8299 | 0.8911 | 0.9230 | 0.9068 | 0.8350 |
| 0.5539 | 12.0 | 900 | 0.6754 | 0.8934 | 0.9116 | 0.9024 | 0.8465 |
| 0.1203 | 13.33 | 1000 | 0.8242 | 0.8814 | 0.9011 | 0.8912 | 0.8420 |
| 0.1203 | 14.67 | 1100 | 0.9349 | 0.8835 | 0.8857 | 0.8846 | 0.8208 |
| 0.1203 | 16.0 | 1200 | 1.0205 | 0.8853 | 0.8783 | 0.8818 | 0.8131 |
| 0.1203 | 17.33 | 1300 | 0.8790 | 0.8865 | 0.8962 | 0.8913 | 0.8542 |
| 0.1203 | 18.67 | 1400 | 0.9262 | 0.8870 | 0.8967 | 0.8918 | 0.8482 |
| 0.0522 | 20.0 | 1500 | 0.9744 | 0.8979 | 0.8952 | 0.8965 | 0.8323 |
| 0.0522 | 21.33 | 1600 | 1.0198 | 0.8976 | 0.9146 | 0.9060 | 0.8401 |
| 0.0522 | 22.67 | 1700 | 1.0466 | 0.9114 | 0.9146 | 0.9130 | 0.8368 |
| 0.0522 | 24.0 | 1800 | 0.9874 | 0.8944 | 0.9086 | 0.9014 | 0.8462 |
| 0.0522 | 25.33 | 1900 | 1.1240 | 0.9051 | 0.9141 | 0.9095 | 0.8364 |
| 0.0208 | 26.67 | 2000 | 1.0658 | 0.9021 | 0.9205 | 0.9112 | 0.8399 |
| 0.0208 | 28.0 | 2100 | 1.2349 | 0.8934 | 0.9116 | 0.9024 | 0.8227 |
| 0.0208 | 29.33 | 2200 | 1.2906 | 0.8930 | 0.9081 | 0.9005 | 0.8111 |
| 0.0208 | 30.67 | 2300 | 1.2133 | 0.9020 | 0.9096 | 0.9058 | 0.8398 |
| 0.0208 | 32.0 | 2400 | 1.2202 | 0.9055 | 0.9096 | 0.9076 | 0.8375 |
| 0.0154 | 33.33 | 2500 | 1.2454 | 0.8909 | 0.9131 | 0.9019 | 0.8393 |
| 0.0154 | 34.67 | 2600 | 1.2065 | 0.9058 | 0.9126 | 0.9092 | 0.8364 |
| 0.0154 | 36.0 | 2700 | 1.2165 | 0.8998 | 0.9096 | 0.9046 | 0.8284 |
| 0.0154 | 37.33 | 2800 | 1.3195 | 0.8913 | 0.9160 | 0.9035 | 0.8272 |
| 0.0154 | 38.67 | 2900 | 1.3834 | 0.8957 | 0.9086 | 0.9021 | 0.8240 |
| 0.007 | 40.0 | 3000 | 1.3035 | 0.9001 | 0.9086 | 0.9043 | 0.8306 |
| 0.007 | 41.33 | 3100 | 1.3553 | 0.8991 | 0.9121 | 0.9055 | 0.8241 |
| 0.007 | 42.67 | 3200 | 1.3227 | 0.9054 | 0.9036 | 0.9045 | 0.8343 |
| 0.007 | 44.0 | 3300 | 1.2750 | 0.9262 | 0.9170 | 0.9216 | 0.8460 |
| 0.007 | 45.33 | 3400 | 1.2563 | 0.8980 | 0.9101 | 0.9040 | 0.8408 |
| 0.0035 | 46.67 | 3500 | 1.2013 | 0.9015 | 0.9136 | 0.9075 | 0.8383 |
| 0.0035 | 48.0 | 3600 | 1.2035 | 0.8997 | 0.9180 | 0.9088 | 0.8487 |
| 0.0035 | 49.33 | 3700 | 1.3997 | 0.9206 | 0.9041 | 0.9123 | 0.8312 |
| 0.0035 | 50.67 | 3800 | 1.3818 | 0.9117 | 0.9180 | 0.9149 | 0.8337 |
| 0.0035 | 52.0 | 3900 | 1.3568 | 0.9045 | 0.9086 | 0.9066 | 0.8410 |
| 0.0057 | 53.33 | 4000 | 1.3255 | 0.8935 | 0.9170 | 0.9051 | 0.8313 |
| 0.0057 | 54.67 | 4100 | 1.2954 | 0.8950 | 0.9151 | 0.9049 | 0.8343 |
| 0.0057 | 56.0 | 4200 | 1.4201 | 0.9028 | 0.9141 | 0.9084 | 0.8234 |
| 0.0057 | 57.33 | 4300 | 1.3858 | 0.9016 | 0.9061 | 0.9039 | 0.8238 |
| 0.0057 | 58.67 | 4400 | 1.3748 | 0.8976 | 0.9056 | 0.9016 | 0.8286 |
| 0.0155 | 60.0 | 4500 | 1.4346 | 0.9024 | 0.9051 | 0.9038 | 0.8274 |
| 0.0155 | 61.33 | 4600 | 1.4059 | 0.9073 | 0.9136 | 0.9104 | 0.8324 |
| 0.0155 | 62.67 | 4700 | 1.3779 | 0.9076 | 0.9121 | 0.9098 | 0.8358 |
| 0.0155 | 64.0 | 4800 | 1.3785 | 0.9064 | 0.9235 | 0.9149 | 0.8389 |
| 0.0155 | 65.33 | 4900 | 1.3995 | 0.9014 | 0.9220 | 0.9116 | 0.8190 |
| 0.0043 | 66.67 | 5000 | 1.2618 | 0.9087 | 0.9101 | 0.9094 | 0.8329 |
| 0.0043 | 68.0 | 5100 | 1.2878 | 0.9093 | 0.9165 | 0.9129 | 0.8526 |
| 0.0043 | 69.33 | 5200 | 1.4384 | 0.9063 | 0.9126 | 0.9094 | 0.8372 |
| 0.0043 | 70.67 | 5300 | 1.5029 | 0.9093 | 0.9165 | 0.9129 | 0.8347 |
| 0.0043 | 72.0 | 5400 | 1.4592 | 0.9107 | 0.9071 | 0.9089 | 0.8345 |
| 0.0045 | 73.33 | 5500 | 1.4604 | 0.9099 | 0.9136 | 0.9118 | 0.8263 |
| 0.0045 | 74.67 | 5600 | 1.5632 | 0.8933 | 0.9066 | 0.8999 | 0.8162 |
| 0.0045 | 76.0 | 5700 | 1.5839 | 0.9114 | 0.9096 | 0.9105 | 0.8335 |
| 0.0045 | 77.33 | 5800 | 1.5557 | 0.9099 | 0.9081 | 0.9090 | 0.8392 |
| 0.0045 | 78.67 | 5900 | 1.4348 | 0.9024 | 0.9051 | 0.9038 | 0.8266 |
| 0.0089 | 80.0 | 6000 | 1.2747 | 0.9026 | 0.9160 | 0.9093 | 0.8429 |
| 0.0089 | 81.33 | 6100 | 1.3560 | 0.8963 | 0.9230 | 0.9094 | 0.8376 |
| 0.0089 | 82.67 | 6200 | 1.2859 | 0.8987 | 0.9165 | 0.9075 | 0.8480 |
| 0.0089 | 84.0 | 6300 | 1.3389 | 0.9032 | 0.9220 | 0.9125 | 0.8315 |
| 0.0089 | 85.33 | 6400 | 1.3922 | 0.8922 | 0.9126 | 0.9023 | 0.8413 |
| 0.0045 | 86.67 | 6500 | 1.3723 | 0.9003 | 0.9066 | 0.9035 | 0.8353 |
| 0.0045 | 88.0 | 6600 | 1.3220 | 0.9011 | 0.9141 | 0.9075 | 0.8441 |
| 0.0045 | 89.33 | 6700 | 1.4433 | 0.9063 | 0.9126 | 0.9094 | 0.8213 |
| 0.0045 | 90.67 | 6800 | 1.5350 | 0.8977 | 0.9195 | 0.9085 | 0.8266 |
| 0.0045 | 92.0 | 6900 | 1.3681 | 0.9115 | 0.9215 | 0.9165 | 0.8467 |
| 0.002 | 93.33 | 7000 | 1.3141 | 0.8981 | 0.9240 | 0.9109 | 0.8398 |
| 0.002 | 94.67 | 7100 | 1.3836 | 0.9069 | 0.9146 | 0.9107 | 0.8310 |
| 0.002 | 96.0 | 7200 | 1.4456 | 0.8995 | 0.9155 | 0.9074 | 0.8310 |
| 0.002 | 97.33 | 7300 | 1.4218 | 0.9018 | 0.9031 | 0.9025 | 0.8304 |
| 0.002 | 98.67 | 7400 | 1.5428 | 0.8958 | 0.9225 | 0.9090 | 0.8252 |
| 0.0016 | 100.0 | 7500 | 1.4982 | 0.8974 | 0.9215 | 0.9093 | 0.8280 |
| 0.0016 | 101.33 | 7600 | 1.4787 | 0.9025 | 0.9240 | 0.9131 | 0.8330 |
| 0.0016 | 102.67 | 7700 | 1.5694 | 0.8982 | 0.9026 | 0.9004 | 0.8148 |
| 0.0016 | 104.0 | 7800 | 1.4361 | 0.8985 | 0.9146 | 0.9065 | 0.8244 |
| 0.0016 | 105.33 | 7900 | 1.5643 | 0.8912 | 0.9280 | 0.9092 | 0.8265 |
| 0.0065 | 106.67 | 8000 | 1.5890 | 0.9017 | 0.9155 | 0.9086 | 0.8269 |
| 0.0065 | 108.0 | 8100 | 1.5755 | 0.8901 | 0.9170 | 0.9034 | 0.8209 |
| 0.0065 | 109.33 | 8200 | 1.7716 | 0.9006 | 0.9051 | 0.9029 | 0.8105 |
| 0.0065 | 110.67 | 8300 | 1.6814 | 0.8973 | 0.9160 | 0.9066 | 0.8168 |
| 0.0065 | 112.0 | 8400 | 1.5316 | 0.9002 | 0.9230 | 0.9115 | 0.8199 |
| 0.0012 | 113.33 | 8500 | 1.5376 | 0.9041 | 0.9136 | 0.9088 | 0.8338 |
| 0.0012 | 114.67 | 8600 | 1.5773 | 0.9085 | 0.9175 | 0.9130 | 0.8334 |
| 0.0012 | 116.0 | 8700 | 1.5998 | 0.9050 | 0.9086 | 0.9068 | 0.8380 |
| 0.0012 | 117.33 | 8800 | 1.6401 | 0.8985 | 0.9101 | 0.9042 | 0.8298 |
| 0.0012 | 118.67 | 8900 | 1.6894 | 0.9055 | 0.8897 | 0.8975 | 0.8248 |
| 0.0011 | 120.0 | 9000 | 1.6945 | 0.9004 | 0.9031 | 0.9018 | 0.8275 |
| 0.0011 | 121.33 | 9100 | 1.5659 | 0.9038 | 0.9240 | 0.9138 | 0.8332 |
| 0.0011 | 122.67 | 9200 | 1.5270 | 0.8947 | 0.9240 | 0.9091 | 0.8358 |
| 0.0011 | 124.0 | 9300 | 1.5225 | 0.9081 | 0.9230 | 0.9155 | 0.8304 |
| 0.0011 | 125.33 | 9400 | 1.6064 | 0.8906 | 0.9141 | 0.9022 | 0.8232 |
| 0.0035 | 126.67 | 9500 | 1.5898 | 0.9034 | 0.9240 | 0.9136 | 0.8294 |
| 0.0035 | 128.0 | 9600 | 1.5404 | 0.8949 | 0.9225 | 0.9085 | 0.8336 |
| 0.0035 | 129.33 | 9700 | 1.4890 | 0.9074 | 0.9250 | 0.9161 | 0.8460 |
| 0.0035 | 130.67 | 9800 | 1.5620 | 0.9049 | 0.9175 | 0.9112 | 0.8315 |
| 0.0035 | 132.0 | 9900 | 1.5565 | 0.9050 | 0.9180 | 0.9115 | 0.8279 |
| 0.0014 | 133.33 | 10000 | 1.5553 | 0.8989 | 0.9230 | 0.9108 | 0.8424 |
| 0.0014 | 134.67 | 10100 | 1.5287 | 0.9060 | 0.9195 | 0.9127 | 0.8356 |
| 0.0014 | 136.0 | 10200 | 1.5282 | 0.9109 | 0.9146 | 0.9127 | 0.8398 |
| 0.0014 | 137.33 | 10300 | 1.5280 | 0.9073 | 0.9141 | 0.9107 | 0.8437 |
| 0.0014 | 138.67 | 10400 | 1.5719 | 0.9092 | 0.9151 | 0.9121 | 0.8387 |
| 0.0035 | 140.0 | 10500 | 1.5059 | 0.9074 | 0.9155 | 0.9115 | 0.8426 |
| 0.0035 | 141.33 | 10600 | 1.5702 | 0.9013 | 0.9250 | 0.9130 | 0.8355 |
| 0.0035 | 142.67 | 10700 | 1.5080 | 0.9035 | 0.9260 | 0.9146 | 0.8455 |
| 0.0035 | 144.0 | 10800 | 1.4643 | 0.9097 | 0.9255 | 0.9175 | 0.8467 |
| 0.0035 | 145.33 | 10900 | 1.5316 | 0.9037 | 0.9230 | 0.9132 | 0.8387 |
| 0.0011 | 146.67 | 11000 | 1.5314 | 0.9114 | 0.9195 | 0.9154 | 0.8392 |
| 0.0011 | 148.0 | 11100 | 1.4988 | 0.9114 | 0.9200 | 0.9157 | 0.8493 |
| 0.0011 | 149.33 | 11200 | 1.4546 | 0.9121 | 0.9275 | 0.9197 | 0.8538 |
| 0.0011 | 150.67 | 11300 | 1.5075 | 0.9062 | 0.9170 | 0.9116 | 0.8456 |
| 0.0011 | 152.0 | 11400 | 1.4556 | 0.8973 | 0.9076 | 0.9024 | 0.8393 |
| 0.0009 | 153.33 | 11500 | 1.5058 | 0.8911 | 0.9185 | 0.9046 | 0.8272 |
| 0.0009 | 154.67 | 11600 | 1.5903 | 0.9197 | 0.9101 | 0.9149 | 0.8318 |
| 0.0009 | 156.0 | 11700 | 1.5263 | 0.9164 | 0.9146 | 0.9155 | 0.8413 |
| 0.0009 | 157.33 | 11800 | 1.5729 | 0.9129 | 0.9160 | 0.9145 | 0.8386 |
| 0.0009 | 158.67 | 11900 | 1.5880 | 0.9086 | 0.9131 | 0.9108 | 0.8398 |
| 0.0009 | 160.0 | 12000 | 1.5907 | 0.9090 | 0.9126 | 0.9108 | 0.8399 |
| 0.0009 | 161.33 | 12100 | 1.5714 | 0.9111 | 0.9111 | 0.9111 | 0.8373 |
| 0.0009 | 162.67 | 12200 | 1.5848 | 0.9135 | 0.9126 | 0.9130 | 0.8378 |
| 0.0009 | 164.0 | 12300 | 1.5816 | 0.9112 | 0.9175 | 0.9144 | 0.8405 |
| 0.0009 | 165.33 | 12400 | 1.5425 | 0.9080 | 0.9121 | 0.9100 | 0.8386 |
| 0.0 | 166.67 | 12500 | 1.5837 | 0.9046 | 0.9136 | 0.9090 | 0.8362 |
| 0.0 | 168.0 | 12600 | 1.6781 | 0.9025 | 0.9195 | 0.9109 | 0.8290 |
| 0.0 | 169.33 | 12700 | 1.6219 | 0.9028 | 0.9185 | 0.9106 | 0.8326 |
| 0.0 | 170.67 | 12800 | 1.5786 | 0.9076 | 0.9126 | 0.9101 | 0.8380 |
| 0.0 | 172.0 | 12900 | 1.6212 | 0.9020 | 0.9146 | 0.9082 | 0.8322 |
| 0.0018 | 173.33 | 13000 | 1.6451 | 0.9086 | 0.9141 | 0.9113 | 0.8315 |
| 0.0018 | 174.67 | 13100 | 1.6730 | 0.9064 | 0.9185 | 0.9124 | 0.8293 |
| 0.0018 | 176.0 | 13200 | 1.6106 | 0.9026 | 0.9071 | 0.9049 | 0.8354 |
| 0.0018 | 177.33 | 13300 | 1.6403 | 0.9081 | 0.9180 | 0.9130 | 0.8402 |
| 0.0018 | 178.67 | 13400 | 1.6343 | 0.9043 | 0.9200 | 0.9121 | 0.8361 |
| 0.0012 | 180.0 | 13500 | 1.5853 | 0.9096 | 0.9195 | 0.9145 | 0.8431 |
| 0.0012 | 181.33 | 13600 | 1.5859 | 0.9101 | 0.9205 | 0.9153 | 0.8432 |
| 0.0012 | 182.67 | 13700 | 1.6137 | 0.9071 | 0.9215 | 0.9142 | 0.8394 |
| 0.0012 | 184.0 | 13800 | 1.6416 | 0.9002 | 0.9185 | 0.9093 | 0.8299 |
| 0.0012 | 185.33 | 13900 | 1.5497 | 0.9085 | 0.9126 | 0.9105 | 0.8457 |
| 0.0002 | 186.67 | 14000 | 1.6534 | 0.9015 | 0.9141 | 0.9077 | 0.8322 |
| 0.0002 | 188.0 | 14100 | 1.6003 | 0.9044 | 0.9116 | 0.9080 | 0.8290 |
| 0.0002 | 189.33 | 14200 | 1.5269 | 0.9046 | 0.9185 | 0.9115 | 0.8443 |
| 0.0002 | 190.67 | 14300 | 1.5977 | 0.9069 | 0.9141 | 0.9104 | 0.8360 |
| 0.0002 | 192.0 | 14400 | 1.5968 | 0.9090 | 0.9131 | 0.9110 | 0.8374 |
| 0.0004 | 193.33 | 14500 | 1.5945 | 0.9090 | 0.9131 | 0.9110 | 0.8376 |
| 0.0004 | 194.67 | 14600 | 1.6041 | 0.9117 | 0.9126 | 0.9121 | 0.8388 |
| 0.0004 | 196.0 | 14700 | 1.6038 | 0.9071 | 0.9121 | 0.9096 | 0.8362 |
| 0.0004 | 197.33 | 14800 | 1.6108 | 0.9059 | 0.9131 | 0.9095 | 0.8347 |
| 0.0004 | 198.67 | 14900 | 1.5873 | 0.9087 | 0.9151 | 0.9119 | 0.8393 |
| 0.0 | 200.0 | 15000 | 1.6047 | 0.9042 | 0.9146 | 0.9094 | 0.8417 |
| 0.0 | 201.33 | 15100 | 1.6125 | 0.9017 | 0.9155 | 0.9086 | 0.8348 |
| 0.0 | 202.67 | 15200 | 1.6191 | 0.9039 | 0.9155 | 0.9097 | 0.8360 |
| 0.0 | 204.0 | 15300 | 1.6626 | 0.9095 | 0.9131 | 0.9113 | 0.8320 |
| 0.0 | 205.33 | 15400 | 1.5967 | 0.9051 | 0.9190 | 0.9120 | 0.8427 |
| 0.0003 | 206.67 | 15500 | 1.5989 | 0.8982 | 0.9116 | 0.9048 | 0.8318 |
| 0.0003 | 208.0 | 15600 | 1.5990 | 0.8995 | 0.9116 | 0.9055 | 0.8307 |
| 0.0003 | 209.33 | 15700 | 1.6338 | 0.9043 | 0.9111 | 0.9077 | 0.8325 |
| 0.0003 | 210.67 | 15800 | 1.6390 | 0.9034 | 0.9101 | 0.9067 | 0.8329 |
| 0.0003 | 212.0 | 15900 | 1.6372 | 0.9015 | 0.9096 | 0.9055 | 0.8368 |
| 0.0001 | 213.33 | 16000 | 1.6020 | 0.9045 | 0.9131 | 0.9088 | 0.8378 |
| 0.0001 | 214.67 | 16100 | 1.5761 | 0.9071 | 0.9170 | 0.9121 | 0.8397 |
| 0.0001 | 216.0 | 16200 | 1.6536 | 0.9010 | 0.9131 | 0.9070 | 0.8293 |
| 0.0001 | 217.33 | 16300 | 1.6549 | 0.9023 | 0.9131 | 0.9077 | 0.8290 |
| 0.0001 | 218.67 | 16400 | 1.6737 | 0.8948 | 0.9081 | 0.9014 | 0.8292 |
| 0.0002 | 220.0 | 16500 | 1.6918 | 0.9106 | 0.9155 | 0.9131 | 0.8402 |
| 0.0002 | 221.33 | 16600 | 1.6726 | 0.9102 | 0.9165 | 0.9134 | 0.8379 |
| 0.0002 | 222.67 | 16700 | 1.6962 | 0.9121 | 0.9175 | 0.9148 | 0.8369 |
| 0.0002 | 224.0 | 16800 | 1.6974 | 0.9038 | 0.9146 | 0.9091 | 0.8367 |
| 0.0002 | 225.33 | 16900 | 1.7147 | 0.9126 | 0.9185 | 0.9156 | 0.8376 |
| 0.0006 | 226.67 | 17000 | 1.7000 | 0.9130 | 0.9180 | 0.9155 | 0.8387 |
| 0.0006 | 228.0 | 17100 | 1.6951 | 0.9083 | 0.9155 | 0.9119 | 0.8374 |
| 0.0006 | 229.33 | 17200 | 1.7014 | 0.9097 | 0.9160 | 0.9129 | 0.8369 |
| 0.0006 | 230.67 | 17300 | 1.7029 | 0.9102 | 0.9165 | 0.9134 | 0.8369 |
| 0.0006 | 232.0 | 17400 | 1.7039 | 0.9112 | 0.9170 | 0.9141 | 0.8374 |
| 0.0 | 233.33 | 17500 | 1.6516 | 0.9157 | 0.9116 | 0.9136 | 0.8355 |
| 0.0 | 234.67 | 17600 | 1.6536 | 0.9148 | 0.9126 | 0.9137 | 0.8348 |
| 0.0 | 236.0 | 17700 | 1.6548 | 0.9144 | 0.9131 | 0.9137 | 0.8348 |
| 0.0 | 237.33 | 17800 | 1.7110 | 0.9068 | 0.9185 | 0.9126 | 0.8360 |
| 0.0 | 238.67 | 17900 | 1.7115 | 0.9073 | 0.9185 | 0.9129 | 0.8362 |
| 0.0 | 240.0 | 18000 | 1.7124 | 0.9054 | 0.9180 | 0.9117 | 0.8362 |
| 0.0 | 241.33 | 18100 | 1.7146 | 0.9072 | 0.9175 | 0.9123 | 0.8376 |
| 0.0 | 242.67 | 18200 | 1.7217 | 0.9100 | 0.9141 | 0.9120 | 0.8317 |
| 0.0 | 244.0 | 18300 | 1.7225 | 0.9096 | 0.9146 | 0.9121 | 0.8315 |
| 0.0 | 245.33 | 18400 | 1.7159 | 0.9070 | 0.9155 | 0.9112 | 0.8323 |
| 0.0001 | 246.67 | 18500 | 1.7164 | 0.9074 | 0.9155 | 0.9115 | 0.8322 |
| 0.0001 | 248.0 | 18600 | 1.6927 | 0.9009 | 0.9165 | 0.9086 | 0.8326 |
| 0.0001 | 249.33 | 18700 | 1.6767 | 0.9034 | 0.9155 | 0.9094 | 0.8335 |
| 0.0001 | 250.67 | 18800 | 1.6773 | 0.9034 | 0.9155 | 0.9094 | 0.8335 |
| 0.0001 | 252.0 | 18900 | 1.6885 | 0.9029 | 0.9151 | 0.9090 | 0.8334 |
| 0.0002 | 253.33 | 19000 | 1.7032 | 0.9053 | 0.9165 | 0.9109 | 0.8312 |
| 0.0002 | 254.67 | 19100 | 1.7036 | 0.9057 | 0.9160 | 0.9108 | 0.8307 |
| 0.0002 | 256.0 | 19200 | 1.7041 | 0.9053 | 0.9160 | 0.9106 | 0.8310 |
| 0.0002 | 257.33 | 19300 | 1.7045 | 0.9053 | 0.9160 | 0.9106 | 0.8310 |
| 0.0002 | 258.67 | 19400 | 1.7049 | 0.9053 | 0.9160 | 0.9106 | 0.8310 |
| 0.0 | 260.0 | 19500 | 1.7069 | 0.9057 | 0.9165 | 0.9111 | 0.8310 |
| 0.0 | 261.33 | 19600 | 1.7062 | 0.9076 | 0.9170 | 0.9123 | 0.8312 |
| 0.0 | 262.67 | 19700 | 1.7071 | 0.9071 | 0.9170 | 0.9121 | 0.8312 |
| 0.0 | 264.0 | 19800 | 1.7083 | 0.9067 | 0.9170 | 0.9118 | 0.8313 |
| 0.0 | 265.33 | 19900 | 1.7084 | 0.9058 | 0.9170 | 0.9114 | 0.8316 |
| 0.0 | 266.67 | 20000 | 1.7086 | 0.9058 | 0.9170 | 0.9114 | 0.8316 |
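The fractional epoch values in the table are consistent with FUNSD's 149 training forms at batch size 2 (assuming no gradient accumulation): roughly 75 optimizer steps per epoch, so each 100-step evaluation interval advances about 1.33 epochs, and 20000 steps reach about epoch 266.67.

```python
import math

train_forms = 149  # FUNSD training split size (assumed from the dataset)
batch_size = 2
steps_per_epoch = math.ceil(train_forms / batch_size)

print(steps_per_epoch)                      # 75
print(round(100 / steps_per_epoch, 2))      # 1.33
print(round(20_000 / steps_per_epoch, 2))   # 266.67
```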

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.14.5
  • Tokenizers 0.14.1