# layoutlm-captive-corp-7
This model is a fine-tuned version of microsoft/layoutlmv3-base on the layoutlmv3 dataset. It achieves the following results on the evaluation set:
- Loss: 0.7263
- Precision: 0.8352
- Recall: 0.8352
- F1: 0.8352
- Accuracy: 0.8812
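Since this is a LayoutLMv3 token-classification checkpoint, it can be loaded with the standard Transformers Auto classes. The sketch below is an illustrative usage example, not part of the published card: `document.png` is a placeholder input, and OCR via `apply_ocr=True` assumes `pytesseract` is installed (the exact preprocessing used for this model is not documented).

```python
from transformers import AutoProcessor, AutoModelForTokenClassification
from PIL import Image

# Placeholder document image; replace with your own scan or photo.
image = Image.open("document.png").convert("RGB")

# apply_ocr=True lets the processor run Tesseract OCR to extract
# words and bounding boxes (requires pytesseract to be installed).
processor = AutoProcessor.from_pretrained(
    "jfrish/layoutlm-captive-corp-7", apply_ocr=True
)
model = AutoModelForTokenClassification.from_pretrained(
    "jfrish/layoutlm-captive-corp-7"
)

# Encode the image (words + boxes) and predict a label per token.
encoding = processor(image, return_tensors="pt")
outputs = model(**encoding)
predicted_ids = outputs.logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[i] for i in predicted_ids]
print(list(zip(encoding.tokens(), labels)))
```

Note that this downloads the checkpoint from the Hub on first use, so it requires network access.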
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
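The hyperparameters above map directly onto a Transformers `TrainingArguments` configuration. This is a reconstruction for illustration only: `output_dir` is a placeholder, and the actual `Trainer` setup (dataset collation, metric computation) used for this run is not published.

```python
from transformers import TrainingArguments

# Sketch of the reported training configuration.
training_args = TrainingArguments(
    output_dir="layoutlm-captive-corp-7",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # native AMP mixed-precision training (requires a CUDA device)
)
```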
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--:|:--------:|
0.3675 | 1.0 | 4 | 0.9283 | 0.7847 | 0.8052 | 0.7948 | 0.8423 |
0.3555 | 2.0 | 8 | 0.9010 | 0.7897 | 0.8015 | 0.7955 | 0.8510 |
0.3296 | 3.0 | 12 | 0.8976 | 0.7941 | 0.8090 | 0.8015 | 0.8402 |
0.3007 | 4.0 | 16 | 0.8562 | 0.7955 | 0.8015 | 0.7985 | 0.8488 |
0.2795 | 5.0 | 20 | 0.8642 | 0.7889 | 0.7978 | 0.7933 | 0.8467 |
0.2604 | 6.0 | 24 | 0.8300 | 0.7810 | 0.8015 | 0.7911 | 0.8531 |
0.2464 | 7.0 | 28 | 0.8305 | 0.7904 | 0.8052 | 0.7978 | 0.8488 |
0.2295 | 8.0 | 32 | 0.8205 | 0.7868 | 0.8015 | 0.7941 | 0.8423 |
0.2127 | 9.0 | 36 | 0.8065 | 0.7897 | 0.8015 | 0.7955 | 0.8488 |
0.1961 | 10.0 | 40 | 0.7998 | 0.8060 | 0.8090 | 0.8075 | 0.8531 |
0.185 | 11.0 | 44 | 0.7846 | 0.7941 | 0.8090 | 0.8015 | 0.8553 |
0.1756 | 12.0 | 48 | 0.7772 | 0.8015 | 0.8165 | 0.8089 | 0.8575 |
0.1636 | 13.0 | 52 | 0.7772 | 0.8118 | 0.8240 | 0.8178 | 0.8618 |
0.1567 | 14.0 | 56 | 0.7785 | 0.7978 | 0.8127 | 0.8052 | 0.8575 |
0.1474 | 15.0 | 60 | 0.7634 | 0.7941 | 0.8090 | 0.8015 | 0.8618 |
0.1383 | 16.0 | 64 | 0.7435 | 0.8007 | 0.8127 | 0.8067 | 0.8618 |
0.1313 | 17.0 | 68 | 0.7465 | 0.7978 | 0.8127 | 0.8052 | 0.8618 |
0.1231 | 18.0 | 72 | 0.7526 | 0.8051 | 0.8202 | 0.8126 | 0.8661 |
0.1197 | 19.0 | 76 | 0.7386 | 0.8015 | 0.8165 | 0.8089 | 0.8683 |
0.1143 | 20.0 | 80 | 0.7419 | 0.8044 | 0.8165 | 0.8104 | 0.8661 |
0.108 | 21.0 | 84 | 0.7442 | 0.8037 | 0.8127 | 0.8082 | 0.8575 |
0.1025 | 22.0 | 88 | 0.7415 | 0.8044 | 0.8165 | 0.8104 | 0.8618 |
0.0994 | 23.0 | 92 | 0.7392 | 0.8015 | 0.8165 | 0.8089 | 0.8639 |
0.0941 | 24.0 | 96 | 0.7372 | 0.7985 | 0.8165 | 0.8074 | 0.8639 |
0.0905 | 25.0 | 100 | 0.7424 | 0.8044 | 0.8165 | 0.8104 | 0.8639 |
0.0879 | 26.0 | 104 | 0.7349 | 0.8007 | 0.8127 | 0.8067 | 0.8618 |
0.0842 | 27.0 | 108 | 0.7296 | 0.7978 | 0.8127 | 0.8052 | 0.8639 |
0.0805 | 28.0 | 112 | 0.7339 | 0.7970 | 0.8090 | 0.8030 | 0.8596 |
0.0785 | 29.0 | 116 | 0.7405 | 0.8 | 0.8090 | 0.8045 | 0.8553 |
0.0759 | 30.0 | 120 | 0.7424 | 0.7970 | 0.8090 | 0.8030 | 0.8596 |
0.0726 | 31.0 | 124 | 0.7329 | 0.8044 | 0.8165 | 0.8104 | 0.8596 |
0.0703 | 32.0 | 128 | 0.7289 | 0.8015 | 0.8165 | 0.8089 | 0.8618 |
0.0681 | 33.0 | 132 | 0.7204 | 0.8022 | 0.8202 | 0.8111 | 0.8661 |
0.0663 | 34.0 | 136 | 0.7168 | 0.8015 | 0.8165 | 0.8089 | 0.8639 |
0.0619 | 35.0 | 140 | 0.7244 | 0.7978 | 0.8127 | 0.8052 | 0.8661 |
0.0614 | 36.0 | 144 | 0.7360 | 0.7978 | 0.8127 | 0.8052 | 0.8618 |
0.0594 | 37.0 | 148 | 0.7306 | 0.8044 | 0.8165 | 0.8104 | 0.8618 |
0.0586 | 38.0 | 152 | 0.7177 | 0.8081 | 0.8202 | 0.8141 | 0.8639 |
0.0562 | 39.0 | 156 | 0.7133 | 0.8088 | 0.8240 | 0.8163 | 0.8704 |
0.0557 | 40.0 | 160 | 0.7229 | 0.7978 | 0.8127 | 0.8052 | 0.8639 |
0.0558 | 41.0 | 164 | 0.7244 | 0.8044 | 0.8165 | 0.8104 | 0.8661 |
0.0513 | 42.0 | 168 | 0.7180 | 0.8044 | 0.8165 | 0.8104 | 0.8704 |
0.0515 | 43.0 | 172 | 0.7166 | 0.8007 | 0.8127 | 0.8067 | 0.8661 |
0.0496 | 44.0 | 176 | 0.7186 | 0.8104 | 0.8165 | 0.8134 | 0.8683 |
0.049 | 45.0 | 180 | 0.7165 | 0.8111 | 0.8202 | 0.8156 | 0.8661 |
0.0488 | 46.0 | 184 | 0.7139 | 0.8111 | 0.8202 | 0.8156 | 0.8683 |
0.0463 | 47.0 | 188 | 0.7199 | 0.8015 | 0.8165 | 0.8089 | 0.8639 |
0.0449 | 48.0 | 192 | 0.7257 | 0.8015 | 0.8165 | 0.8089 | 0.8618 |
0.0452 | 49.0 | 196 | 0.7231 | 0.8015 | 0.8165 | 0.8089 | 0.8618 |
0.0437 | 50.0 | 200 | 0.7161 | 0.8081 | 0.8202 | 0.8141 | 0.8683 |
0.0428 | 51.0 | 204 | 0.7125 | 0.8015 | 0.8165 | 0.8089 | 0.8683 |
0.0423 | 52.0 | 208 | 0.7170 | 0.8104 | 0.8165 | 0.8134 | 0.8661 |
0.0417 | 53.0 | 212 | 0.7242 | 0.8141 | 0.8202 | 0.8172 | 0.8661 |
0.0397 | 54.0 | 216 | 0.7269 | 0.8141 | 0.8202 | 0.8172 | 0.8661 |
0.0399 | 55.0 | 220 | 0.7239 | 0.8044 | 0.8165 | 0.8104 | 0.8661 |
0.0384 | 56.0 | 224 | 0.7246 | 0.8044 | 0.8165 | 0.8104 | 0.8704 |
0.0378 | 57.0 | 228 | 0.7237 | 0.8111 | 0.8202 | 0.8156 | 0.8726 |
0.0367 | 58.0 | 232 | 0.7200 | 0.8111 | 0.8202 | 0.8156 | 0.8726 |
0.037 | 59.0 | 236 | 0.7183 | 0.8111 | 0.8202 | 0.8156 | 0.8726 |
0.0356 | 60.0 | 240 | 0.7187 | 0.7978 | 0.8127 | 0.8052 | 0.8683 |
0.035 | 61.0 | 244 | 0.7161 | 0.8007 | 0.8127 | 0.8067 | 0.8661 |
0.0347 | 62.0 | 248 | 0.7148 | 0.8104 | 0.8165 | 0.8134 | 0.8683 |
0.0339 | 63.0 | 252 | 0.7195 | 0.8141 | 0.8202 | 0.8172 | 0.8704 |
0.0337 | 64.0 | 256 | 0.7236 | 0.8141 | 0.8202 | 0.8172 | 0.8704 |
0.033 | 65.0 | 260 | 0.7208 | 0.8246 | 0.8277 | 0.8262 | 0.8747 |
0.0324 | 66.0 | 264 | 0.7159 | 0.8178 | 0.8240 | 0.8209 | 0.8747 |
0.032 | 67.0 | 268 | 0.7122 | 0.8118 | 0.8240 | 0.8178 | 0.8747 |
0.0315 | 68.0 | 272 | 0.7116 | 0.8229 | 0.8352 | 0.8290 | 0.8790 |
0.0313 | 69.0 | 276 | 0.7154 | 0.8125 | 0.8277 | 0.8200 | 0.8747 |
0.0307 | 70.0 | 280 | 0.7197 | 0.8192 | 0.8315 | 0.8253 | 0.8769 |
0.0306 | 71.0 | 284 | 0.7214 | 0.8192 | 0.8315 | 0.8253 | 0.8769 |
0.0302 | 72.0 | 288 | 0.7223 | 0.8192 | 0.8315 | 0.8253 | 0.8769 |
0.0289 | 73.0 | 292 | 0.7191 | 0.8192 | 0.8315 | 0.8253 | 0.8769 |
0.0292 | 74.0 | 296 | 0.7170 | 0.8185 | 0.8277 | 0.8231 | 0.8769 |
0.0284 | 75.0 | 300 | 0.7172 | 0.8148 | 0.8240 | 0.8194 | 0.8747 |
0.0283 | 76.0 | 304 | 0.7196 | 0.8111 | 0.8202 | 0.8156 | 0.8726 |
0.0277 | 77.0 | 308 | 0.7212 | 0.8044 | 0.8165 | 0.8104 | 0.8704 |
0.0278 | 78.0 | 312 | 0.7230 | 0.8141 | 0.8202 | 0.8172 | 0.8726 |
0.027 | 79.0 | 316 | 0.7223 | 0.8178 | 0.8240 | 0.8209 | 0.8747 |
0.0269 | 80.0 | 320 | 0.7190 | 0.8253 | 0.8315 | 0.8284 | 0.8790 |
0.0267 | 81.0 | 324 | 0.7187 | 0.8253 | 0.8315 | 0.8284 | 0.8790 |
0.0262 | 82.0 | 328 | 0.7224 | 0.8216 | 0.8277 | 0.8246 | 0.8769 |
0.0261 | 83.0 | 332 | 0.7262 | 0.8216 | 0.8277 | 0.8246 | 0.8769 |
0.0259 | 84.0 | 336 | 0.7279 | 0.8141 | 0.8202 | 0.8172 | 0.8726 |
0.0257 | 85.0 | 340 | 0.7274 | 0.8141 | 0.8202 | 0.8172 | 0.8726 |
0.0251 | 86.0 | 344 | 0.7267 | 0.8141 | 0.8202 | 0.8172 | 0.8726 |
0.0252 | 87.0 | 348 | 0.7252 | 0.8253 | 0.8315 | 0.8284 | 0.8790 |
0.024 | 88.0 | 352 | 0.7251 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0242 | 89.0 | 356 | 0.7263 | 0.8352 | 0.8352 | 0.8352 | 0.8812 |
0.0244 | 90.0 | 360 | 0.7268 | 0.8352 | 0.8352 | 0.8352 | 0.8812 |
0.0241 | 91.0 | 364 | 0.7275 | 0.8352 | 0.8352 | 0.8352 | 0.8812 |
0.0233 | 92.0 | 368 | 0.7279 | 0.8315 | 0.8315 | 0.8315 | 0.8790 |
0.0233 | 93.0 | 372 | 0.7301 | 0.8246 | 0.8277 | 0.8262 | 0.8769 |
0.0231 | 94.0 | 376 | 0.7315 | 0.8209 | 0.8240 | 0.8224 | 0.8747 |
0.0234 | 95.0 | 380 | 0.7322 | 0.8209 | 0.8240 | 0.8224 | 0.8747 |
0.023 | 96.0 | 384 | 0.7304 | 0.8209 | 0.8240 | 0.8224 | 0.8747 |
0.0222 | 97.0 | 388 | 0.7270 | 0.8209 | 0.8240 | 0.8224 | 0.8747 |
0.0221 | 98.0 | 392 | 0.7237 | 0.8246 | 0.8277 | 0.8262 | 0.8769 |
0.0223 | 99.0 | 396 | 0.7216 | 0.8246 | 0.8277 | 0.8262 | 0.8769 |
0.0222 | 100.0 | 400 | 0.7210 | 0.8246 | 0.8277 | 0.8262 | 0.8769 |
0.0218 | 101.0 | 404 | 0.7224 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0223 | 102.0 | 408 | 0.7235 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0216 | 103.0 | 412 | 0.7239 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0213 | 104.0 | 416 | 0.7248 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0213 | 105.0 | 420 | 0.7284 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0212 | 106.0 | 424 | 0.7316 | 0.8246 | 0.8277 | 0.8262 | 0.8747 |
0.0207 | 107.0 | 428 | 0.7331 | 0.8246 | 0.8277 | 0.8262 | 0.8747 |
0.0206 | 108.0 | 432 | 0.7329 | 0.8315 | 0.8315 | 0.8315 | 0.8769 |
0.0208 | 109.0 | 436 | 0.7325 | 0.8352 | 0.8352 | 0.8352 | 0.8790 |
0.0208 | 110.0 | 440 | 0.7315 | 0.8315 | 0.8315 | 0.8315 | 0.8769 |
0.0205 | 111.0 | 444 | 0.7305 | 0.8352 | 0.8352 | 0.8352 | 0.8790 |
0.0202 | 112.0 | 448 | 0.7284 | 0.8352 | 0.8352 | 0.8352 | 0.8790 |
0.0202 | 113.0 | 452 | 0.7284 | 0.8253 | 0.8315 | 0.8284 | 0.8790 |
0.0201 | 114.0 | 456 | 0.7285 | 0.8284 | 0.8315 | 0.8299 | 0.8769 |
0.0199 | 115.0 | 460 | 0.7308 | 0.8284 | 0.8315 | 0.8299 | 0.8769 |
0.0196 | 116.0 | 464 | 0.7346 | 0.8284 | 0.8315 | 0.8299 | 0.8769 |
0.0197 | 117.0 | 468 | 0.7371 | 0.8284 | 0.8315 | 0.8299 | 0.8747 |
0.0195 | 118.0 | 472 | 0.7384 | 0.8284 | 0.8315 | 0.8299 | 0.8747 |
0.0197 | 119.0 | 476 | 0.7383 | 0.8284 | 0.8315 | 0.8299 | 0.8747 |
0.0198 | 120.0 | 480 | 0.7379 | 0.8284 | 0.8315 | 0.8299 | 0.8769 |
0.0194 | 121.0 | 484 | 0.7372 | 0.8216 | 0.8277 | 0.8246 | 0.8747 |
0.0192 | 122.0 | 488 | 0.7364 | 0.8216 | 0.8277 | 0.8246 | 0.8747 |
0.0192 | 123.0 | 492 | 0.7350 | 0.8216 | 0.8277 | 0.8246 | 0.8747 |
0.0192 | 124.0 | 496 | 0.7344 | 0.8216 | 0.8277 | 0.8246 | 0.8747 |
0.0189 | 125.0 | 500 | 0.7340 | 0.8216 | 0.8277 | 0.8246 | 0.8769 |
0.0185 | 126.0 | 504 | 0.7343 | 0.8216 | 0.8277 | 0.8246 | 0.8769 |
0.0184 | 127.0 | 508 | 0.7343 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0188 | 128.0 | 512 | 0.7334 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0188 | 129.0 | 516 | 0.7328 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0187 | 130.0 | 520 | 0.7331 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0188 | 131.0 | 524 | 0.7324 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0188 | 132.0 | 528 | 0.7325 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0187 | 133.0 | 532 | 0.7317 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0186 | 134.0 | 536 | 0.7310 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0184 | 135.0 | 540 | 0.7308 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0182 | 136.0 | 544 | 0.7307 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.018 | 137.0 | 548 | 0.7312 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0181 | 138.0 | 552 | 0.7324 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0182 | 139.0 | 556 | 0.7333 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0182 | 140.0 | 560 | 0.7344 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0184 | 141.0 | 564 | 0.7351 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0182 | 142.0 | 568 | 0.7358 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0179 | 143.0 | 572 | 0.7358 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0178 | 144.0 | 576 | 0.7357 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0183 | 145.0 | 580 | 0.7356 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0183 | 146.0 | 584 | 0.7355 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0177 | 147.0 | 588 | 0.7354 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0179 | 148.0 | 592 | 0.7355 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0183 | 149.0 | 596 | 0.7355 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
0.0179 | 150.0 | 600 | 0.7355 | 0.8284 | 0.8315 | 0.8299 | 0.8790 |
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3