---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: layoutlm-funsd-tf
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# layoutlm-funsd-tf
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased), most likely on the FUNSD form-understanding dataset (the dataset was not recorded by the callback, but the model name suggests FUNSD).
It achieves the following results on the evaluation set:
- Train Loss: 0.0961
- Validation Loss: 1.5766
- Train Overall Precision: 0.5302
- Train Overall Recall: 0.6121
- Train Overall F1: 0.5682
- Train Overall Accuracy: 0.6392
- Epoch: 31
## Model description
This checkpoint is a TensorFlow/Keras fine-tune of LayoutLM (base, uncased), a transformer that combines text embeddings with 2-D layout (bounding-box) embeddings for document understanding. The reported precision/recall/F1/accuracy figures indicate a token-classification (entity-labeling) head; no further description was provided by the author.
## Intended uses & limitations
More information needed
## Training and evaluation data
Not documented by the author. The model name suggests FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of annotated scanned forms commonly used for entity labeling with LayoutLM; the exact splits and preprocessing are unknown.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
### Training results
| Train Loss | Validation Loss | Train Overall Precision | Train Overall Recall | Train Overall F1 | Train Overall Accuracy | Epoch |
|:----------:|:---------------:|:-----------------------:|:--------------------:|:----------------:|:----------------------:|:-----:|
| 1.7401 | 1.5464 | 0.0930 | 0.1174 | 0.1038 | 0.3843 | 0 |
| 1.4833 | 1.3301 | 0.2414 | 0.3964 | 0.3000 | 0.4268 | 1 |
| 1.2693 | 1.2622 | 0.2985 | 0.4947 | 0.3724 | 0.4693 | 2 |
| 1.1369 | 1.0729 | 0.3617 | 0.4887 | 0.4157 | 0.5902 | 3 |
| 1.0364 | 1.1800 | 0.3293 | 0.5073 | 0.3994 | 0.5604 | 4 |
| 0.9327 | 1.2033 | 0.3938 | 0.5268 | 0.4507 | 0.5683 | 5 |
| 0.8211 | 1.0876 | 0.4192 | 0.5153 | 0.4623 | 0.6004 | 6 |
| 0.7265 | 1.0982 | 0.4480 | 0.5334 | 0.4869 | 0.6102 | 7 |
| 0.6561 | 1.1134 | 0.4490 | 0.5650 | 0.5003 | 0.6192 | 8 |
| 0.5783 | 1.0834 | 0.4764 | 0.5630 | 0.5161 | 0.6317 | 9 |
| 0.5160 | 1.1453 | 0.4504 | 0.5494 | 0.4950 | 0.6227 | 10 |
| 0.4714 | 1.1865 | 0.4873 | 0.5981 | 0.5371 | 0.6277 | 11 |
| 0.4340 | 1.2212 | 0.4972 | 0.5805 | 0.5356 | 0.6318 | 12 |
| 0.3990 | 1.2407 | 0.4913 | 0.6212 | 0.5486 | 0.6334 | 13 |
| 0.3743 | 1.2597 | 0.5173 | 0.5986 | 0.5550 | 0.6338 | 14 |
| 0.3454 | 1.2205 | 0.5157 | 0.6106 | 0.5592 | 0.6406 | 15 |
| 0.3276 | 1.3600 | 0.5186 | 0.6001 | 0.5564 | 0.6318 | 16 |
| 0.3013 | 1.6473 | 0.4805 | 0.5745 | 0.5233 | 0.5899 | 17 |
| 0.3093 | 1.2595 | 0.4957 | 0.5735 | 0.5318 | 0.6389 | 18 |
| 0.2577 | 1.4449 | 0.4772 | 0.5675 | 0.5185 | 0.6076 | 19 |
| 0.2301 | 1.4514 | 0.4790 | 0.5620 | 0.5172 | 0.6205 | 20 |
| 0.2118 | 1.4575 | 0.5255 | 0.5991 | 0.5599 | 0.6305 | 21 |
| 0.1845 | 1.4446 | 0.5270 | 0.6076 | 0.5644 | 0.6353 | 22 |
| 0.1698 | 1.4538 | 0.5428 | 0.6011 | 0.5705 | 0.6423 | 23 |
| 0.1606 | 1.4318 | 0.5131 | 0.5720 | 0.5409 | 0.6361 | 24 |
| 0.1538 | 1.4257 | 0.5310 | 0.6061 | 0.5661 | 0.6484 | 25 |
| 0.1403 | 1.5233 | 0.5232 | 0.6061 | 0.5616 | 0.6428 | 26 |
| 0.1229 | 1.4796 | 0.5547 | 0.6131 | 0.5825 | 0.6471 | 27 |
| 0.1225 | 1.5841 | 0.5239 | 0.5946 | 0.5570 | 0.6101 | 28 |
| 0.1085 | 1.5432 | 0.5253 | 0.6046 | 0.5622 | 0.6423 | 29 |
| 0.1025 | 1.5414 | 0.5176 | 0.5966 | 0.5543 | 0.6312 | 30 |
| 0.0961 | 1.5766 | 0.5302 | 0.6121 | 0.5682 | 0.6392 | 31 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.17.0
- Tokenizers 0.15.2