---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: cptanalatriste/request-for-help
  results: []
---

# cptanalatriste/request-for-help

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1342
- Train Sparse Categorical Accuracy: 1.0
- Validation Loss: 0.1514
- Validation Sparse Categorical Accuracy: 0.9796
- Epoch: 19

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 3e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32

### Training results

| Train Loss | Train Sparse Categorical Accuracy | Validation Loss | Validation Sparse Categorical Accuracy | Epoch |
|:----------:|:---------------------------------:|:---------------:|:--------------------------------------:|:-----:|
| 0.8291     | 0.375                             | 0.7483          | 0.3673                                 | 0     |
| 0.7470     | 0.375                             | 0.6302          | 0.8163                                 | 1     |
| 0.6504     | 0.625                             | 0.6079          | 0.9184                                 | 2     |
| 0.6128     | 0.7812                            | 0.5882          | 0.8980                                 | 3     |
| 0.5939     | 0.8125                            | 0.5639          | 0.9184                                 | 4     |
| 0.5300     | 0.9688                            | 0.5378          | 0.9184                                 | 5     |
| 0.5306     | 0.9688                            | 0.5098          | 0.9388                                 | 6     |
| 0.4963     | 1.0                               | 0.4806          | 0.9388                                 | 7     |
| 0.4683     | 0.9688                            | 0.4434          | 0.9592                                 | 8     |
| 0.3959     | 1.0                               | 0.4070          | 0.9796                                 | 9     |
| 0.3807     | 1.0                               | 0.3762          | 0.9796                                 | 10    |
| 0.3509     | 1.0                               | 0.3439          | 0.9796                                 | 11    |
| 0.3013     | 1.0                               | 0.3064          | 0.9796                                 | 12    |
| 0.2848     | 1.0                               | 0.2931          | 0.9796                                 | 13    |
| 0.2587     | 1.0                               | 0.2681          | 0.9796                                 | 14    |
| 0.2510     | 1.0                               | 0.2295          | 0.9796                                 | 15    |
| 0.1867     | 1.0                               | 0.2000          | 0.9796                                 | 16    |
| 0.1652     | 1.0                               | 0.1793          | 0.9796                                 | 17    |
| 0.1297     | 1.0                               | 0.1637          | 0.9796                                 | 18    |
| 0.1342     | 1.0                               | 0.1514          | 0.9796                                 | 19    |

### Framework versions

- Transformers 4.17.0
- TensorFlow 2.6.2
- Datasets 1.18.4
- Tokenizers 0.11.6
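
The optimizer settings listed under "Training hyperparameters" above can be reconstructed in Keras as sketched below. Only the optimizer values and the float32 precision come from this card; the loss function and `num_labels` are assumptions, chosen to be consistent with the sparse categorical accuracy metric reported in the training results.

```python
# Sketch of the reported optimizer configuration in TensorFlow/Keras.
# The loss and num_labels are assumptions; only the optimizer settings and
# float32 training precision are stated in the card.
import tensorflow as tf
from transformers import TFAutoModelForSequenceClassification

model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # assumed; the label set is not documented in the card
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=3e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)

model.compile(
    optimizer=optimizer,
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=[tf.keras.metrics.SparseCategoricalAccuracy()],
)
```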
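
Below is a minimal usage sketch. The card does not document the task, the label set, or the model head, so the sequence-classification head, the example sentence, and the meaning of the predicted class index are assumptions made for illustration.

```python
# Minimal usage sketch, assuming a sequence-classification head and TF weights
# in the repository. The example input and label interpretation are assumptions.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("cptanalatriste/request-for-help")
model = TFAutoModelForSequenceClassification.from_pretrained("cptanalatriste/request-for-help")

# Tokenize a sample sentence (hypothetical input) and run a forward pass.
inputs = tokenizer("Could someone please help me with this?", return_tensors="tf")
logits = model(**inputs).logits

# Index of the highest-scoring class; label meanings are not documented here.
predicted_class = int(tf.argmax(logits, axis=-1)[0])
print(predicted_class)
```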