|
---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dbmdz_convbert-base-turkish-mc4-cased_allnli_tr
  results: []
---
|
|
|
|
|
|
# dbmdz_convbert-base-turkish-mc4-cased_allnli_tr |
|
|
|
This model is a fine-tuned version of [dbmdz/convbert-base-turkish-mc4-cased](https://huggingface.co/dbmdz/convbert-base-turkish-mc4-cased) on a Turkish natural language inference dataset (the `allnli_tr` suffix in the model name points to a Turkish translation of the AllNLI corpus).
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.5541 |
|
- Accuracy: 0.8111 |
|
|
|
## Model description |
|
|
|
The base model is a cased ConvBERT (base size) pretrained by dbmdz on the Turkish portion of the mC4 corpus. This checkpoint adds a sequence-classification head fine-tuned for Turkish natural language inference, reaching 0.8111 accuracy on the evaluation set.
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for Turkish sentence-pair classification in the NLI setting: given a premise and a hypothesis, it predicts their inference relation (typically entailment / neutral / contradiction). Limitations are not documented; behaviour outside the fine-tuning domain and on adversarial inputs is unknown.
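
A minimal inference sketch, assuming the checkpoint exposes the standard `AutoModelForSequenceClassification` interface and takes a premise/hypothesis pair; the repo path, example sentences, and label mapping below are placeholders, not settings taken from this card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder repo id; substitute the actual Hub path of this checkpoint.
model_id = "dbmdz_convbert-base-turkish-mc4-cased_allnli_tr"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

premise = "Bir adam parkta gitar çalıyor."   # "A man is playing guitar in the park."
hypothesis = "Bir adam müzik yapıyor."       # "A man is making music."

# Sentence pairs are encoded together, as in standard NLI fine-tuning.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])  # label names come from the checkpoint's config
```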
|
|
|
## Training and evaluation data |
|
|
|
Not documented by the Trainer. The `allnli_tr` suffix in the model name suggests a Turkish translation of the AllNLI data (the concatenation of SNLI and MultiNLI); the evaluation split used for the results below is likewise unrecorded.
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
|
- learning_rate: 2e-05 |
|
- train_batch_size: 32 |
|
- eval_batch_size: 32 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- num_epochs: 3 |
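
A sketch of how these values map onto `TrainingArguments`; the `output_dir` is a placeholder and the 1000-step evaluation cadence is inferred from the results table below rather than from recorded settings:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the run configuration listed above.
training_args = TrainingArguments(
    output_dir="convbert-base-turkish-mc4-cased_allnli_tr",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=3,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # evaluate every 1000 steps, as in the table below
    eval_steps=1000,
)
```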
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7338 | 0.03 | 1000 | 0.6722 | 0.7236 |
| 0.603 | 0.07 | 2000 | 0.6465 | 0.7399 |
| 0.5605 | 0.1 | 3000 | 0.5801 | 0.7728 |
| 0.55 | 0.14 | 4000 | 0.5994 | 0.7626 |
| 0.529 | 0.17 | 5000 | 0.5720 | 0.7697 |
| 0.5196 | 0.2 | 6000 | 0.5692 | 0.7769 |
| 0.5117 | 0.24 | 7000 | 0.5725 | 0.7785 |
| 0.5044 | 0.27 | 8000 | 0.5532 | 0.7787 |
| 0.5016 | 0.31 | 9000 | 0.5546 | 0.7812 |
| 0.5031 | 0.34 | 10000 | 0.5461 | 0.7870 |
| 0.4949 | 0.37 | 11000 | 0.5725 | 0.7826 |
| 0.4894 | 0.41 | 12000 | 0.5419 | 0.7933 |
| 0.4796 | 0.44 | 13000 | 0.5278 | 0.7914 |
| 0.4795 | 0.48 | 14000 | 0.5193 | 0.7953 |
| 0.4713 | 0.51 | 15000 | 0.5534 | 0.7771 |
| 0.4738 | 0.54 | 16000 | 0.5098 | 0.8039 |
| 0.481 | 0.58 | 17000 | 0.5244 | 0.7958 |
| 0.4634 | 0.61 | 18000 | 0.5215 | 0.7972 |
| 0.465 | 0.65 | 19000 | 0.5129 | 0.7985 |
| 0.4624 | 0.68 | 20000 | 0.5062 | 0.8047 |
| 0.4597 | 0.71 | 21000 | 0.5114 | 0.8029 |
| 0.4571 | 0.75 | 22000 | 0.5070 | 0.8073 |
| 0.4602 | 0.78 | 23000 | 0.5115 | 0.7993 |
| 0.4552 | 0.82 | 24000 | 0.5085 | 0.8052 |
| 0.4538 | 0.85 | 25000 | 0.5118 | 0.7974 |
| 0.4517 | 0.88 | 26000 | 0.5036 | 0.8044 |
| 0.4517 | 0.92 | 27000 | 0.4930 | 0.8062 |
| 0.4413 | 0.95 | 28000 | 0.5307 | 0.7964 |
| 0.4483 | 0.99 | 29000 | 0.5195 | 0.7938 |
| 0.4036 | 1.02 | 30000 | 0.5238 | 0.8029 |
| 0.3724 | 1.05 | 31000 | 0.5125 | 0.8082 |
| 0.3777 | 1.09 | 32000 | 0.5099 | 0.8075 |
| 0.3753 | 1.12 | 33000 | 0.5172 | 0.8053 |
| 0.367 | 1.15 | 34000 | 0.5188 | 0.8053 |
| 0.3819 | 1.19 | 35000 | 0.5218 | 0.8046 |
| 0.363 | 1.22 | 36000 | 0.5202 | 0.7993 |
| 0.3794 | 1.26 | 37000 | 0.5240 | 0.8048 |
| 0.3749 | 1.29 | 38000 | 0.5026 | 0.8054 |
| 0.367 | 1.32 | 39000 | 0.5198 | 0.8075 |
| 0.3759 | 1.36 | 40000 | 0.5298 | 0.7993 |
| 0.3701 | 1.39 | 41000 | 0.5072 | 0.8091 |
| 0.3742 | 1.43 | 42000 | 0.5071 | 0.8098 |
| 0.3706 | 1.46 | 43000 | 0.5317 | 0.8037 |
| 0.3716 | 1.49 | 44000 | 0.5034 | 0.8052 |
| 0.3717 | 1.53 | 45000 | 0.5258 | 0.8012 |
| 0.3714 | 1.56 | 46000 | 0.5195 | 0.8050 |
| 0.3781 | 1.6 | 47000 | 0.5004 | 0.8104 |
| 0.3725 | 1.63 | 48000 | 0.5124 | 0.8113 |
| 0.3624 | 1.66 | 49000 | 0.5040 | 0.8094 |
| 0.3657 | 1.7 | 50000 | 0.4979 | 0.8111 |
| 0.3669 | 1.73 | 51000 | 0.4968 | 0.8100 |
| 0.3636 | 1.77 | 52000 | 0.5075 | 0.8079 |
| 0.36 | 1.8 | 53000 | 0.4985 | 0.8110 |
| 0.3624 | 1.83 | 54000 | 0.5125 | 0.8070 |
| 0.366 | 1.87 | 55000 | 0.4918 | 0.8117 |
| 0.3655 | 1.9 | 56000 | 0.5051 | 0.8109 |
| 0.3609 | 1.94 | 57000 | 0.5083 | 0.8105 |
| 0.3672 | 1.97 | 58000 | 0.5129 | 0.8085 |
| 0.3545 | 2.0 | 59000 | 0.5467 | 0.8109 |
| 0.2938 | 2.04 | 60000 | 0.5635 | 0.8049 |
| 0.29 | 2.07 | 61000 | 0.5781 | 0.8041 |
| 0.2992 | 2.11 | 62000 | 0.5470 | 0.8077 |
| 0.2957 | 2.14 | 63000 | 0.5765 | 0.8073 |
| 0.292 | 2.17 | 64000 | 0.5472 | 0.8106 |
| 0.2893 | 2.21 | 65000 | 0.5590 | 0.8085 |
| 0.2883 | 2.24 | 66000 | 0.5535 | 0.8064 |
| 0.2923 | 2.28 | 67000 | 0.5508 | 0.8095 |
| 0.2868 | 2.31 | 68000 | 0.5679 | 0.8098 |
| 0.2892 | 2.34 | 69000 | 0.5660 | 0.8057 |
| 0.292 | 2.38 | 70000 | 0.5494 | 0.8088 |
| 0.286 | 2.41 | 71000 | 0.5653 | 0.8085 |
| 0.2939 | 2.45 | 72000 | 0.5673 | 0.8070 |
| 0.286 | 2.48 | 73000 | 0.5600 | 0.8092 |
| 0.2844 | 2.51 | 74000 | 0.5508 | 0.8095 |
| 0.2913 | 2.55 | 75000 | 0.5645 | 0.8088 |
| 0.2859 | 2.58 | 76000 | 0.5677 | 0.8095 |
| 0.2892 | 2.62 | 77000 | 0.5598 | 0.8113 |
| 0.2898 | 2.65 | 78000 | 0.5618 | 0.8096 |
| 0.2814 | 2.68 | 79000 | 0.5664 | 0.8103 |
| 0.2917 | 2.72 | 80000 | 0.5484 | 0.8122 |
| 0.2907 | 2.75 | 81000 | 0.5522 | 0.8116 |
| 0.2896 | 2.79 | 82000 | 0.5540 | 0.8093 |
| 0.2907 | 2.82 | 83000 | 0.5469 | 0.8104 |
| 0.2882 | 2.85 | 84000 | 0.5471 | 0.8122 |
| 0.2878 | 2.89 | 85000 | 0.5532 | 0.8108 |
| 0.2858 | 2.92 | 86000 | 0.5511 | 0.8115 |
| 0.288 | 2.96 | 87000 | 0.5491 | 0.8111 |
| 0.2834 | 2.99 | 88000 | 0.5541 | 0.8111 |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.12.3 |
|
- Pytorch 1.10.0+cu102 |
|
- Datasets 1.15.1 |
|
- Tokenizers 0.10.3 |
|
|