---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: tarsssss/eng-jagoy-t5-001
results: []
---
# tarsssss/eng-jagoy-t5-001
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 4.7399
- Validation Loss: 5.1356
- Epoch: 138
## Model description
More information needed
## Intended uses & limitations
More information needed
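Pending details from the author, the repo name suggests an English-to-Jagoy translation model, though this card does not confirm the task. Below is a minimal loading sketch under that assumption, using the standard TF auto classes; the input sentence is purely illustrative.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Assumption: the checkpoint translates English to Jagoy, as the repo
# name "eng-jagoy" implies; the card itself does not state the task.
tokenizer = AutoTokenizer.from_pretrained("tarsssss/eng-jagoy-t5-001")
model = TFAutoModelForSeq2SeqLM.from_pretrained("tarsssss/eng-jagoy-t5-001")

inputs = tokenizer("Hello, how are you?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```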
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
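The optimizer dictionary above matches the `AdamWeightDecay` class that `transformers` provides for Keras training. A minimal sketch reconstructing it is shown below; the `compile` call is illustrative, since the author's actual training script is not included in this card.

```python
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

# Values copied from the hyperparameter list above; `decay` and `amsgrad`
# are left at their defaults (0.0 and False), matching the listed config.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    weight_decay_rate=0.01,
)

model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
model.compile(optimizer=optimizer)  # float32 training precision is the Keras default
```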
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.8603 | 7.4105 | 0 |
| 7.3775 | 7.1273 | 1 |
| 7.1632 | 6.9598 | 2 |
| 7.0228 | 6.8372 | 3 |
| 6.9085 | 6.7335 | 4 |
| 6.8226 | 6.6458 | 5 |
| 6.7451 | 6.5671 | 6 |
| 6.6785 | 6.5022 | 7 |
| 6.6254 | 6.4409 | 8 |
| 6.5606 | 6.3842 | 9 |
| 6.5163 | 6.3361 | 10 |
| 6.4682 | 6.2908 | 11 |
| 6.4250 | 6.2436 | 12 |
| 6.3749 | 6.1907 | 13 |
| 6.3293 | 6.1494 | 14 |
| 6.2822 | 6.1098 | 15 |
| 6.2560 | 6.0750 | 16 |
| 6.2078 | 6.0508 | 17 |
| 6.1839 | 6.0229 | 18 |
| 6.1561 | 5.9944 | 19 |
| 6.1146 | 5.9732 | 20 |
| 6.0885 | 5.9490 | 21 |
| 6.0587 | 5.9243 | 22 |
| 6.0366 | 5.9064 | 23 |
| 6.0135 | 5.8857 | 24 |
| 5.9904 | 5.8675 | 25 |
| 5.9681 | 5.8482 | 26 |
| 5.9473 | 5.8262 | 27 |
| 5.9263 | 5.8127 | 28 |
| 5.9031 | 5.7896 | 29 |
| 5.8827 | 5.7721 | 30 |
| 5.8566 | 5.7482 | 31 |
| 5.8406 | 5.7355 | 32 |
| 5.8285 | 5.7231 | 33 |
| 5.7944 | 5.7049 | 34 |
| 5.7822 | 5.6968 | 35 |
| 5.7567 | 5.6813 | 36 |
| 5.7526 | 5.6650 | 37 |
| 5.7363 | 5.6614 | 38 |
| 5.7132 | 5.6398 | 39 |
| 5.6945 | 5.6383 | 40 |
| 5.6786 | 5.6243 | 41 |
| 5.6636 | 5.6071 | 42 |
| 5.6527 | 5.5955 | 43 |
| 5.6390 | 5.5876 | 44 |
| 5.6198 | 5.5754 | 45 |
| 5.6082 | 5.5663 | 46 |
| 5.6070 | 5.5572 | 47 |
| 5.5782 | 5.5493 | 48 |
| 5.5679 | 5.5487 | 49 |
| 5.5520 | 5.5301 | 50 |
| 5.5307 | 5.5261 | 51 |
| 5.5284 | 5.5089 | 52 |
| 5.5160 | 5.5003 | 53 |
| 5.4976 | 5.4981 | 54 |
| 5.4864 | 5.4860 | 55 |
| 5.4795 | 5.4816 | 56 |
| 5.4653 | 5.4652 | 57 |
| 5.4484 | 5.4639 | 58 |
| 5.4335 | 5.4580 | 59 |
| 5.4231 | 5.4454 | 60 |
| 5.4132 | 5.4358 | 61 |
| 5.4064 | 5.4349 | 62 |
| 5.3886 | 5.4261 | 63 |
| 5.3913 | 5.4193 | 64 |
| 5.3692 | 5.4138 | 65 |
| 5.3556 | 5.4028 | 66 |
| 5.3469 | 5.4001 | 67 |
| 5.3421 | 5.3942 | 68 |
| 5.3194 | 5.3826 | 69 |
| 5.3243 | 5.3799 | 70 |
| 5.3081 | 5.3713 | 71 |
| 5.2921 | 5.3737 | 72 |
| 5.2845 | 5.3681 | 73 |
| 5.2754 | 5.3601 | 74 |
| 5.2594 | 5.3524 | 75 |
| 5.2527 | 5.3420 | 76 |
| 5.2496 | 5.3367 | 77 |
| 5.2360 | 5.3320 | 78 |
| 5.2193 | 5.3253 | 79 |
| 5.2141 | 5.3178 | 80 |
| 5.1993 | 5.3150 | 81 |
| 5.1923 | 5.3157 | 82 |
| 5.1875 | 5.3097 | 83 |
| 5.1776 | 5.3051 | 84 |
| 5.1693 | 5.3050 | 85 |
| 5.1533 | 5.3115 | 86 |
| 5.1567 | 5.2943 | 87 |
| 5.1348 | 5.2757 | 88 |
| 5.1317 | 5.2849 | 89 |
| 5.1191 | 5.2846 | 90 |
| 5.1102 | 5.2742 | 91 |
| 5.1054 | 5.2725 | 92 |
| 5.0944 | 5.2624 | 93 |
| 5.0906 | 5.2560 | 94 |
| 5.0712 | 5.2502 | 95 |
| 5.0719 | 5.2495 | 96 |
| 5.0628 | 5.2498 | 97 |
| 5.0597 | 5.2454 | 98 |
| 5.0402 | 5.2420 | 99 |
| 5.0308 | 5.2441 | 100 |
| 5.0193 | 5.2379 | 101 |
| 5.0198 | 5.2298 | 102 |
| 5.0110 | 5.2315 | 103 |
| 5.0087 | 5.2304 | 104 |
| 4.9906 | 5.2261 | 105 |
| 4.9883 | 5.2288 | 106 |
| 4.9818 | 5.2069 | 107 |
| 4.9612 | 5.2003 | 108 |
| 4.9560 | 5.2009 | 109 |
| 4.9453 | 5.2123 | 110 |
| 4.9385 | 5.2136 | 111 |
| 4.9238 | 5.2178 | 112 |
| 4.9291 | 5.1994 | 113 |
| 4.9097 | 5.1940 | 114 |
| 4.9093 | 5.1840 | 115 |
| 4.9057 | 5.1824 | 116 |
| 4.8907 | 5.1894 | 117 |
| 4.8919 | 5.1841 | 118 |
| 4.8699 | 5.1806 | 119 |
| 4.8671 | 5.1795 | 120 |
| 4.8629 | 5.1696 | 121 |
| 4.8552 | 5.1646 | 122 |
| 4.8414 | 5.1709 | 123 |
| 4.8444 | 5.1534 | 124 |
| 4.8330 | 5.1698 | 125 |
| 4.8231 | 5.1501 | 126 |
| 4.8198 | 5.1565 | 127 |
| 4.8004 | 5.1522 | 128 |
| 4.7996 | 5.1478 | 129 |
| 4.7915 | 5.1409 | 130 |
| 4.7845 | 5.1484 | 131 |
| 4.7837 | 5.1476 | 132 |
| 4.7727 | 5.1446 | 133 |
| 4.7729 | 5.1379 | 134 |
| 4.7628 | 5.1379 | 135 |
| 4.7568 | 5.1359 | 136 |
| 4.7400 | 5.1292 | 137 |
| 4.7399 | 5.1356 | 138 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.10.0
- Datasets 2.15.0
- Tokenizers 0.13.3
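For reproducibility, the exact versions above can be verified at runtime; a quick check (version strings copied from the list above):

```python
import datasets
import tensorflow
import tokenizers
import transformers

# The card was generated with these versions; newer releases will often
# work but were not used for this training run.
assert transformers.__version__ == "4.33.2"
assert tensorflow.__version__ == "2.10.0"
assert datasets.__version__ == "2.15.0"
assert tokenizers.__version__ == "0.13.3"
```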