---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: tarsssss/eng-jagoy-t5-001
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# tarsssss/eng-jagoy-t5-001

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results after the final training epoch:
- Train Loss: 5.6945
- Validation Loss: 5.6383
- Epoch: 40

## Model description

More information needed

## Intended uses & limitations

More information needed
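
Pending details from the author, here is a minimal inference sketch. It is a hypothetical usage example, not documented by the card: it assumes the checkpoint is publicly downloadable from the Hub and that `transformers` with TensorFlow support is installed.

```python
model_id = "tarsssss/eng-jagoy-t5-001"  # this repository's checkpoint

def translate(text: str, max_new_tokens: int = 64) -> str:
    """Generate a translation with the fine-tuned checkpoint (downloads weights)."""
    # Imported lazily so the sketch can be read without TensorFlow installed.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)
    batch = tokenizer(text, return_tensors="tf")
    out = model.generate(**batch, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Given the repository name, the source language is presumably English, but the expected input format (e.g. a T5 task prefix) is not documented here.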

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 2e-05, decay: 0.0, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay_rate: 0.01)
- training_precision: float32
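
The optimizer settings above, restated as a plain dict for reference. Rebuilding the optimizer itself would use `AdamWeightDecay` from `transformers` (TensorFlow/Keras), as the card's `name` field indicates; the construction shown in comments is a sketch under that assumption.

```python
# Hyperparameter values copied verbatim from the card.
optimizer_config = {
    "name": "AdamWeightDecay",
    "learning_rate": 2e-05,
    "decay": 0.0,
    "beta_1": 0.9,
    "beta_2": 0.999,
    "epsilon": 1e-07,
    "amsgrad": False,
    "weight_decay_rate": 0.01,
}

# To recreate it (requires transformers with TensorFlow support):
# from transformers import AdamWeightDecay
# optimizer = AdamWeightDecay(
#     learning_rate=optimizer_config["learning_rate"],
#     weight_decay_rate=optimizer_config["weight_decay_rate"],
#     beta_1=optimizer_config["beta_1"],
#     beta_2=optimizer_config["beta_2"],
#     epsilon=optimizer_config["epsilon"],
# )
```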

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.8603     | 7.4105          | 0     |
| 7.3775     | 7.1273          | 1     |
| 7.1632     | 6.9598          | 2     |
| 7.0228     | 6.8372          | 3     |
| 6.9085     | 6.7335          | 4     |
| 6.8226     | 6.6458          | 5     |
| 6.7451     | 6.5671          | 6     |
| 6.6785     | 6.5022          | 7     |
| 6.6254     | 6.4409          | 8     |
| 6.5606     | 6.3842          | 9     |
| 6.5163     | 6.3361          | 10    |
| 6.4682     | 6.2908          | 11    |
| 6.4250     | 6.2436          | 12    |
| 6.3749     | 6.1907          | 13    |
| 6.3293     | 6.1494          | 14    |
| 6.2822     | 6.1098          | 15    |
| 6.2560     | 6.0750          | 16    |
| 6.2078     | 6.0508          | 17    |
| 6.1839     | 6.0229          | 18    |
| 6.1561     | 5.9944          | 19    |
| 6.1146     | 5.9732          | 20    |
| 6.0885     | 5.9490          | 21    |
| 6.0587     | 5.9243          | 22    |
| 6.0366     | 5.9064          | 23    |
| 6.0135     | 5.8857          | 24    |
| 5.9904     | 5.8675          | 25    |
| 5.9681     | 5.8482          | 26    |
| 5.9473     | 5.8262          | 27    |
| 5.9263     | 5.8127          | 28    |
| 5.9031     | 5.7896          | 29    |
| 5.8827     | 5.7721          | 30    |
| 5.8566     | 5.7482          | 31    |
| 5.8406     | 5.7355          | 32    |
| 5.8285     | 5.7231          | 33    |
| 5.7944     | 5.7049          | 34    |
| 5.7822     | 5.6968          | 35    |
| 5.7567     | 5.6813          | 36    |
| 5.7526     | 5.6650          | 37    |
| 5.7363     | 5.6614          | 38    |
| 5.7132     | 5.6398          | 39    |
| 5.6945     | 5.6383          | 40    |
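
Validation loss was still falling at the last recorded epoch, which suggests training had not yet converged when it stopped. A quick sanity check on a sample of the table (values copied from above):

```python
# Validation loss at every tenth epoch, copied from the results table.
val_loss_by_epoch = {0: 7.4105, 10: 6.3361, 20: 5.9732, 30: 5.7721, 40: 5.6383}

losses = list(val_loss_by_epoch.values())
# True if each sampled epoch improves on the previous one.
still_improving = all(a > b for a, b in zip(losses, losses[1:]))
print(still_improving)  # prints True: the curve is still decreasing at epoch 40
```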


### Framework versions

- Transformers 4.33.2
- TensorFlow 2.10.0
- Datasets 2.15.0
- Tokenizers 0.13.3