|
--- |
|
tags: |
|
- generated_from_trainer |
|
model-index: |
|
- name: full-lstm-3 |
|
results: [] |
|
--- |
|
|
|
|
|
|
# full-lstm-3 |
|
|
|
This model is a fine-tuned version of an unspecified base model on an unspecified dataset.
|
It achieves the following results on the evaluation set: |
|
- Loss: 3.9685 (perplexity ≈ 52.9, assuming this is the mean per-token cross-entropy)
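
The perplexity figure follows directly from the loss; a minimal sketch of the conversion, assuming the reported loss is the mean per-token cross-entropy in nats:

```python
import math

# Perplexity is exp(loss) for a mean per-token cross-entropy in nats
# (an assumption; the card does not state the loss definition).
eval_loss = 3.9685
print(f"perplexity ≈ {math.exp(eval_loss):.1f}")  # ≈ 52.9
```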
|
|
|
## Model description |
|
|
|
More information needed |
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
|
- learning_rate: 5e-05 |
|
- train_batch_size: 32 |
|
- eval_batch_size: 32 |
|
- seed: 3 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- training_steps: 3052726 |
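
For reproducibility, the values above map onto 🤗 `TrainingArguments` roughly as follows. This is a sketch, assuming the single-device `Trainer` setup implied by the `generated_from_trainer` tag, with `output_dir` as a placeholder:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above.
# Anything not recorded in the card is left at its transformers-4.33
# default; train_batch_size is read as per-device (an assumption).
training_args = TrainingArguments(
    output_dir="full-lstm-3",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=3,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    max_steps=3052726,
)
```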
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | |
|
|:-------------:|:-----:|:-------:|:---------------:| |
|
| 4.7832 | 0.03 | 76319 | 4.7485 | |
|
| 4.5027 | 0.03 | 152638 | 4.4708 | |
|
| 4.3608 | 0.03 | 228957 | 4.3370 | |
|
| 4.2693 | 1.03 | 305276 | 4.2557 | |
|
| 4.2052 | 0.03 | 381595 | 4.2002 | |
|
| 4.1539 | 1.03 | 457914 | 4.1587 | |
|
| 4.1188 | 0.03 | 534233 | 4.1278 | |
|
| 4.0903 | 0.03 | 610552 | 4.1043 | |
|
| 4.0598 | 1.03 | 686871 | 4.0848 | |
|
| 4.036 | 0.03 | 763190 | 4.0675 | |
|
| 4.0172 | 1.03 | 839509 | 4.0550 | |
|
| 4.001 | 0.03 | 915828 | 4.0447 | |
|
| 3.9809 | 0.03 | 992147 | 4.0355 | |
|
| 3.9667 | 0.03 | 1068467 | 4.0263 | |
|
| 3.9546 | 1.03 | 1144787 | 4.0188 | |
|
| 3.9525 | 0.03 | 1221107 | 4.0124 | |
|
| 3.9332 | 1.03 | 1297427 | 4.0074 | |
|
| 3.9251 | 0.03 | 1373747 | 4.0028 | |
|
| 3.9148 | 1.03 | 1450067 | 3.9989 | |
|
| 3.9065 | 0.03 | 1526387 | 3.9954 | |
|
| 3.9044 | 1.03 | 1602707 | 3.9925 | |
|
| 3.8995 | 0.03 | 1679027 | 3.9900 | |
|
| 3.8994 | 0.03 | 1755347 | 3.9872 | |
|
| 3.895 | 1.03 | 1831667 | 3.9849 | |
|
| 3.8861 | 0.03 | 1907987 | 3.9832 | |
|
| 3.8793 | 1.03 | 1984307 | 3.9809 | |
|
| 3.8748 | 0.03 | 2060627 | 3.9785 | |
|
| 3.8675 | 1.03 | 2136947 | 3.9774 | |
|
| 3.8656 | 0.03 | 2213267 | 3.9760 | |
|
| 3.8586 | 0.03 | 2289587 | 3.9746 | |
|
| 3.8518 | 1.03 | 2365907 | 3.9738 | |
|
| 3.85 | 0.03 | 2442227 | 3.9729 | |
|
| 3.8407 | 1.03 | 2518547 | 3.9720 | |
|
| 3.8388 | 0.03 | 2594867 | 3.9711 | |
|
| 3.8321 | 1.03 | 2671187 | 3.9704 | |
|
| 3.8326 | 0.03 | 2747507 | 3.9700 | |
|
| 3.8354 | 0.03 | 2823827 | 3.9696 | |
|
| 3.8349 | 1.03 | 2900147 | 3.9691 | |
|
| 3.8397 | 0.03 | 2976467 | 3.9687 | |
|
| 3.8387 | 0.02 | 3052726 | 3.9685 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.33.3 |
|
- Pytorch 2.0.1 |
|
- Datasets 2.12.0 |
|
- Tokenizers 0.13.3 |
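
To sanity-check a local environment against these versions, a minimal sketch (note that the PyTorch version string may carry a build suffix such as a CUDA tag, which is not recorded here):

```python
import datasets
import tokenizers
import torch
import transformers

# Compare installed versions with those used for training.
assert transformers.__version__ == "4.33.3"
assert torch.__version__.startswith("2.0.1")
assert datasets.__version__ == "2.12.0"
assert tokenizers.__version__ == "0.13.3"
```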
|
|