# superlative-quantifier-lstm-2
This model is a fine-tuned version of an unspecified base model, trained on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 3.9856
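If the reported loss is the standard per-token cross-entropy in nats (as is typical for Transformers `Trainer` language-modeling runs), it corresponds to a perplexity of roughly exp(3.9856) ≈ 53.8. A minimal sketch of that conversion, under this assumption:

```python
import math

# Assuming the evaluation loss is average cross-entropy per token in nats,
# perplexity is simply exp(loss).
eval_loss = 3.9856
perplexity = math.exp(eval_loss)  # ≈ 53.8
print(f"Perplexity: {perplexity:.1f}")
```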
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a sketch of how they might map onto `TrainingArguments` follows the list:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 2
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3052726
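The exact training script for this model is not documented here. The sketch below is a hypothetical mapping of the hyperparameters listed above onto the 🤗 Transformers `TrainingArguments` API; the output directory is a placeholder.

```python
from transformers import TrainingArguments

# Hypothetical configuration reproducing the listed hyperparameters;
# the actual training script for this model is not part of this card.
training_args = TrainingArguments(
    output_dir="superlative-quantifier-lstm-2",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=2,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=3_052_726,
)
```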
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-------:|:---------------:|
4.7739 | 0.03 | 76320 | 4.7591 |
4.4948 | 1.03 | 152640 | 4.4813 |
4.3521 | 0.03 | 228960 | 4.3474 |
4.2673 | 1.03 | 305280 | 4.2657 |
4.2044 | 0.03 | 381600 | 4.2106 |
4.1584 | 1.03 | 457920 | 4.1697 |
4.1237 | 0.03 | 534240 | 4.1397 |
4.0937 | 1.03 | 610560 | 4.1153 |
4.0646 | 0.03 | 686880 | 4.0961 |
4.0397 | 1.03 | 763200 | 4.0802 |
4.0188 | 0.03 | 839520 | 4.0677 |
3.9979 | 1.03 | 915840 | 4.0576 |
3.9875 | 0.03 | 992160 | 4.0478 |
3.9694 | 1.03 | 1068480 | 4.0406 |
3.9548 | 0.03 | 1144800 | 4.0342 |
3.9407 | 0.03 | 1221120 | 4.0289 |
3.9259 | 1.03 | 1297440 | 4.0236 |
3.9208 | 2.03 | 1373760 | 4.0188 |
3.9086 | 0.03 | 1450080 | 4.0158 |
3.9049 | 1.03 | 1526400 | 4.0122 |
3.9024 | 2.03 | 1602720 | 4.0090 |
3.8964 | 0.03 | 1679040 | 4.0069 |
3.8947 | 0.03 | 1755360 | 4.0043 |
3.8923 | 0.03 | 1831680 | 4.0024 |
3.8834 | 0.03 | 1908000 | 4.0003 |
3.8735 | 1.03 | 1984320 | 3.9990 |
3.8695 | 0.03 | 2060640 | 3.9974 |
3.8628 | 1.03 | 2136960 | 3.9960 |
3.8635 | 2.03 | 2213280 | 3.9948 |
3.8567 | 0.03 | 2289600 | 3.9936 |
3.8478 | 1.03 | 2365920 | 3.9924 |
3.8426 | 2.03 | 2442240 | 3.9914 |
3.8355 | 0.03 | 2518560 | 3.9904 |
3.8351 | 0.03 | 2594880 | 3.9893 |
3.8282 | 1.03 | 2671200 | 3.9884 |
3.8322 | 2.03 | 2747520 | 3.9875 |
3.8311 | 0.03 | 2823840 | 3.9872 |
3.8322 | 0.03 | 2900160 | 3.9865 |
3.8356 | 1.03 | 2976480 | 3.9860 |
3.8367 | 2.02 | 3052726 | 3.9856 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3
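A quick way to confirm that a local environment matches the versions listed above (purely a convenience check, not part of the original card):

```python
import transformers, torch, datasets, tokenizers

# Print installed versions and compare against the pinned versions above.
print(transformers.__version__)  # expected 4.33.3
print(torch.__version__)         # expected 2.0.1
print(datasets.__version__)      # expected 2.12.0
print(tokenizers.__version__)    # expected 0.13.3
```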