# train_3
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.1216
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 96
- eval_batch_size: 96
- seed: 42
- optimizer: ADAMW_APEX_FUSED (fused AdamW via NVIDIA Apex) with betas=(0.826646043090655, 0.991636944120939) and epsilon=3.4341677539323e-07; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 5000
- num_epochs: 200
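For reference, these settings correspond roughly to the following `transformers.TrainingArguments` configuration (a minimal sketch; the model, dataset, and data collator passed to the `Trainer` are not documented in this card, and the output directory name is a placeholder):

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the hyperparameters above.
training_args = TrainingArguments(
    output_dir="train_3",                # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=96,
    per_device_eval_batch_size=96,
    seed=42,
    optim="adamw_apex_fused",            # requires NVIDIA Apex to be installed
    adam_beta1=0.826646043090655,
    adam_beta2=0.991636944120939,
    adam_epsilon=3.4341677539323e-07,
    lr_scheduler_type="linear",
    warmup_steps=5000,
    num_train_epochs=200,
    eval_strategy="epoch",               # assumption: the results table reports per-epoch validation loss
)
```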
### Training results
| Training Loss | Epoch | Step   | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 0.0354        | 1.0   | 13725  | 0.1157          |
| 0.0345        | 2.0   | 27450  | 0.1158          |
| 0.0332        | 3.0   | 41175  | 0.1191          |
| 0.0322        | 4.0   | 54900  | 0.1190          |
| 0.0333        | 5.0   | 68625  | 0.1163          |
| 0.0323        | 6.0   | 82350  | 0.1172          |
| 0.0317        | 7.0   | 96075  | 0.1170          |
| 0.0306        | 8.0   | 109800 | 0.1182          |
| 0.0295        | 9.0   | 123525 | 0.1188          |
| 0.029         | 10.0  | 137250 | 0.1213          |
| 0.0286        | 11.0  | 150975 | 0.1197          |
| 0.0278        | 12.0  | 164700 | 0.1201          |
| 0.0272        | 13.0  | 178425 | 0.1212          |
| 0.0266        | 14.0  | 192150 | 0.1208          |
| 0.0266        | 15.0  | 205875 | 0.1208          |
| 0.0259        | 16.0  | 219600 | 0.1216          |
### Framework versions
- Transformers 4.50.3
- Pytorch 2.6.0+cu126
- Datasets 3.3.0
- Tokenizers 0.21.1
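With the Transformers version listed above, the checkpoint can be loaded in the usual way. A minimal sketch follows; the architecture, task head, and tokenizer type are not documented in this card, so the generic `Auto` classes and the path `train_3` are used as placeholders:

```python
from transformers import AutoModel, AutoTokenizer

# Placeholder path / repo id; the actual location of this checkpoint
# is not given in the card.
checkpoint = "train_3"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

# Inspect which architecture was actually saved with the checkpoint.
print(model.config.architectures)
```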