# badokorach/distilbert-base-cased-distilled-agric-060124
This model is a fine-tuned version of badokorach/distilbert-base-cased-distilled-agric-1831223 on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.1227
- Validation Loss: 0.0
- Epoch: 19
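For quick experimentation, the checkpoint can be loaded through the 🤗 Transformers pipeline API. This is a minimal sketch that assumes the model carries an extractive question-answering head (suggested by the distilbert-base-cased-distilled lineage in the model name, which points at distilbert-base-cased-distilled-squad); the example question and context are illustrative only and should be replaced with real inputs.

```python
# Minimal usage sketch. Assumption: this is an extractive QA model;
# verify the task before relying on this. Question/context are made up.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="badokorach/distilbert-base-cased-distilled-agric-060124",
    framework="tf",  # the checkpoint was trained with TensorFlow/Keras
)

result = qa(
    question="What deficiency causes yellowing of older leaves?",
    context=(
        "Yellowing of older leaves in maize is commonly a sign of "
        "nitrogen deficiency, especially on sandy soils."
    ),
)
print(result)  # dict with 'score', 'start', 'end', 'answer'
```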
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning_rate: PolynomialDecay(initial_learning_rate=1e-05, decay_steps=2320, end_learning_rate=0.0, power=1.0, cycle=False)
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-08
  - amsgrad: False
  - weight_decay_rate: 0.02
- training_precision: mixed_float16
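This optimizer configuration matches what `transformers.create_optimizer` produces for a linear (power=1.0) polynomial decay with decoupled weight decay. A minimal sketch of recreating the setup, using only the values reported above (1e-05 initial learning rate, 2320 decay steps, 0.02 weight decay rate, no warmup appearing in the dump):

```python
# Sketch of recreating the optimizer and precision policy above.
# All values are taken from the hyperparameter dump in this card.
import tensorflow as tf
from transformers import create_optimizer

# Matches the reported training_precision of mixed_float16.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

optimizer, lr_schedule = create_optimizer(
    init_lr=1e-5,            # initial_learning_rate
    num_train_steps=2320,    # decay_steps from the PolynomialDecay config
    num_warmup_steps=0,      # no warmup appears in the dump
    weight_decay_rate=0.02,
    power=1.0,               # linear decay to end_learning_rate=0.0
)
# optimizer can then be passed to model.compile(optimizer=optimizer, ...)
```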
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.8713     | 0.0             | 0     |
| 0.6010     | 0.0             | 1     |
| 0.4727     | 0.0             | 2     |
| 0.4142     | 0.0             | 3     |
| 0.3676     | 0.0             | 4     |
| 0.2935     | 0.0             | 5     |
| 0.2651     | 0.0             | 6     |
| 0.2458     | 0.0             | 7     |
| 0.2323     | 0.0             | 8     |
| 0.2105     | 0.0             | 9     |
| 0.1821     | 0.0             | 10    |
| 0.1866     | 0.0             | 11    |
| 0.1755     | 0.0             | 12    |
| 0.1666     | 0.0             | 13    |
| 0.1493     | 0.0             | 14    |
| 0.1430     | 0.0             | 15    |
| 0.1415     | 0.0             | 16    |
| 0.1307     | 0.0             | 17    |
| 0.1224     | 0.0             | 18    |
| 0.1227     | 0.0             | 19    |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.0
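To reproduce the environment, the versions above can be pinned directly at install time; a minimal install command using exactly those pins:

```bash
pip install transformers==4.35.2 tensorflow==2.15.0 datasets==2.16.1 tokenizers==0.15.0
```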