# Regression_albert_5
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the training and evaluation sets (a short sketch of how the regression metrics are computed follows the list):
- Train Loss: 0.1548
- Train Mae: 0.2765
- Train Mse: 0.1336
- Train R2-score: 0.7547
- Train Accuracy: 0.7462
- Validation Loss: 0.1908
- Validation Mae: 0.3787
- Validation Mse: 0.1894
- Validation R2-score: 0.8458
- Validation Accuracy: 0.4595
- Epoch: 9
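For reference, the MAE, MSE, and R2-score reported above are standard regression metrics. The sketch below shows how they can be computed from raw predictions with plain NumPy; the function and variable names are illustrative only, since the training script's own metric code is not included in this card.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """MAE, MSE, and R^2 for a 1-D regression target."""
    y_true = np.asarray(y_true, dtype=np.float64)
    y_pred = np.asarray(y_pred, dtype=np.float64)
    mae = np.mean(np.abs(y_true - y_pred))
    mse = np.mean((y_true - y_pred) ** 2)
    ss_res = np.sum((y_true - y_pred) ** 2)         # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return {"mae": float(mae), "mse": float(mse), "r2": float(r2)}
```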
## Model description
More information needed
## Intended uses & limitations
More information needed
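No usage guidance is provided yet. As a placeholder, here is a minimal inference sketch, assuming the checkpoint exposes a standard single-output `transformers` sequence-classification (regression) head; the repo id and example text are hypothetical and should be replaced with the actual model path.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Hypothetical model id; replace with the actual Hub repo or local path.
model_id = "Regression_albert_5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Example input text", return_tensors="tf", truncation=True)
outputs = model(**inputs)

# With a single-label regression head, the raw logit is the predicted score.
score = float(tf.squeeze(outputs.logits))
print(score)
```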
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an equivalent Keras optimizer setup is sketched after this list):
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 2e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
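For readability, the serialized optimizer configuration above corresponds roughly to the following Keras setup. This is a sketch only; it assumes the non-legacy Adam optimizer shipped with TensorFlow 2.11 and leaves every option not listed above at its default.

```python
import tensorflow as tf

# Adam with the hyperparameters listed above; weight decay, gradient
# clipping, and EMA are left disabled, matching the config dictionary.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=2e-5,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    jit_compile=True,
)
```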
### Training results
| Train Loss | Train Mae | Train Mse | Train R2-score | Train Accuracy | Validation Loss | Validation Mae | Validation Mse | Validation R2-score | Validation Accuracy | Epoch |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.5723 | 0.3984 | 0.2343 | 0.4755 | 0.5923 | 0.1856 | 0.3686 | 0.1843 | 0.8559 | 0.4324 | 0 |
| 0.1822 | 0.2906 | 0.1403 | 0.7246 | 0.6538 | 0.1577 | 0.3485 | 0.1561 | 0.8714 | 0.9459 | 1 |
| 0.1765 | 0.2865 | 0.1376 | 0.6770 | 0.6538 | 0.1356 | 0.3325 | 0.1337 | 0.8808 | 0.9459 | 2 |
| 0.1959 | 0.2945 | 0.1383 | 0.6806 | 0.7308 | 0.2115 | 0.4054 | 0.2104 | 0.8366 | 0.3243 | 3 |
| 0.1698 | 0.2906 | 0.1408 | 0.7195 | 0.6231 | 0.1489 | 0.3371 | 0.1472 | 0.8726 | 0.9459 | 4 |
| 0.2081 | 0.2687 | 0.1178 | 0.7632 | 0.8385 | 0.2547 | 0.4572 | 0.2539 | 0.8046 | 0.3243 | 5 |
| 0.1806 | 0.3087 | 0.1554 | 0.7168 | 0.6462 | 0.1477 | 0.3401 | 0.1460 | 0.8757 | 0.9459 | 6 |
| 0.1910 | 0.3102 | 0.1559 | 0.7295 | 0.6308 | 0.1726 | 0.3544 | 0.1711 | 0.8602 | 0.8919 | 7 |
| 0.1697 | 0.2609 | 0.1132 | 0.7876 | 0.8538 | 0.1856 | 0.3694 | 0.1843 | 0.8537 | 0.5946 | 8 |
| 0.1548 | 0.2765 | 0.1336 | 0.7547 | 0.7462 | 0.1908 | 0.3787 | 0.1894 | 0.8458 | 0.4595 | 9 |
### Framework versions
- Transformers 4.27.2
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2