# distilbert-base-uncased-distilled-clinc
This model is a fine-tuned version of distilbert-base-uncased on the clinc_oos dataset. It achieves the following results on the evaluation set (a quick usage sketch follows the metrics):
- Loss: 0.2649
- Accuracy: 0.9506
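For illustration, here is a minimal inference sketch using the 🤗 Transformers `pipeline` API. The repo id below is a placeholder for wherever this checkpoint is hosted, and the example utterance is hypothetical:

```python
from transformers import pipeline

# Placeholder repo id; replace with the actual Hub path of this checkpoint.
model_id = "distilbert-base-uncased-distilled-clinc"

# A DistilBERT sequence-classification checkpoint maps to the
# text-classification pipeline task.
classifier = pipeline("text-classification", model=model_id)

# Returns the predicted CLINC intent label and its score.
print(classifier("Hey, set up a meeting with Sarah for next Tuesday at 2pm."))
```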
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
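While the data details are not documented here, the clinc_oos dataset named at the top of the card is available through 🤗 Datasets. A minimal loading sketch follows; the `plus` configuration is an assumption (the dataset also ships `small` and `imbalanced` variants):

```python
from datasets import load_dataset

# "plus" is an assumed configuration choice; clinc_oos also provides
# "small" and "imbalanced" configurations.
clinc = load_dataset("clinc_oos", "plus")

print(clinc)              # train / validation / test splits
print(clinc["train"][0])  # e.g. {'text': '...', 'intent': ...}
```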
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
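For reference, here is a `TrainingArguments` sketch that mirrors the list above. `output_dir` is a placeholder, the batch sizes are assumed to be per device, and per-epoch evaluation is inferred from the epoch-level results below; the Adam betas and epsilon listed above are the Trainer defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-distilled-clinc",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=48,  # assumed per-device
    per_device_eval_batch_size=48,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    evaluation_strategy="epoch",     # inferred from the per-epoch results
)
```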
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.3973        | 1.0   | 318  | 1.7013          | 0.7455   |
| 1.2921        | 2.0   | 636  | 0.8241          | 0.8703   |
| 0.6498        | 3.0   | 954  | 0.4660          | 0.9232   |
| 0.3856        | 4.0   | 1272 | 0.3484          | 0.9387   |
| 0.2824        | 5.0   | 1590 | 0.3083          | 0.9468   |
| 0.2418        | 6.0   | 1908 | 0.2914          | 0.9497   |
| 0.2213        | 7.0   | 2226 | 0.2807          | 0.9494   |
| 0.2098        | 8.0   | 2544 | 0.2763          | 0.9510   |
| 0.2027        | 9.0   | 2862 | 0.2727          | 0.9497   |
| 0.1986        | 10.0  | 3180 | 0.2718          | 0.9490   |
| 0.1944        | 11.0  | 3498 | 0.2673          | 0.9519   |
| 0.1922        | 12.0  | 3816 | 0.2660          | 0.9500   |
| 0.1903        | 13.0  | 4134 | 0.2662          | 0.9506   |
| 0.1888        | 14.0  | 4452 | 0.2652          | 0.9510   |
| 0.1885        | 15.0  | 4770 | 0.2649          | 0.9506   |
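The accuracy column is consistent with the standard evaluation loop plus an accuracy metric. Below is a sketch of such a `compute_metrics` function, using the `load_metric` API present in the Datasets 1.16.1 release listed under framework versions; this is an assumed setup, not one documented by the card:

```python
import numpy as np
from datasets import load_metric

accuracy_metric = load_metric("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes a (logits, labels) tuple at evaluation time.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy_metric.compute(predictions=predictions, references=labels)
```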
### Framework versions
- Transformers 4.16.2
- Pytorch 2.4.1+cu121
- Datasets 1.16.1
- Tokenizers 0.19.1