# switch-base-8-finetuned
This model is a fine-tuned version of google/switch-base-8 on the SemEval-2018 Task 2 (Multilingual Emoji Prediction) English dataset. It achieves the following results on the evaluation set:
- Accuracy: 50.174%
- Macro-F1: 36.660%
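Macro-F1 is the F1 score macro-averaged over the 20 emoji classes of the English subtask. For reference, both metrics can be computed with scikit-learn; the label lists below are purely illustrative, not the actual evaluation data:

```python
from sklearn.metrics import accuracy_score, f1_score

# Illustrative gold and predicted emoji label ids (0-19 for the English subtask).
y_true = [0, 2, 1, 0, 3]
y_pred = [0, 2, 2, 0, 3]

accuracy = accuracy_score(y_true, y_pred)
macro_f1 = f1_score(y_true, y_pred, average="macro")
print(f"Accuracy: {accuracy:.3%}  Macro-F1: {macro_f1:.3%}")
```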
## Model description
More information needed
- Model type: Language model
- Language(s) (NLP): English
- License: Apache 2.0
- Related Models: All Switch Transformers Checkpoints
- Original Checkpoints: All Original Switch Transformers Checkpoints
- Resources for more information:
## Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding `Seq2SeqTrainer` setup follows the list):
- learning_rate: 1e-4
- train_batch_size: 464
- eval_batch_size: 512
- seed: 42
- num_epochs: 30
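A minimal sketch of how these values map onto a 🤗 `Seq2SeqTrainer` setup is shown below. The use of the `tweet_eval` "emoji" configuration and the label-to-text target encoding are assumptions for illustration, not the exact pipeline used to produce this checkpoint:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Assumption: tweet_eval's "emoji" config mirrors the SemEval-2018 Task 2 English data.
raw = load_dataset("tweet_eval", "emoji")
tokenizer = AutoTokenizer.from_pretrained("google/switch-base-8")
model = AutoModelForSeq2SeqLM.from_pretrained("google/switch-base-8")

def preprocess(batch):
    # Encode tweets as inputs and (illustratively) the emoji label id as the text target.
    inputs = tokenizer(batch["text"], truncation=True, max_length=128)
    targets = tokenizer(text_target=[str(label) for label in batch["label"]],
                        truncation=True, max_length=4)
    inputs["labels"] = targets["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw["train"].column_names)

training_args = Seq2SeqTrainingArguments(
    output_dir="switch-base-8-finetuned",
    learning_rate=1e-4,
    per_device_train_batch_size=464,  # as reported above; may have been a multi-device batch size
    per_device_eval_batch_size=512,
    seed=42,
    num_train_epochs=30,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```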
## Testing results
| Model (SemEval-2018 Task 2 test set) | Accuracy | Macro-F1 |
|---|---|---|
| "Tübingen-Oslo" (first-place SemEval-2018 team) | 47.09% | 35.99% |
| switch-base-8-finetuned-SemEval-2018-emojis-cen-1 | 48.040% | 33.239% |
| switch-base-8-finetuned-SemEval-2018-emojis-cen-2 | 50.174% | 36.660% |
| switch-base-8-finetuned-SemEval-2018-emojis-IID-Fed | 50.750% | 37.355% |
A Google Colab notebook for testing the models on the SemEval test dataset: The Notebook
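Outside the notebook, the checkpoints can be exercised as standard seq2seq models with 🤗 Transformers. The sketch below uses a placeholder Hub id for whichever checkpoint in the table you want to test; the output format (a label id vs. an emoji string) depends on how the targets were encoded during fine-tuning:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder Hub id; substitute the full repository path of one of the checkpoints above.
model_id = "switch-base-8-finetuned-SemEval-2018-emojis-cen-2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

tweet = "Sunset walks on the beach never get old"
inputs = tokenizer(tweet, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))  # predicted emoji label
```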
## Framework versions
- Transformers 4.25.1
- Pytorch 1.13.1+cu116
- Tokenizers 0.13.2