# bert-base-uncased-finetuned-negation_scope
This model is a fine-tuned version of bert-base-uncased on the *SEM 2012 Shared Task corpus (cd-sco). It achieves the following results on the evaluation set (a note on the span scores follows the list):
- Loss: 0.0618
- Token Precision: 0.9190
- Token Recall: 0.8868
- Token F1: 0.9026
- Span Precision: 0.6250
- Span Recall: 0.6250
- Span F1: 0.6250
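Token-level scores are computed per token over the scope labels, while span-level scores credit a prediction only when the predicted scope matches the gold scope exactly. A minimal sketch of exact-match span scoring, assuming binary labels with 1 marking in-scope tokens (the function names and matching criterion are illustrative, not taken from the evaluation code):

```python
from typing import List, Set, Tuple

def spans_from_labels(labels: List[int]) -> Set[Tuple[int, int]]:
    """Collect (start, end) pairs of contiguous in-scope (label == 1) runs."""
    spans, start = set(), None
    for i, lab in enumerate(labels + [0]):  # sentinel 0 closes a trailing run
        if lab == 1 and start is None:
            start = i
        elif lab != 1 and start is not None:
            spans.add((start, i))
            start = None
    return spans

def span_prf(gold: List[int], pred: List[int]) -> Tuple[float, float, float]:
    """Exact-match span precision, recall, and F1 for one sentence."""
    g, p = spans_from_labels(gold), spans_from_labels(pred)
    tp = len(g & p)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(g) if g else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1
```

When every evaluation instance contributes exactly one gold and one predicted scope, precision, recall, and F1 coincide, which would explain the identical span values reported above.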
## Model description
We follow the Augment method described in NegBERT (Khandelwal and Sawant, 2020): a special token ([NEG]) is inserted immediately before the predicate:
This is [NEG] not a sentence.
Note that the special token and the predicate are treated as a single word. That is, the word-level segmentation of the sentence becomes
'This' 'is' '[NEG] not' 'a' 'sentence' '.'
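A minimal sketch of this augmentation step with the Hugging Face tokenizer (the `augment` helper and the cue-index convention are illustrative):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# Register [NEG] so the tokenizer keeps it as one piece instead of splitting it.
tokenizer.add_special_tokens({"additional_special_tokens": ["[NEG]"]})

def augment(words, cue_index):
    """Prepend [NEG] to the cue word so the pair is handled as one word."""
    words = list(words)
    words[cue_index] = "[NEG] " + words[cue_index]
    return words

words = augment(["This", "is", "not", "a", "sentence", "."], cue_index=2)
# -> ['This', 'is', '[NEG] not', 'a', 'sentence', '.']
encoding = tokenizer(words, is_split_into_words=True)
```

After registering the extra token, the embedding matrix has to be resized before training, e.g. `model.resize_token_embeddings(len(tokenizer))`.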
## Intended uses & limitations
See details at https://github.com/dannashao/portfolio-NLP/blob/main/NEG/Fine%20tune%20BERT.ipynb
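A hedged inference sketch (the label semantics are an assumption; check `config.id2label` on the hub):

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

name = "dannashao/bert-base-uncased-finetuned-negation_scope"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(name)

# Mark the cue with [NEG], exactly as during training.
words = ["This", "is", "[NEG] not", "a", "sentence", "."]
enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits
pred = logits.argmax(-1).squeeze(0).tolist()

# Map subword predictions back to words (first-subword strategy).
seen = set()
for wid, p in zip(enc.word_ids(0), pred):
    if wid is not None and wid not in seen:
        seen.add(wid)
        print(words[wid], model.config.id2label.get(p, p))
```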
## Training and evaluation data
See details at https://www.clips.ua.ac.be/sem2012-st-neg/
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a TrainingArguments sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
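These values map directly onto Hugging Face TrainingArguments; a minimal sketch, assuming per-epoch evaluation to match the results table below (the output directory is illustrative):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-negation_scope",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: evaluate once per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's
# default AdamW settings, so no explicit optimizer config is needed.
```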
### Training results
| Training Loss | Epoch | Step | Validation Loss | Token Precision | Token Recall | Token F1 | Span Precision | Span Recall | Span F1 |
|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 237 | 0.0624 | 0.9121 | 0.8368 | 0.8728 | 0.5207 | 0.5207 | 0.5207 |
| No log | 2.0 | 474 | 0.0682 | 0.9366 | 0.8311 | 0.8807 | 0.6012 | 0.6012 | 0.6012 |
| 0.0722 | 3.0 | 711 | 0.0618 | 0.9190 | 0.8868 | 0.9026 | 0.6250 | 0.6250 | 0.6250 |
### Framework versions
- Transformers 4.37.0
- Pytorch 2.0.1+cu117
- Datasets 2.16.1
- Tokenizers 0.15.1