---
license: cc-by-4.0
datasets:
- dell-research-harvard/newswire
language:
- en
pipeline_tag: text-classification
tags:
- distilroberta
- topic
- news
widget:
- text: "Diplomatic efforts to deal with the world’s two wars — the civil war in Spain and the undeclared Chinese - Japanese conflict — received sharp setbacks today."
- text: "WASHINGTON. AP. A decisive development appeared in the offing in the tug-of-war between the federal government and the states over the financing of relief."
- text: "A frantic bride called the Rochester Gas and Electric corporation to complain that her new refrigerator “freezes ice cubes too fast.”"
---

# Fine-tuned DistilRoBERTa-base for detecting news on politics

# Model Description

This model is a fine-tuned DistilRoBERTa-base for classifying whether news articles are about politics. It was fine-tuned on articles from the [Newswire dataset](https://huggingface.co/datasets/dell-research-harvard/newswire).

# How to Use

```python
from transformers import pipeline

# Load the fine-tuned topic classifier from the Hugging Face Hub.
classifier = pipeline("text-classification", model="dell-research-harvard/topic-politics")

# Returns a list with one {"label": ..., "score": ...} dict per input text.
classifier("Kennedy wins election")
```
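
The pipeline also accepts a list of texts and returns one prediction per input. Below is a minimal sketch using the widget examples above; the exact label strings are defined in the model's `config.json`, so read them from the output rather than hard-coding them.

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="dell-research-harvard/topic-politics")

# Classify several articles in one call; each result is a dict with "label" and "score".
articles = [
    "WASHINGTON. AP. A decisive development appeared in the offing in the tug-of-war "
    "between the federal government and the states over the financing of relief.",
    "A frantic bride called the Rochester Gas and Electric corporation to complain "
    "that her new refrigerator freezes ice cubes too fast.",
]

for article, result in zip(articles, classifier(articles)):
    # Label names come from the model's config; print them alongside the confidence score.
    print(f"{result['label']} ({result['score']:.2f}) :: {article[:60]}...")
```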

# Contact

# Reference