twitter-roberta-base-sentiment-earthquake

This is an "extension" of the twitter-roberta-base-sentiment-latest model, further fine-tuned on original English-language Twitter data posted about the 10th anniversary of the 2010 Haiti Earthquake.

Full classification example

from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np

class_mapping = {0: "Negative", 1: "Neutral", 2: "Positive"}

MODEL = "antypasd/twitter-roberta-base-sentiment-earthquake"

tokenizer = AutoTokenizer.from_pretrained(MODEL)

# PyTorch
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.save_pretrained(MODEL)  # optional: save a local copy of the model

text = "$202 million of $1.14 billion in United States (US) recovery aid went to a new 'industrial park' in Caracol, an area unaffected by the Haiti earthquake. The plan was to invite foreign garment companies to take advantage of extremely low-wage labor"
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
prediction = np.argmax(scores)


# # TensorFlow (alternative)
# model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
# model.save_pretrained(MODEL)

# encoded_input = tokenizer(text, return_tensors='tf')
# output = model(encoded_input)
# scores = output[0][0].numpy()
# prediction = np.argmax(scores)

# Print the predicted label
print(class_mapping[prediction])

Output:

Negative
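
To get class probabilities instead of only the top label, you can apply a softmax to the raw logits in scores. The following is a minimal sketch that extends the PyTorch example above; using scipy.special.softmax is an assumption here, and any softmax implementation would work equally well.

from scipy.special import softmax

# Assumption: 'scores' and 'class_mapping' come from the PyTorch example above.
# Convert raw logits into probabilities and rank the classes from most to least likely.
probabilities = softmax(scores)
ranking = np.argsort(probabilities)[::-1]
for class_id in ranking:
    print(f"{class_mapping[class_id]}: {probabilities[class_id]:.4f}")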