
# BLEURT

PyTorch version of the original BLEURT models from the ACL paper "BLEURT: Learning Robust Metrics for Text Generation" by Thibault Sellam, Dipanjan Das, and Ankur P. Parikh of Google Research.

The model conversion code originated from this notebook, mentioned here.

## Usage Example

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("Elron/bleurt-large-512")
model = AutoModelForSequenceClassification.from_pretrained("Elron/bleurt-large-512")
model.eval()

references = ["hello world", "hello world"]
candidates = ["hi universe", "bye world"]

with torch.no_grad():
    # Tokenize each reference/candidate pair and take the single regression logit per pair.
    scores = model(**tokenizer(references, candidates, return_tensors='pt'))[0].squeeze()

print(scores)  # tensor([0.9877, 0.0475])
```
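
For larger evaluation sets or longer texts, a minimal batched variant is sketched below. It reuses the same model and tokenizer, adding padding and truncation to a 512-token limit (implied by the model name); the `bleurt_scores` helper and the batch size are illustrative choices, not part of the original card.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("Elron/bleurt-large-512")
model = AutoModelForSequenceClassification.from_pretrained("Elron/bleurt-large-512")
model.eval()

def bleurt_scores(references, candidates, batch_size=16):
    """Score reference/candidate pairs in batches (hypothetical helper for illustration)."""
    all_scores = []
    for i in range(0, len(references), batch_size):
        inputs = tokenizer(
            references[i:i + batch_size],
            candidates[i:i + batch_size],
            padding=True,      # pad pairs in the batch to a common length
            truncation=True,   # truncate long pairs to max_length
            max_length=512,    # assumed limit, matching the model name
            return_tensors="pt",
        )
        with torch.no_grad():
            logits = model(**inputs).logits  # shape: (batch, 1)
        all_scores.extend(logits.squeeze(-1).tolist())
    return all_scores

print(bleurt_scores(["hello world", "hello world"], ["hi universe", "bye world"]))
```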