Model Card for mtc/mbert-absinth-3-epochs

This model is a fine-tuned version of google-bert/bert-base-multilingual-cased. It was fine-tuned on the Absinth dataset to predict, for a given German news article and a summary sentence, a label indicating whether the sentence is faithful to the article.

Installation

Install the necessary packages (the usage example below also requires PyTorch):

pip3 install transformers torch

Usage

Below is a minimal code snippet that runs classification for a given article and its summary sentences. For each sentence, the model outputs one of three labels: Faithful, Intrinsic Hallucination, or Extrinsic Hallucination. For more information about the labels, see the Absinth dataset description.

from typing import List, Dict
import torch
from transformers import pipeline

def generate_prompts_for_classification(article: str, summary_sentences: List[str]) -> List[Dict]:
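    # Build one text/text_pair dict per summary sentence: the article is the
    # first segment, the sentence to check is the second.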
    prompts = []
    for sentence in summary_sentences:
        prompt = {"text": article, "text_pair": sentence}
        prompts.append(prompt)
    return prompts

def predict_with_hf_classification_pipeline(prompts: List[Dict], model_name: str, max_context_length: int = 512,
                                            batch_size: int = 2) -> List[str]:
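    # Run the Hugging Face text-classification pipeline on all article/sentence
    # pairs and return the predicted label for each pair.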
    device = "cuda" if torch.cuda.is_available() else "cpu"
    text_classification_pipeline = pipeline("text-classification", model=model_name, device=device,
                                            batch_size=batch_size)

    batch_output = text_classification_pipeline(prompts, truncation=True, max_length=max_context_length)
    predictions = [result['label'] for result in batch_output]
    return predictions

def main():

    model_name = "mtc/mbert-absinth-3-epochs"
    # Articles longer than 512 tokens will be truncated
    max_context_length = 512
    # Adjust batch_size according to your local GPU memory
    batch_size = 2

    article = "Ein neuer Zirkus ist gestern in Zürich angekommen. Viele Familien besuchten das grosse Zelt, um die Vorstellung zu sehen. Es gab Akrobaten, Clowns und Tiere, die das Publikum begeisterten. Der Zirkus bleibt noch eine Woche in der Stadt und bietet täglich Vorstellungen an."

    summary_sentences = [
        "Ein Zirkus ist in Basel angekommen.",
        "Der Zirkus, der in 1950 gegründet wurde, wird von vielen Familien besucht."]

    prompts = generate_prompts_for_classification(article=article, summary_sentences=summary_sentences)
    predictions = predict_with_hf_classification_pipeline(prompts=prompts, model_name=model_name,
                                                          max_context_length=max_context_length,
                                                          batch_size=batch_size)
    print(predictions)


if __name__ == '__main__':
    main()

When executing the code above, the output should look like this:

['Intrinsic Hallucination', 'Extrinsic Hallucination']
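If you need the per-label scores rather than only the top label, you can also run the model without the pipeline helper. The snippet below is a minimal sketch using AutoTokenizer and AutoModelForSequenceClassification; it assumes the checkpoint's config carries the id2label mapping (the same mapping the pipeline relies on).

from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

model_name = "mtc/mbert-absinth-3-epochs"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

article = "Ein neuer Zirkus ist gestern in Zürich angekommen."
summary_sentence = "Ein Zirkus ist in Basel angekommen."

# Encode the article/sentence pair the same way the pipeline does: the article
# as the first segment, the summary sentence as the second, truncated to 512 tokens.
inputs = tokenizer(article, summary_sentence, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the three classes; the label names come from the model config.
probs = torch.softmax(logits, dim=-1)[0]
for label_id, prob in enumerate(probs.tolist()):
    print(f"{model.config.id2label[label_id]}: {prob:.3f}")

Since this is the same article/sentence pair as in the pipeline example, the highest-scoring label should again be Intrinsic Hallucination.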
Model Details

Model size: 178M parameters · Tensor type: F32 (Safetensors)