---
language:
- it
pipeline_tag: translation
---

To initialize the model:

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

# Load the pretrained mBART-50 model; output_hidden_states=True makes the
# encoder/decoder hidden states available in the model outputs
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50", output_hidden_states=True)
```

To generate text using the model:

```python
# Tokenizer configured with Italian ("it_IT") as both source and target language
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50", src_lang="it_IT", tgt_lang="it_IT")

input_ids = tokenizer.encode("Input text", return_tensors="pt")
output = model.generate(input_ids)
```
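
The `generate` call returns a tensor of token ids rather than a string. As a minimal sketch (reusing the `tokenizer` and `output` objects from the snippets above), the ids can be decoded back to readable text:

```python
# Convert the generated token ids back to text, dropping special tokens
# such as the language code and end-of-sequence markers
decoded = tokenizer.batch_decode(output, skip_special_tokens=True)
print(decoded[0])
```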