---
language:
- it
pipeline_tag: translation
---

To initialize the model:

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model = MBartForConditionalGeneration.from_pretrained(
    "facebook/mbart-large-50", output_hidden_states=True
)
```

To generate text using the model:

```python
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50", src_lang="it_IT", tgt_lang="it_IT"
)

# Tokenize the source sentence (and the target reference, if available).
inputs = tokenizer(
    "I was here yesterday to studying",
    text_target="I was here yesterday to study",
    return_tensors="pt",
)

# Generate with beam search, forcing Italian as the target language.
max_length = 100  # example value; adjust to your use case
output = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    max_length=max_length,
    num_beams=4,
    forced_bos_token_id=tokenizer.lang_code_to_id["it_IT"],
)
```
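
To turn the generated token IDs back into text, you can decode them with the same tokenizer. A minimal sketch, reusing the `tokenizer` and `output` variables from the snippet above:

```python
# Decode the generated token IDs into plain text.
translation = tokenizer.batch_decode(output, skip_special_tokens=True)
print(translation)
```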