mBART is a multilingual sequence-to-sequence model from Facebook AI, pre-trained to denoise text in multiple languages simultaneously using the BART objective.

The checkpoint in this repository was obtained by fine-tuning facebook/mbart-large-cc25 on all samples (~260K) of the Bhasha (pib_v1.3) Hindi-English parallel corpus. It gives decent results for Hindi-to-English translation.

