# mBART-Large multilingual detoxification model
This is a detoxification model trained on the released parallel corpus (dev part) of toxic texts, MultiParaDetox.
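A minimal sketch of how the training data can be inspected, assuming the corpus is published as the `textdetox/multilingual_paradetox` dataset on the Hugging Face Hub (the dataset id and split layout are assumptions; check the dataset card):

```python
# Hedged sketch: the dataset id and split layout are assumptions.
from datasets import load_dataset

corpus = load_dataset("textdetox/multilingual_paradetox")
print(corpus)  # per-language subsets of toxic/detoxified sentence pairs
```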
## Model Details
The base model for this fine-tune is [facebook/mbart-large-50](https://huggingface.co/facebook/mbart-large-50).
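A minimal usage sketch, assuming the checkpoint follows the standard mBART-50 seq2seq interface (the language-code handling and generation parameters below are illustrative assumptions, not part of this card):

```python
# Hedged sketch: standard mBART-50 loading; language codes and generation
# settings are assumptions, check the model repo for the intended usage.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "textdetox/mbart-detox-baseline"
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

text = "..."  # a toxic input sentence
tokenizer.src_lang = "en_XX"  # mBART-50 language code for English
inputs = tokenizer(text, return_tensors="pt")

outputs = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
    max_length=128,
    num_beams=4,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```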
The model shows the following metrics on the test set:
| Language  | STA  | SIM  | CHRF | J    |
|-----------|------|------|------|------|
| Amharic   | 0.51 | 0.91 | 0.41 | 0.20 |
| Arabic    | 0.56 | 0.95 | 0.74 | 0.40 |
| Chinese   | 0.17 | 0.96 | 0.43 | 0.07 |
| English   | 0.49 | 0.93 | 0.70 | 0.34 |
| German    | 0.53 | 0.97 | 0.79 | 0.41 |
| Hindi     | 0.23 | 0.94 | 0.70 | 0.17 |
| Russian   | 0.45 | 0.94 | 0.71 | 0.32 |
| Spanish   | 0.47 | 0.93 | 0.64 | 0.29 |
| Ukrainian | 0.46 | 0.94 | 0.75 | 0.35 |
- **STA**: style accuracy
- **SIM**: content similarity
- **CHRF**: fluency
- **J**: joint score

For more details about the metrics and data, refer to the shared task page and the papers mentioned in the Citation section.
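For reference, a joint score of this kind is commonly computed as the per-sentence product of the three metrics, averaged over the test set. A minimal sketch under that assumption (the exact formulation is defined on the shared task page):

```python
import numpy as np

# Assumed formulation: J = mean over sentences of STA * SIM * fluency.
def joint_score(sta, sim, fluency):
    sta, sim, fluency = (np.asarray(x, dtype=float) for x in (sta, sim, fluency))
    return float(np.mean(sta * sim * fluency))

# e.g. joint_score([1, 0, 1], [0.9, 0.8, 0.95], [0.7, 0.6, 0.8]) -> 0.46
```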
## Citation
The model was developed as a baseline for the TextDetox CLEF-2024 shared task.
If you would like to acknowledge our work, please cite the following manuscripts:
@inproceedings{dementieva2024overview,
  title        = {Overview of the Multilingual Text Detoxification Task at PAN 2024},
  author       = {Dementieva, Daryna and Moskovskiy, Daniil and Babakov, Nikolay and Ayele, Abinew Ali and Rizwan, Naquee and Schneider, Florian and Wang, Xintong and Yimam, Seid Muhie and Ustalov, Dmitry and Stakovskii, Elisei and Smirnova, Alisa and Elnagar, Ashraf and Mukherjee, Animesh and Panchenko, Alexander},
  booktitle    = {Working Notes of CLEF 2024 - Conference and Labs of the Evaluation Forum},
  editor       = {Guglielmo Faggioli and Nicola Ferro and Petra Galu{\v{s}}{\v{c}}{\'a}kov{\'a} and Alba Garc{\'i}a Seco de Herrera},
  year         = {2024},
  organization = {CEUR-WS.org}
}
@inproceedings{DBLP:conf/ecir/BevendorffCCDEFFKMMPPRRSSSTUWZ24,
author = {Janek Bevendorff and
Xavier Bonet Casals and
Berta Chulvi and
Daryna Dementieva and
Ashraf Elnagar and
Dayne Freitag and
Maik Fr{\"{o}}be and
Damir Korencic and
Maximilian Mayerl and
Animesh Mukherjee and
Alexander Panchenko and
Martin Potthast and
Francisco Rangel and
Paolo Rosso and
Alisa Smirnova and
Efstathios Stamatatos and
Benno Stein and
Mariona Taul{\'{e}} and
Dmitry Ustalov and
Matti Wiegmann and
Eva Zangerle},
editor = {Nazli Goharian and
Nicola Tonellotto and
Yulan He and
Aldo Lipani and
Graham McDonald and
Craig Macdonald and
Iadh Ounis},
title = {Overview of {PAN} 2024: Multi-author Writing Style Analysis, Multilingual
Text Detoxification, Oppositional Thinking Analysis, and Generative
{AI} Authorship Verification - Extended Abstract},
booktitle = {Advances in Information Retrieval - 46th European Conference on Information
Retrieval, {ECIR} 2024, Glasgow, UK, March 24-28, 2024, Proceedings,
Part {VI}},
series = {Lecture Notes in Computer Science},
volume = {14613},
pages = {3--10},
publisher = {Springer},
year = {2024},
url = {https://doi.org/10.1007/978-3-031-56072-9\_1},
doi = {10.1007/978-3-031-56072-9\_1},
timestamp = {Fri, 29 Mar 2024 23:01:36 +0100},
biburl = {https://dblp.org/rec/conf/ecir/BevendorffCCDEFFKMMPPRRSSSTUWZ24.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}