---
license: mit
---

# BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese

The pre-trained model `vinai/bartpho-word-base` is the "base" variant of BARTpho-word, which uses the "base" architecture and pre-training scheme of the sequence-to-sequence denoising model BART. The general architecture and experimental results of BARTpho can be found in our paper:

@article{bartpho,
  title   = {{BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese}},
  author  = {Nguyen Luong Tran and Duong Minh Le and Dat Quoc Nguyen},
  journal = {arXiv preprint},
  volume  = {arXiv:2109.09701},
  year    = {2021}
}

Please **cite** our paper when BARTpho is used to help produce published results or is incorporated into other software.

For further information or requests, please go to [BARTpho's homepage](https://github.com/VinAIResearch/BARTpho)!
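
As a quick start, the snippet below is a minimal sketch of loading the checkpoint and extracting features. It assumes the model is loadable through the generic Hugging Face `transformers` `AutoModel`/`AutoTokenizer` classes; the example sentence is illustrative and is shown already word-segmented (BARTpho-word operates on word-segmented Vietnamese text, with the syllables of a multi-syllable word joined by "_").

```python
# Minimal usage sketch (assumptions: the checkpoint loads via the generic
# transformers Auto* classes, and the input text is already word-segmented).
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "vinai/bartpho-word-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
bartpho = AutoModel.from_pretrained(model_name)

# Illustrative, pre-segmented Vietnamese sentence ("We are researchers.").
line = "Chúng_tôi là những nghiên_cứu_viên ."
inputs = tokenizer(line, return_tensors="pt")

with torch.no_grad():
    outputs = bartpho(**inputs)

# For a BART-style seq2seq model, last_hidden_state holds the decoder's
# final hidden states; encoder states are in encoder_last_hidden_state.
print(outputs.last_hidden_state.shape)
print(outputs.encoder_last_hidden_state.shape)
```

Using `AutoModel` here only verifies that the checkpoint loads and yields contextual features; for downstream generative tasks, a conditional-generation head from `transformers` would typically be used instead, which is beyond the scope of this sketch.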