RoBERTa

This model handles the zero-width non-joiner (ZWNJ) character used in Persian writing. It was also trained on new corpora covering multiple text types, with a new vocabulary.
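As a minimal sketch, the model can be queried through the Hugging Face `transformers` fill-mask pipeline. The Persian example sentence below is illustrative only (it is not taken from this card), and the mask token is read from the tokenizer rather than hard-coded.

```python
from transformers import pipeline

# Load a fill-mask pipeline with this checkpoint; weights download on first use.
fill_mask = pipeline("fill-mask", model="HooshvareLab/roberta-fa-zwnj-base")
mask = fill_mask.tokenizer.mask_token  # typically "<mask>" for RoBERTa tokenizers

# Illustrative Persian sentence; "می‌تواند" contains a zero-width
# non-joiner (U+200C) between "می" and "تواند".
text = f"دانش و آگاهی می‌تواند جهان را {mask} دهد."

# Print the top predicted fill-ins with their scores.
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 4))
```

The exact predictions depend on the model weights; the point of the example is that ZWNJ characters in the input are preserved and tokenized rather than dropped.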

Questions?

Post a GitHub issue on the ParsRoBERTa Issues repo.
