A multilingual language model (XLM-RoBERTa architecture) trained from scratch for our AACL 2022 paper *Cross-lingual Similarity of Multilingual Representations Revisited*.

Paper (model and training description): https://aclanthology.org/2022.aacl-main.15/ <br>
GitHub repo: https://github.com/delmaksym/xsim#cross-lingual-similarity-of-multilingual-representations-revisited
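A minimal sketch of loading the checkpoint with the Hugging Face `transformers` library. The repo id below is a placeholder, not the actual model id; substitute the id of this model's repository.

```python
# Minimal usage sketch with Hugging Face transformers.
# NOTE: "your-namespace/xlm-roberta-from-scratch" is a placeholder repo id.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "your-namespace/xlm-roberta-from-scratch"  # replace with the real model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode a sentence and collect per-layer hidden states,
# e.g. for cross-lingual representation similarity analysis.
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model(**inputs, output_hidden_states=True)
hidden_states = outputs.hidden_states  # tuple: one tensor per layer
```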