Update README.md
README.md
CHANGED
@@ -13,6 +13,22 @@ metrics:
 - exact_match
 ---
 
+# Description
 Best-performing "mBERT-qa-en, skd" model from the paper [Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation](https://arxiv.org/abs/2309.17134).
 
 Check the official [GitHub repository](https://github.com/ccasimiro88/self-distillation-gxlt-qa) to access the code used to implement the methods in the paper.
+
+**More info coming soon!**
+
+# How to Cite
+To cite our work, use the following BibTeX:
+```
+@misc{carrino2023promoting,
+      title={Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation},
+      author={Casimiro Pio Carrino and Carlos Escolano and José A. R. Fonollosa},
+      year={2023},
+      eprint={2309.17134},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
+```
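As a usage illustration for the extractive QA model described above, here is a minimal sketch using the Hugging Face `transformers` question-answering pipeline. The repo id in the snippet is a placeholder (the card does not state the model's Hub id), and the question/context pair is made up; only the pipeline task name and call pattern are standard `transformers` usage.

```python
# Minimal sketch: load the cross-lingual QA model with the standard
# `transformers` question-answering pipeline. The model id below is a
# placeholder; replace it with the actual Hub id of this model card.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="your-namespace/mbert-qa-en-skd",  # hypothetical repo id
)

result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)

# The pipeline returns a dict with the extracted answer span and its score.
print(result["answer"], result["score"])
```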