---
license: apache-2.0
datasets:
- xquad
language:
- multilingual
library_name: transformers
tags:
- cross-lingual
- extractive-question-answering
metrics:
- f1
- exact_match
---

# Description
Best-performing "mBERT-qa-en, skd" model from the paper [Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation](https://arxiv.org/abs/2309.17134).

See the official [GitHub repository](https://github.com/ccasimiro88/self-distillation-gxlt-qa) for the code implementing the methods in the paper.

**More info coming soon!**
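
# How to Use
A minimal usage sketch with the 🤗 Transformers `question-answering` pipeline. The model id below is a placeholder; substitute this repository's actual id:

```python
from transformers import pipeline

# Load the model into an extractive question-answering pipeline.
# NOTE: "<model-repo-id>" is a placeholder; replace it with this
# repository's Hugging Face model id.
qa_pipeline = pipeline("question-answering", model="<model-repo-id>")

# Cross-lingual example: an English question over a Spanish context.
result = qa_pipeline(
    question="Where is the Eiffel Tower located?",
    context="La Torre Eiffel se encuentra en París, Francia.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```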

# How to Cite
To cite our work, use the following BibTeX entry:
```bibtex
@misc{carrino2023promoting,
      title={Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation}, 
      author={Casimiro Pio Carrino and Carlos Escolano and José A. R. Fonollosa},
      year={2023},
      eprint={2309.17134},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```