Tags: Sentence Similarity · Safetensors · Japanese · RAGatouille · bert · ColBERT
bclavie committed
Commit 2d4a77a
1 Parent(s): 1b64ef3

Update README.md

Files changed (1):
  1. README.md +14 -14
README.md CHANGED
@@ -20,20 +20,20 @@ Under Construction, please come back in a few days!
 
 # Results
 
- (refer to the technical report for exact evaluation method + code)
-
- | | JSQuAD | | | MIRACL | | | MrTyDi | | | Average | | |
- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
- | | R@1 | R@5 | R@10 | R@3 | R@5 | R@10 | R@3 | R@5 | R@10 | R@\{1\|3\} | R@5 | R@10 |
- | JaColBERT | **0.906** | **0.968** | 0.978 | 0.464 | 0.546 | 0.645 | 0.744 | 0.781 | 0.821 | **0.705** | 0.765 | 0.813 |
- | m-e5-large (in-domain) | | | | | | | | | | | | |
- | m-e5-base (in-domain) | *0.838* | *0.955* | 0.973 | **0.482** | **0.553** | 0.632 | **0.777** | **0.815** | 0.857 | 0.699 | **0.775** | 0.820 |
- | m-e5-small (in-domain) | *0.840* | *0.954* | 0.973 | 0.464 | 0.540 | 0.640 | 0.767 | 0.794 | 0.844 | 0.690 | 0.763 | 0.819 |
- | GLuCoSE | 0.645 | 0.846 | 0.897 | 0.369 | 0.432 | 0.515 | *0.617* | *0.670* | 0.735 | 0.544 | 0.649 | 0.716 |
- | sentence-bert-base-ja-v2 | 0.654 | 0.863 | 0.914 | 0.172 | 0.224 | 0.338 | 0.488 | 0.549 | 0.611 | 0.435 | 0.545 | 0.621 |
- | sup-simcse-ja-base | 0.632 | 0.849 | 0.897 | 0.133 | 0.177 | 0.264 | 0.454 | 0.514 | 0.580 | 0.406 | 0.513 | 0.580 |
- | sup-simcse-ja-large | 0.603 | 0.833 | 0.889 | 0.159 | 0.212 | 0.295 | 0.457 | 0.517 | 0.581 | 0.406 | 0.521 | 0.588 |
- | fio-base-v0.1 | 0.700 | 0.879 | 0.924 | *0.279* | *0.358* | 0.462 | *0.582* | *0.649* | 0.712 | *0.520* | *0.629* | 0.699 |
+ (refer to the technical report for exact evaluation method + code. * indicates the best monolingual/out-of-domain result. **bold** is best overall result. _italic_ indicates the task is in-domain for the model.)
+
+ | | JSQuAD | | | | MIRACL | | | | MrTyDi | | | | Average | | |
+ | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
+ | | R@1 | R@5 | R@10 | | R@3 | R@5 | R@10 | | R@3 | R@5 | R@10 | | R@\{1\|3\} | R@5 | R@10 |
+ | JaColBERT | **0.906*** | **0.968*** | 0.978* | | 0.464* | 0.546* | 0.645* | | 0.744* | 0.781* | 0.821* | | **0.705*** | 0.765* | 0.813* |
+ | m-e5-large (in-domain) | | | | | | | | | | | | | | | |
+ | m-e5-base (in-domain) | *0.838* | *0.955* | 0.973 | | **0.482** | **0.553** | 0.632 | | **0.777** | **0.815** | 0.857 | | 0.699 | **0.775** | 0.820 |
+ | m-e5-small (in-domain) | *0.840* | *0.954* | 0.973 | | 0.464 | 0.540 | 0.640 | | 0.767 | 0.794 | 0.844 | | 0.690 | 0.763 | 0.819 |
+ | GLuCoSE | 0.645 | 0.846 | 0.897 | | 0.369 | 0.432 | 0.515 | | *0.617* | *0.670* | 0.735 | | 0.544 | 0.649 | 0.716 |
+ | sentence-bert-base-ja-v2 | 0.654 | 0.863 | 0.914 | | 0.172 | 0.224 | 0.338 | | 0.488 | 0.549 | 0.611 | | 0.435 | 0.545 | 0.621 |
+ | sup-simcse-ja-base | 0.632 | 0.849 | 0.897 | | 0.133 | 0.177 | 0.264 | | 0.454 | 0.514 | 0.580 | | 0.406 | 0.513 | 0.580 |
+ | sup-simcse-ja-large | 0.603 | 0.833 | 0.889 | | 0.159 | 0.212 | 0.295 | | 0.457 | 0.517 | 0.581 | | 0.406 | 0.521 | 0.588 |
+ | fio-base-v0.1 | 0.700 | 0.879 | 0.924 | | *0.279* | *0.358* | 0.462 | | *0.582* | *0.649* | 0.712 | | *0.520* | *0.629* | 0.699 |
 
 
 
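The R@k columns in the table are standard Recall@k: the share of queries for which at least one relevant passage appears in the model's top-k retrieved results. The exact evaluation method and code are the ones referenced in the technical report; the snippet below is only a minimal, self-contained sketch of the metric, and the `rankings`/`gold` structures are hypothetical placeholders rather than this repository's data format.

```python
# Minimal Recall@k sketch (illustrative only; see the technical report for the
# actual evaluation code). `rankings` maps a query id to its ranked list of
# retrieved passage ids; `gold` maps a query id to the set of relevant ids.
from typing import Dict, List, Set


def recall_at_k(rankings: Dict[str, List[str]], gold: Dict[str, Set[str]], k: int) -> float:
    """Fraction of queries with at least one relevant passage ranked in the top k."""
    hits = sum(
        1
        for qid, ranked in rankings.items()
        if gold.get(qid) and set(ranked[:k]) & gold[qid]
    )
    return hits / len(rankings)


if __name__ == "__main__":
    # Hypothetical toy data: two queries with ranked passage ids and gold relevance sets.
    rankings = {"q1": ["d3", "d7", "d1"], "q2": ["d2", "d9", "d4"]}
    gold = {"q1": {"d1"}, "q2": {"d5"}}
    print(recall_at_k(rankings, gold, k=3))  # 0.5 — only q1 has a relevant hit in its top 3
```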