hanhainebula committed
Commit 2a6c181 · Parent: 3fd4e20

Update evaluation results

This view is limited to 50 files because it contains too many changes. See raw diff.
- AIR-Bench_24.04/BM25/NoReranker/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/NoReranker/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/bge-reranker-v2-m3/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/bge-base-en-v1.5/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/NoReranker/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/bge-reranker-v2-m3/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/bge-large-en-v1.5/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
- AIR-Bench_24.04/bge-m3/NoReranker/results.json +0 -0
- AIR-Bench_24.04/bge-m3/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/bge-m3/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/bge-m3/bge-reranker-v2-m3/{results.json → results_20240513174259-3fc82c8b49ec8b8d89d6268dc253cc1c.json} +350 -140
- AIR-Bench_24.04/bge-m3/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/bge-m3/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/bge-m3/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/NoReranker/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/bge-reranker-v2-m3/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/bge-small-en-v1.5/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/NoReranker/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/bge-reranker-v2-m3/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/e5-mistral-7b-instruct/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/NoReranker/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/bge-reranker-v2-m3/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/jina-embeddings-v2-base-en/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/NoReranker/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/bce-reranker-base_v1/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/bge-reranker-large/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/bge-reranker-v2-m3/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/jina-reranker-v1-tiny-en/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/jina-reranker-v1-turbo-en/results.json +0 -0
- AIR-Bench_24.04/multilingual-e5-base/mmarco-mMiniLMv2-L12-H384-v1/results.json +0 -0
The following results.json files were DELETED (each diff is too large to render; see raw diff):

- AIR-Bench_24.04/BM25/NoReranker/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/NoReranker/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/bge-reranker-large/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/bge-reranker-v2-m3/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/bge-base-en-v1.5/mmarco-mMiniLMv2-L12-H384-v1/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/NoReranker/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/bge-reranker-large/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/bge-reranker-v2-m3/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/bge-large-en-v1.5/mmarco-mMiniLMv2-L12-H384-v1/results.json
- AIR-Bench_24.04/bge-m3/NoReranker/results.json
- AIR-Bench_24.04/bge-m3/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/bge-m3/bge-reranker-large/results.json
AIR-Bench_24.04/bge-m3/bge-reranker-v2-m3/{results.json → results_20240513174259-3fc82c8b49ec8b8d89d6268dc253cc1c.json}
RENAMED (+350 −140)

Every metadata block in the file receives the same edit: the previously unpopulated "reranking_model_link" is filled in, a comma is added after "metric", and three new fields ("timestamp", "is_anonymous", "revision") are appended. The first hunk:

@@ -4,9 +4,12 @@
     "retrieval_model": "bge-m3",
     "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
     "reranking_model": "bge-reranker-v2-m3",
-    "reranking_model_link":
+    "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
     "task": "qa",
-    "metric": "ndcg_at_1"
+    "metric": "ndcg_at_1",
+    "timestamp": "2024-05-13T17:42:59Z",
+    "is_anonymous": "False",
+    "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
   },
   "results": [
     {

The identical +5/−2 hunk repeats for all 70 metadata blocks in the file, which accounts for the +350 −140 total: 35 blocks with "task": "qa" and 35 with "task": "long-doc", each set covering the metrics ndcg_at_k, map_at_k, recall_at_k, precision_at_k, and mrr_at_k for k ∈ {1, 3, 5, 10, 50, 100, 1000}. Only the "metric" and "task" values differ between hunks; the added "reranking_model_link", "timestamp", "is_anonymous", and "revision" values are the same throughout.
|
706 |
},
|
707 |
"results": [
|
708 |
{
|
|
|
796 |
"retrieval_model": "bge-m3",
|
797 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
798 |
"reranking_model": "bge-reranker-v2-m3",
|
799 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
800 |
"task": "qa",
|
801 |
+
"metric": "map_at_3",
|
802 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
803 |
+
"is_anonymous": "False",
|
804 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
805 |
},
|
806 |
"results": [
|
807 |
{
|
|
|
895 |
"retrieval_model": "bge-m3",
|
896 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
897 |
"reranking_model": "bge-reranker-v2-m3",
|
898 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
899 |
"task": "qa",
|
900 |
+
"metric": "map_at_5",
|
901 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
902 |
+
"is_anonymous": "False",
|
903 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
904 |
},
|
905 |
"results": [
|
906 |
{
|
|
|
994 |
"retrieval_model": "bge-m3",
|
995 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
996 |
"reranking_model": "bge-reranker-v2-m3",
|
997 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
998 |
"task": "qa",
|
999 |
+
"metric": "map_at_10",
|
1000 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1001 |
+
"is_anonymous": "False",
|
1002 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1003 |
},
|
1004 |
"results": [
|
1005 |
{
|
|
|
1093 |
"retrieval_model": "bge-m3",
|
1094 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1095 |
"reranking_model": "bge-reranker-v2-m3",
|
1096 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1097 |
"task": "qa",
|
1098 |
+
"metric": "map_at_50",
|
1099 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1100 |
+
"is_anonymous": "False",
|
1101 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1102 |
},
|
1103 |
"results": [
|
1104 |
{
|
|
|
1192 |
"retrieval_model": "bge-m3",
|
1193 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1194 |
"reranking_model": "bge-reranker-v2-m3",
|
1195 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1196 |
"task": "qa",
|
1197 |
+
"metric": "map_at_100",
|
1198 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1199 |
+
"is_anonymous": "False",
|
1200 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1201 |
},
|
1202 |
"results": [
|
1203 |
{
|
|
|
1291 |
"retrieval_model": "bge-m3",
|
1292 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1293 |
"reranking_model": "bge-reranker-v2-m3",
|
1294 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1295 |
"task": "qa",
|
1296 |
+
"metric": "map_at_1000",
|
1297 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1298 |
+
"is_anonymous": "False",
|
1299 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1300 |
},
|
1301 |
"results": [
|
1302 |
{
|
|
|
1390 |
"retrieval_model": "bge-m3",
|
1391 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1392 |
"reranking_model": "bge-reranker-v2-m3",
|
1393 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1394 |
"task": "qa",
|
1395 |
+
"metric": "recall_at_1",
|
1396 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1397 |
+
"is_anonymous": "False",
|
1398 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1399 |
},
|
1400 |
"results": [
|
1401 |
{
|
|
|
1489 |
"retrieval_model": "bge-m3",
|
1490 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1491 |
"reranking_model": "bge-reranker-v2-m3",
|
1492 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1493 |
"task": "qa",
|
1494 |
+
"metric": "recall_at_3",
|
1495 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1496 |
+
"is_anonymous": "False",
|
1497 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1498 |
},
|
1499 |
"results": [
|
1500 |
{
|
|
|
1588 |
"retrieval_model": "bge-m3",
|
1589 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1590 |
"reranking_model": "bge-reranker-v2-m3",
|
1591 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1592 |
"task": "qa",
|
1593 |
+
"metric": "recall_at_5",
|
1594 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1595 |
+
"is_anonymous": "False",
|
1596 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1597 |
},
|
1598 |
"results": [
|
1599 |
{
|
|
|
1687 |
"retrieval_model": "bge-m3",
|
1688 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1689 |
"reranking_model": "bge-reranker-v2-m3",
|
1690 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1691 |
"task": "qa",
|
1692 |
+
"metric": "recall_at_10",
|
1693 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1694 |
+
"is_anonymous": "False",
|
1695 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1696 |
},
|
1697 |
"results": [
|
1698 |
{
|
|
|
1786 |
"retrieval_model": "bge-m3",
|
1787 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1788 |
"reranking_model": "bge-reranker-v2-m3",
|
1789 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1790 |
"task": "qa",
|
1791 |
+
"metric": "recall_at_50",
|
1792 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1793 |
+
"is_anonymous": "False",
|
1794 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1795 |
},
|
1796 |
"results": [
|
1797 |
{
|
|
|
1885 |
"retrieval_model": "bge-m3",
|
1886 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1887 |
"reranking_model": "bge-reranker-v2-m3",
|
1888 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1889 |
"task": "qa",
|
1890 |
+
"metric": "recall_at_100",
|
1891 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1892 |
+
"is_anonymous": "False",
|
1893 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1894 |
},
|
1895 |
"results": [
|
1896 |
{
|
|
|
1984 |
"retrieval_model": "bge-m3",
|
1985 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
1986 |
"reranking_model": "bge-reranker-v2-m3",
|
1987 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
1988 |
"task": "qa",
|
1989 |
+
"metric": "recall_at_1000",
|
1990 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
1991 |
+
"is_anonymous": "False",
|
1992 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
1993 |
},
|
1994 |
"results": [
|
1995 |
{
|
|
|
2083 |
"retrieval_model": "bge-m3",
|
2084 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2085 |
"reranking_model": "bge-reranker-v2-m3",
|
2086 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2087 |
"task": "qa",
|
2088 |
+
"metric": "precision_at_1",
|
2089 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2090 |
+
"is_anonymous": "False",
|
2091 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2092 |
},
|
2093 |
"results": [
|
2094 |
{
|
|
|
2182 |
"retrieval_model": "bge-m3",
|
2183 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2184 |
"reranking_model": "bge-reranker-v2-m3",
|
2185 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2186 |
"task": "qa",
|
2187 |
+
"metric": "precision_at_3",
|
2188 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2189 |
+
"is_anonymous": "False",
|
2190 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2191 |
},
|
2192 |
"results": [
|
2193 |
{
|
|
|
2281 |
"retrieval_model": "bge-m3",
|
2282 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2283 |
"reranking_model": "bge-reranker-v2-m3",
|
2284 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2285 |
"task": "qa",
|
2286 |
+
"metric": "precision_at_5",
|
2287 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2288 |
+
"is_anonymous": "False",
|
2289 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2290 |
},
|
2291 |
"results": [
|
2292 |
{
|
|
|
2380 |
"retrieval_model": "bge-m3",
|
2381 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2382 |
"reranking_model": "bge-reranker-v2-m3",
|
2383 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2384 |
"task": "qa",
|
2385 |
+
"metric": "precision_at_10",
|
2386 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2387 |
+
"is_anonymous": "False",
|
2388 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2389 |
},
|
2390 |
"results": [
|
2391 |
{
|
|
|
2479 |
"retrieval_model": "bge-m3",
|
2480 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2481 |
"reranking_model": "bge-reranker-v2-m3",
|
2482 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2483 |
"task": "qa",
|
2484 |
+
"metric": "precision_at_50",
|
2485 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2486 |
+
"is_anonymous": "False",
|
2487 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2488 |
},
|
2489 |
"results": [
|
2490 |
{
|
|
|
2578 |
"retrieval_model": "bge-m3",
|
2579 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2580 |
"reranking_model": "bge-reranker-v2-m3",
|
2581 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2582 |
"task": "qa",
|
2583 |
+
"metric": "precision_at_100",
|
2584 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2585 |
+
"is_anonymous": "False",
|
2586 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2587 |
},
|
2588 |
"results": [
|
2589 |
{
|
|
|
2677 |
"retrieval_model": "bge-m3",
|
2678 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2679 |
"reranking_model": "bge-reranker-v2-m3",
|
2680 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2681 |
"task": "qa",
|
2682 |
+
"metric": "precision_at_1000",
|
2683 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2684 |
+
"is_anonymous": "False",
|
2685 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2686 |
},
|
2687 |
"results": [
|
2688 |
{
|
|
|
2776 |
"retrieval_model": "bge-m3",
|
2777 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2778 |
"reranking_model": "bge-reranker-v2-m3",
|
2779 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2780 |
"task": "qa",
|
2781 |
+
"metric": "mrr_at_1",
|
2782 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2783 |
+
"is_anonymous": "False",
|
2784 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2785 |
},
|
2786 |
"results": [
|
2787 |
{
|
|
|
2875 |
"retrieval_model": "bge-m3",
|
2876 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2877 |
"reranking_model": "bge-reranker-v2-m3",
|
2878 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2879 |
"task": "qa",
|
2880 |
+
"metric": "mrr_at_3",
|
2881 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2882 |
+
"is_anonymous": "False",
|
2883 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2884 |
},
|
2885 |
"results": [
|
2886 |
{
|
|
|
2974 |
"retrieval_model": "bge-m3",
|
2975 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
2976 |
"reranking_model": "bge-reranker-v2-m3",
|
2977 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
2978 |
"task": "qa",
|
2979 |
+
"metric": "mrr_at_5",
|
2980 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
2981 |
+
"is_anonymous": "False",
|
2982 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
2983 |
},
|
2984 |
"results": [
|
2985 |
{
|
|
|
3073 |
"retrieval_model": "bge-m3",
|
3074 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3075 |
"reranking_model": "bge-reranker-v2-m3",
|
3076 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3077 |
"task": "qa",
|
3078 |
+
"metric": "mrr_at_10",
|
3079 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3080 |
+
"is_anonymous": "False",
|
3081 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3082 |
},
|
3083 |
"results": [
|
3084 |
{
|
|
|
3172 |
"retrieval_model": "bge-m3",
|
3173 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3174 |
"reranking_model": "bge-reranker-v2-m3",
|
3175 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3176 |
"task": "qa",
|
3177 |
+
"metric": "mrr_at_50",
|
3178 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3179 |
+
"is_anonymous": "False",
|
3180 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3181 |
},
|
3182 |
"results": [
|
3183 |
{
|
|
|
3271 |
"retrieval_model": "bge-m3",
|
3272 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3273 |
"reranking_model": "bge-reranker-v2-m3",
|
3274 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3275 |
"task": "qa",
|
3276 |
+
"metric": "mrr_at_100",
|
3277 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3278 |
+
"is_anonymous": "False",
|
3279 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3280 |
},
|
3281 |
"results": [
|
3282 |
{
|
|
|
3370 |
"retrieval_model": "bge-m3",
|
3371 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3372 |
"reranking_model": "bge-reranker-v2-m3",
|
3373 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3374 |
"task": "qa",
|
3375 |
+
"metric": "mrr_at_1000",
|
3376 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3377 |
+
"is_anonymous": "False",
|
3378 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3379 |
},
|
3380 |
"results": [
|
3381 |
{
|
|
|
3469 |
"retrieval_model": "bge-m3",
|
3470 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3471 |
"reranking_model": "bge-reranker-v2-m3",
|
3472 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3473 |
"task": "long-doc",
|
3474 |
+
"metric": "ndcg_at_1",
|
3475 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3476 |
+
"is_anonymous": "False",
|
3477 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3478 |
},
|
3479 |
"results": [
|
3480 |
{
|
|
|
3574 |
"retrieval_model": "bge-m3",
|
3575 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3576 |
"reranking_model": "bge-reranker-v2-m3",
|
3577 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3578 |
"task": "long-doc",
|
3579 |
+
"metric": "ndcg_at_3",
|
3580 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3581 |
+
"is_anonymous": "False",
|
3582 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3583 |
},
|
3584 |
"results": [
|
3585 |
{
|
|
|
3679 |
"retrieval_model": "bge-m3",
|
3680 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3681 |
"reranking_model": "bge-reranker-v2-m3",
|
3682 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3683 |
"task": "long-doc",
|
3684 |
+
"metric": "ndcg_at_5",
|
3685 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3686 |
+
"is_anonymous": "False",
|
3687 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3688 |
},
|
3689 |
"results": [
|
3690 |
{
|
|
|
3784 |
"retrieval_model": "bge-m3",
|
3785 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3786 |
"reranking_model": "bge-reranker-v2-m3",
|
3787 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3788 |
"task": "long-doc",
|
3789 |
+
"metric": "ndcg_at_10",
|
3790 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3791 |
+
"is_anonymous": "False",
|
3792 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3793 |
},
|
3794 |
"results": [
|
3795 |
{
|
|
|
3889 |
"retrieval_model": "bge-m3",
|
3890 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3891 |
"reranking_model": "bge-reranker-v2-m3",
|
3892 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3893 |
"task": "long-doc",
|
3894 |
+
"metric": "ndcg_at_50",
|
3895 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
3896 |
+
"is_anonymous": "False",
|
3897 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
3898 |
},
|
3899 |
"results": [
|
3900 |
{
|
|
|
3994 |
"retrieval_model": "bge-m3",
|
3995 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
3996 |
"reranking_model": "bge-reranker-v2-m3",
|
3997 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
3998 |
"task": "long-doc",
|
3999 |
+
"metric": "ndcg_at_100",
|
4000 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4001 |
+
"is_anonymous": "False",
|
4002 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4003 |
},
|
4004 |
"results": [
|
4005 |
{
|
|
|
4099 |
"retrieval_model": "bge-m3",
|
4100 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4101 |
"reranking_model": "bge-reranker-v2-m3",
|
4102 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4103 |
"task": "long-doc",
|
4104 |
+
"metric": "ndcg_at_1000",
|
4105 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4106 |
+
"is_anonymous": "False",
|
4107 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4108 |
},
|
4109 |
"results": [
|
4110 |
{
|
|
|
4204 |
"retrieval_model": "bge-m3",
|
4205 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4206 |
"reranking_model": "bge-reranker-v2-m3",
|
4207 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4208 |
"task": "long-doc",
|
4209 |
+
"metric": "map_at_1",
|
4210 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4211 |
+
"is_anonymous": "False",
|
4212 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4213 |
},
|
4214 |
"results": [
|
4215 |
{
|
|
|
4309 |
"retrieval_model": "bge-m3",
|
4310 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4311 |
"reranking_model": "bge-reranker-v2-m3",
|
4312 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4313 |
"task": "long-doc",
|
4314 |
+
"metric": "map_at_3",
|
4315 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4316 |
+
"is_anonymous": "False",
|
4317 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4318 |
},
|
4319 |
"results": [
|
4320 |
{
|
|
|
4414 |
"retrieval_model": "bge-m3",
|
4415 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4416 |
"reranking_model": "bge-reranker-v2-m3",
|
4417 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4418 |
"task": "long-doc",
|
4419 |
+
"metric": "map_at_5",
|
4420 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4421 |
+
"is_anonymous": "False",
|
4422 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4423 |
},
|
4424 |
"results": [
|
4425 |
{
|
|
|
4519 |
"retrieval_model": "bge-m3",
|
4520 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4521 |
"reranking_model": "bge-reranker-v2-m3",
|
4522 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4523 |
"task": "long-doc",
|
4524 |
+
"metric": "map_at_10",
|
4525 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4526 |
+
"is_anonymous": "False",
|
4527 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4528 |
},
|
4529 |
"results": [
|
4530 |
{
|
|
|
4624 |
"retrieval_model": "bge-m3",
|
4625 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4626 |
"reranking_model": "bge-reranker-v2-m3",
|
4627 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4628 |
"task": "long-doc",
|
4629 |
+
"metric": "map_at_50",
|
4630 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4631 |
+
"is_anonymous": "False",
|
4632 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4633 |
},
|
4634 |
"results": [
|
4635 |
{
|
|
|
4729 |
"retrieval_model": "bge-m3",
|
4730 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4731 |
"reranking_model": "bge-reranker-v2-m3",
|
4732 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4733 |
"task": "long-doc",
|
4734 |
+
"metric": "map_at_100",
|
4735 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4736 |
+
"is_anonymous": "False",
|
4737 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4738 |
},
|
4739 |
"results": [
|
4740 |
{
|
|
|
4834 |
"retrieval_model": "bge-m3",
|
4835 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4836 |
"reranking_model": "bge-reranker-v2-m3",
|
4837 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4838 |
"task": "long-doc",
|
4839 |
+
"metric": "map_at_1000",
|
4840 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4841 |
+
"is_anonymous": "False",
|
4842 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4843 |
},
|
4844 |
"results": [
|
4845 |
{
|
|
|
4939 |
"retrieval_model": "bge-m3",
|
4940 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
4941 |
"reranking_model": "bge-reranker-v2-m3",
|
4942 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
4943 |
"task": "long-doc",
|
4944 |
+
"metric": "recall_at_1",
|
4945 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
4946 |
+
"is_anonymous": "False",
|
4947 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
4948 |
},
|
4949 |
"results": [
|
4950 |
{
|
|
|
5044 |
"retrieval_model": "bge-m3",
|
5045 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5046 |
"reranking_model": "bge-reranker-v2-m3",
|
5047 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5048 |
"task": "long-doc",
|
5049 |
+
"metric": "recall_at_3",
|
5050 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5051 |
+
"is_anonymous": "False",
|
5052 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5053 |
},
|
5054 |
"results": [
|
5055 |
{
|
|
|
5149 |
"retrieval_model": "bge-m3",
|
5150 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5151 |
"reranking_model": "bge-reranker-v2-m3",
|
5152 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5153 |
"task": "long-doc",
|
5154 |
+
"metric": "recall_at_5",
|
5155 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5156 |
+
"is_anonymous": "False",
|
5157 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5158 |
},
|
5159 |
"results": [
|
5160 |
{
|
|
|
5254 |
"retrieval_model": "bge-m3",
|
5255 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5256 |
"reranking_model": "bge-reranker-v2-m3",
|
5257 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5258 |
"task": "long-doc",
|
5259 |
+
"metric": "recall_at_10",
|
5260 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5261 |
+
"is_anonymous": "False",
|
5262 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5263 |
},
|
5264 |
"results": [
|
5265 |
{
|
|
|
5359 |
"retrieval_model": "bge-m3",
|
5360 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5361 |
"reranking_model": "bge-reranker-v2-m3",
|
5362 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5363 |
"task": "long-doc",
|
5364 |
+
"metric": "recall_at_50",
|
5365 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5366 |
+
"is_anonymous": "False",
|
5367 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5368 |
},
|
5369 |
"results": [
|
5370 |
{
|
|
|
5464 |
"retrieval_model": "bge-m3",
|
5465 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5466 |
"reranking_model": "bge-reranker-v2-m3",
|
5467 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5468 |
"task": "long-doc",
|
5469 |
+
"metric": "recall_at_100",
|
5470 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5471 |
+
"is_anonymous": "False",
|
5472 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5473 |
},
|
5474 |
"results": [
|
5475 |
{
|
|
|
5569 |
"retrieval_model": "bge-m3",
|
5570 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5571 |
"reranking_model": "bge-reranker-v2-m3",
|
5572 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5573 |
"task": "long-doc",
|
5574 |
+
"metric": "recall_at_1000",
|
5575 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5576 |
+
"is_anonymous": "False",
|
5577 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5578 |
},
|
5579 |
"results": [
|
5580 |
{
|
|
|
5674 |
"retrieval_model": "bge-m3",
|
5675 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5676 |
"reranking_model": "bge-reranker-v2-m3",
|
5677 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5678 |
"task": "long-doc",
|
5679 |
+
"metric": "precision_at_1",
|
5680 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5681 |
+
"is_anonymous": "False",
|
5682 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5683 |
},
|
5684 |
"results": [
|
5685 |
{
|
|
|
5779 |
"retrieval_model": "bge-m3",
|
5780 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5781 |
"reranking_model": "bge-reranker-v2-m3",
|
5782 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5783 |
"task": "long-doc",
|
5784 |
+
"metric": "precision_at_3",
|
5785 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5786 |
+
"is_anonymous": "False",
|
5787 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5788 |
},
|
5789 |
"results": [
|
5790 |
{
|
|
|
5884 |
"retrieval_model": "bge-m3",
|
5885 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5886 |
"reranking_model": "bge-reranker-v2-m3",
|
5887 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5888 |
"task": "long-doc",
|
5889 |
+
"metric": "precision_at_5",
|
5890 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5891 |
+
"is_anonymous": "False",
|
5892 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5893 |
},
|
5894 |
"results": [
|
5895 |
{
|
|
|
5989 |
"retrieval_model": "bge-m3",
|
5990 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
5991 |
"reranking_model": "bge-reranker-v2-m3",
|
5992 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
5993 |
"task": "long-doc",
|
5994 |
+
"metric": "precision_at_10",
|
5995 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
5996 |
+
"is_anonymous": "False",
|
5997 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
5998 |
},
|
5999 |
"results": [
|
6000 |
{
|
|
|
6094 |
"retrieval_model": "bge-m3",
|
6095 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
6096 |
"reranking_model": "bge-reranker-v2-m3",
|
6097 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
6098 |
"task": "long-doc",
|
6099 |
+
"metric": "precision_at_50",
|
6100 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
6101 |
+
"is_anonymous": "False",
|
6102 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
6103 |
},
|
6104 |
"results": [
|
6105 |
{
|
|
|
6199 |
"retrieval_model": "bge-m3",
|
6200 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
6201 |
"reranking_model": "bge-reranker-v2-m3",
|
6202 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
6203 |
"task": "long-doc",
|
6204 |
+
"metric": "precision_at_100",
|
6205 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
6206 |
+
"is_anonymous": "False",
|
6207 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
6208 |
},
|
6209 |
"results": [
|
6210 |
{
|
|
|
6304 |
"retrieval_model": "bge-m3",
|
6305 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
6306 |
"reranking_model": "bge-reranker-v2-m3",
|
6307 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
6308 |
"task": "long-doc",
|
6309 |
+
"metric": "precision_at_1000",
|
6310 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
6311 |
+
"is_anonymous": "False",
|
6312 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
6313 |
},
|
6314 |
"results": [
|
6315 |
{
|
|
|
6409 |
"retrieval_model": "bge-m3",
|
6410 |
"retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
|
6411 |
"reranking_model": "bge-reranker-v2-m3",
|
6412 |
+
"reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
|
6413 |
"task": "long-doc",
|
6414 |
+
"metric": "mrr_at_1",
|
6415 |
+
"timestamp": "2024-05-13T17:42:59Z",
|
6416 |
+
"is_anonymous": "False",
|
6417 |
+
"revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
|
6418 |
},
|
6419 |
"results": [
|
6420 |
{
|
|
|
6514 |
"retrieval_model": "bge-m3",
|
6515 |
  "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
  "reranking_model": "bge-reranker-v2-m3",
+ "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
  "task": "long-doc",
+ "metric": "mrr_at_3",
+ "timestamp": "2024-05-13T17:42:59Z",
+ "is_anonymous": "False",
+ "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
  },
  "results": [
    {
...
  "retrieval_model": "bge-m3",
  "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
  "reranking_model": "bge-reranker-v2-m3",
+ "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
  "task": "long-doc",
+ "metric": "mrr_at_5",
+ "timestamp": "2024-05-13T17:42:59Z",
+ "is_anonymous": "False",
+ "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
  },
  "results": [
    {
...
  "retrieval_model": "bge-m3",
  "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
  "reranking_model": "bge-reranker-v2-m3",
+ "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
  "task": "long-doc",
+ "metric": "mrr_at_10",
+ "timestamp": "2024-05-13T17:42:59Z",
+ "is_anonymous": "False",
+ "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
  },
  "results": [
    {
...
  "retrieval_model": "bge-m3",
  "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
  "reranking_model": "bge-reranker-v2-m3",
+ "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
  "task": "long-doc",
+ "metric": "mrr_at_50",
+ "timestamp": "2024-05-13T17:42:59Z",
+ "is_anonymous": "False",
+ "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
  },
  "results": [
    {
...
  "retrieval_model": "bge-m3",
  "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
  "reranking_model": "bge-reranker-v2-m3",
+ "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
  "task": "long-doc",
+ "metric": "mrr_at_100",
+ "timestamp": "2024-05-13T17:42:59Z",
+ "is_anonymous": "False",
+ "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
  },
  "results": [
    {
...
  "retrieval_model": "bge-m3",
  "retrieval_model_link": "https://huggingface.co/BAAI/bge-m3",
  "reranking_model": "bge-reranker-v2-m3",
+ "reranking_model_link": "https://huggingface.co/BAAI/bge-reranker-v2-m3",
  "task": "long-doc",
+ "metric": "mrr_at_1000",
+ "timestamp": "2024-05-13T17:42:59Z",
+ "is_anonymous": "False",
+ "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c"
  },
  "results": [
    {
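For context, the metadata fields added in the hunks above could be consumed as follows. This is a minimal sketch under an assumed layout: the diff only shows fragments of results.json, so modeling each entry as {"config": {...}, "results": [...]} is an inference from the closing brace preceding "results", not a documented schema. Field values are copied from the diff.

```python
import json

# Assumed entry layout (inferred from the diff fragments, not a documented
# schema): a list of entries, each pairing the "config" fields added in this
# commit with a "results" list.
raw = json.dumps([
    {
        "config": {
            "retrieval_model": "bge-m3",
            "reranking_model": "bge-reranker-v2-m3",
            "task": "long-doc",
            "metric": "mrr_at_10",
            "timestamp": "2024-05-13T17:42:59Z",
            "is_anonymous": "False",
            "revision": "3fc82c8b49ec8b8d89d6268dc253cc1c",
        },
        "results": [],
    }
])

def metrics_for(entries, task):
    """Collect the metric names recorded for one task, sorted."""
    return sorted({e["config"]["metric"] for e in entries
                   if e["config"]["task"] == task})

entries = json.loads(raw)
print(metrics_for(entries, "long-doc"))  # ['mrr_at_10']
```

In the full file, the same config block repeats once per metric cutoff (mrr_at_3 through mrr_at_1000), so a helper like this would return the whole list of recorded cutoffs for a task.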
The following files were DELETED in this commit (diffs too large to render):

- AIR-Bench_24.04/bge-m3/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/bge-m3/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/bge-m3/mmarco-mMiniLMv2-L12-H384-v1/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/NoReranker/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/bge-reranker-large/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/bge-reranker-v2-m3/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/bge-small-en-v1.5/mmarco-mMiniLMv2-L12-H384-v1/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/NoReranker/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/bge-reranker-large/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/bge-reranker-v2-m3/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/e5-mistral-7b-instruct/mmarco-mMiniLMv2-L12-H384-v1/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/NoReranker/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/bge-reranker-large/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/bge-reranker-v2-m3/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/jina-embeddings-v2-base-en/mmarco-mMiniLMv2-L12-H384-v1/results.json
- AIR-Bench_24.04/multilingual-e5-base/NoReranker/results.json
- AIR-Bench_24.04/multilingual-e5-base/bce-reranker-base_v1/results.json
- AIR-Bench_24.04/multilingual-e5-base/bge-reranker-large/results.json
- AIR-Bench_24.04/multilingual-e5-base/bge-reranker-v2-m3/results.json
- AIR-Bench_24.04/multilingual-e5-base/jina-reranker-v1-tiny-en/results.json
- AIR-Bench_24.04/multilingual-e5-base/jina-reranker-v1-turbo-en/results.json
- AIR-Bench_24.04/multilingual-e5-base/mmarco-mMiniLMv2-L12-H384-v1/results.json