Fill-Mask
Transformers
PyTorch
Japanese
bert
Inference Endpoints
aken12 committed on
Commit b919476
1 Parent(s): 9928cc8

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -12,7 +12,7 @@ SPLADE-japanese-v2 !!
 Difference between splade-japanese v1 and v2
 - initialize [tohoku-nlp/bert-base-japanese-v3](https://huggingface.co/tohoku-nlp/bert-base-japanese-v3)
 - knowledge distillation from cross-encoder
-- [mMARCO](https://github.com/unicamp-dl/mMARCO) Japanese dataset and use bclavie/mmarco-japanese-hard-negatives as hard negatives
+- [mMARCO](https://github.com/unicamp-dl/mMARCO) Japanese dataset and use [bclavie/mmarco-japanese-hard-negatives](https://huggingface.co/datasets/bclavie/mmarco-japanese-hard-negatives) as hard negatives
 
 
 you need to install
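For context on what the updated README documents, here is a minimal SPLADE-style inference sketch; it is not part of the commit above. It assumes the repository id is aken12/splade-japanese-v2 (inferred from the author and model name), that the checkpoint loads as a Hugging Face masked-LM (fill-mask) model, and that the tohoku-nlp Japanese BERT tokenizer dependencies (fugashi, unidic-lite) are installed. The pooling shown is the standard SPLADE max-over-positions of log(1 + ReLU(logits)).

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Assumed repository id; adjust if the actual repo name differs.
MODEL_ID = "aken12/splade-japanese-v2"

# The tohoku-nlp Japanese BERT tokenizer typically needs MeCab bindings:
#   pip install transformers torch fugashi unidic-lite
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
model.eval()

def splade_terms(text: str) -> dict[str, float]:
    """Return non-zero SPLADE term weights (token -> weight) for `text`."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits                # (1, seq_len, vocab)
    # Standard SPLADE pooling: log-saturated ReLU, masked, max over positions.
    weights = torch.log1p(torch.relu(logits))
    weights = weights * inputs["attention_mask"].unsqueeze(-1)
    vec = weights.max(dim=1).values.squeeze(0)         # (vocab,)
    ids = vec.nonzero().squeeze(-1).tolist()
    return {tokenizer.convert_ids_to_tokens(i): round(vec[i].item(), 3) for i in ids}

print(splade_terms("日本語の情報検索"))
```

Query and document vectors built this way are sparse over the vocabulary, so retrieval reduces to a dot product over their shared non-zero terms, which is how SPLADE models are normally used.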