Tochka-AI committed
Commit 4b87ee8
1 Parent(s): 901bfb3

Update README.md

Files changed (1): README.md (+1, -1)

README.md CHANGED
@@ -13,7 +13,7 @@ This is an encoder model from **Tochka AI** based on the **RoPEBert** architectu
 
 [CulturaX](https://huggingface.co/papers/2309.09400) dataset was used for model training. The **hivaze/ru-e5-base** (only english and russian embeddings of **intfloat/multilingual-e5-base**) model was used as the original; this model surpasses it and all other models in quality (at the time of creation), according to the `S+W` score of [encodechka](https://github.com/avidale/encodechka) benchmark.
 
-The model source code is available in the file [modeling_rope_bert.py](https://huggingface.co/Tochka-AI/ruRoPEBert-classic-base-2k/blob/main/modeling_rope_bert.py)
+The model source code is available in the file [modeling_rope_bert.py](https://huggingface.co/Tochka-AI/ruRoPEBert-e5-base-2k/blob/main/modeling_rope_bert.py)
 
 The model is trained on contexts **up to 2048 tokens** in length, but can be used on larger contexts.
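Since the README ships the architecture in a custom `modeling_rope_bert.py`, loading this encoder through `transformers` would typically require `trust_remote_code=True` with the `Tochka-AI/ruRoPEBert-e5-base-2k` id from the corrected URL. A common way to turn an encoder's token outputs into sentence embeddings is attention-mask-aware mean pooling; the sketch below shows that step on dummy NumPy arrays so it runs without downloading the model. The `mean_pool` helper is an illustrative assumption, not code from the diff or the repo.

```python
import numpy as np

# Hypothetical helper (assumption, not from the diff): mean-pool per-token
# hidden states into one sentence embedding, ignoring padded positions.
def mean_pool(last_hidden_state: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    # Broadcast the mask over the hidden dimension.
    mask = attention_mask[:, :, None].astype(last_hidden_state.dtype)
    summed = (last_hidden_state * mask).sum(axis=1)
    # Clamp the token count to avoid division by zero for empty rows.
    counts = np.clip(mask.sum(axis=1), 1e-9, None)
    return summed / counts

# Dummy stand-in for model output: batch of 2, 5 tokens, hidden size 8.
hidden = np.ones((2, 5, 8))
mask = np.array([[1, 1, 1, 0, 0],
                 [1, 1, 1, 1, 1]])
emb = mean_pool(hidden, mask)
print(emb.shape)  # (2, 8)
```

With the real model, `hidden` and `mask` would come from `model(**tokenizer(texts, return_tensors="pt", padding=True))` and the tokenizer's `attention_mask` respectively.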