Update README.md
README.md
CHANGED
@@ -29,6 +29,11 @@ It is based on a BERT architecture (JinaBERT) that supports the symmetric bidirectional variant of [ALiBi](https://arxiv.org/abs/2108.12409) to allow longer sequence length.
We have designed it for high performance in mono-lingual & cross-lingual applications and trained it specifically to support mixed Spanish-English input without bias.
Additionally, we provide the following embedding models:

+`jina-embeddings-v2-base-es` is a bilingual English/Spanish text embedding model that supports a sequence length of 8192.
+It is based on a BERT architecture (JinaBERT) that incorporates the symmetric bidirectional variant of [ALiBi](https://arxiv.org/abs/2108.12409) to allow a longer sequence length.
+We have designed this model for high performance in monolingual and bilingual applications, and it is trained specifically to support mixed Spanish-English input without bias.
+Additionally, we provide the following embedding models:
+
- [`jina-embeddings-v2-small-en`](https://huggingface.co/jinaai/jina-embeddings-v2-small-en): 33 million parameters.
- [`jina-embeddings-v2-base-en`](https://huggingface.co/jinaai/jina-embeddings-v2-base-en): 137 million parameters.
- [`jina-embeddings-v2-base-zh`](): Chinese-English Bilingual embeddings (soon).
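
For readers skimming this change, the following is a minimal usage sketch showing how a model such as `jina-embeddings-v2-base-es` can be loaded and queried with mixed Spanish-English input. It assumes the Hugging Face `transformers` `AutoModel` API with `trust_remote_code=True` plus the `encode` convenience method bundled with the Jina v2 checkpoints; these call names are assumptions for illustration and are not part of the README change itself.

```python
# Sketch only: assumes the checkpoint ships an `encode` helper via its custom
# modeling code (loaded through trust_remote_code=True); not taken from this README.
from numpy.linalg import norm
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-es",
    trust_remote_code=True,  # pulls in the JinaBERT (ALiBi) implementation bundled with the repo
)

# Mixed Spanish-English input, which the model is trained to handle without bias.
sentences = ["How is the weather today?", "¿Qué tiempo hace hoy?"]
embeddings = model.encode(sentences, max_length=8192)  # 8192 is the advertised sequence length

# Cosine similarity between the English and Spanish sentences.
cos_sim = embeddings[0] @ embeddings[1] / (norm(embeddings[0]) * norm(embeddings[1]))
print(f"cosine similarity: {cos_sim:.3f}")
```

Sentences from either language land in the same vector space, which is the property the "mixed Spanish-English input without bias" line above is describing.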
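
The change also attributes the 8192-token sequence length to the symmetric bidirectional variant of ALiBi. As a rough illustration of that mechanism (a minimal sketch following the ALiBi paper, not code from this repository), the positional bias that replaces learned position embeddings can be constructed like this:

```python
# Minimal sketch of symmetric (bidirectional) ALiBi: instead of position
# embeddings, a distance-proportional penalty is added to every attention
# logit, which is what lets the encoder extrapolate to long sequences.
import numpy as np

def alibi_slopes(num_heads: int) -> np.ndarray:
    # Head-specific slopes 2^(-8k/num_heads), k = 1..num_heads (power-of-two head counts).
    return np.array([2.0 ** (-8.0 * k / num_heads) for k in range(1, num_heads + 1)])

def symmetric_alibi_bias(seq_len: int, num_heads: int) -> np.ndarray:
    # bias[h, i, j] = -slope[h] * |i - j|, symmetric in i and j (bidirectional attention).
    positions = np.arange(seq_len)
    distance = np.abs(positions[None, :] - positions[:, None])
    return -alibi_slopes(num_heads)[:, None, None] * distance[None, :, :]

# Added to the raw attention scores before softmax, e.g.
# scores = q @ k.transpose(-1, -2) / sqrt(d_head) + symmetric_alibi_bias(seq_len, n_heads)
bias = symmetric_alibi_bias(seq_len=8, num_heads=4)
print(bias.shape)  # (4, 8, 8)
```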