Tags: Fill-Mask · Transformers · PyTorch · eurobert · code · custom_code

Commit c7a6766 (verified) by Nicolas-BZRD · 1 parent: b023e4a

Update README.md

Files changed (1): README.md (+5 −5)
README.md CHANGED
@@ -39,13 +39,13 @@ tags:
 EuroBERT is a family of multilingual encoder models designed for a variety of tasks—such as classification, retrieval, or evaluation metrics— supporting 15 languages, mathematics and code, and sequences of up to 8,192 tokens.
 EuroBERT models exhibit the strongest multilingual performance across [domains and tasks](#Evaluation) compared to similarly sized systems.
 
-It is available in the following sizes:
+It is available in 3 sizes:
 
-- [EuroBERT-210m](https://huggingface.co/EuroBERT/EuroBERT-210m) - 12 layers, 210 million parameters
-- [EuroBERT-610m](https://huggingface.co/EuroBERT/EuroBERT-610m) - 26 layers, 610 million parameters
-- [EuroBERT-2.1B](https://huggingface.co/EuroBERT/EuroBERT-2.1B) - 32 layers, 2.1 billion parameters
+- [EuroBERT-210m](https://huggingface.co/EuroBERT/EuroBERT-210m) - 210 million parameters
+- [EuroBERT-610m](https://huggingface.co/EuroBERT/EuroBERT-610m) - 610 million parameters
+- [EuroBERT-2.1B](https://huggingface.co/EuroBERT/EuroBERT-2.1B) - 2.1 billion parameters
 
-For more information about EuroBERT, please refer to the [release blog post](***) for a high-level overview and our [arXiv pre-print](***) for in-depth information.
+For more information about EuroBERT, please check our [blog](***) post and the [arXiv](***) preprint.
 
 ## Usage
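The Usage section referenced in the hunk above is not shown in this diff. As a hedged sketch of how these checkpoints are typically loaded: the page's Fill-Mask and `custom_code` tags suggest the standard `transformers` masked-LM API with `trust_remote_code=True`. The model ID below is taken from the size list in the diff; the prompt text and helper function are illustrative assumptions, not part of the commit.

```python
# Minimal fill-mask sketch for EuroBERT, assuming the standard `transformers`
# AutoModel API. `trust_remote_code=True` is assumed because the repository
# carries the `custom_code` tag (custom modeling code on the Hub).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_ID = "EuroBERT/EuroBERT-210m"  # smallest of the three sizes listed above


def fill_mask(text: str, top_k: int = 5) -> list[str]:
    """Return the top-k candidate tokens for the "{mask}" placeholder in `text`.

    `text` should contain a literal "{mask}" marker, which is replaced with the
    tokenizer's own mask token (its exact string may vary between checkpoints).
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    model.eval()

    inputs = tokenizer(text.format(mask=tokenizer.mask_token), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits

    # Locate the mask position and take its highest-scoring vocabulary entries.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    top_ids = logits[0, mask_pos].topk(top_k).indices[0]
    return [tokenizer.decode(token_id).strip() for token_id in top_ids]


if __name__ == "__main__":
    # Downloads the checkpoint on first run.
    print(fill_mask("The capital of France is {mask}."))
```

The same pattern works for the 610m and 2.1B checkpoints by swapping `MODEL_ID`; the long 8,192-token context mentioned in the README requires no extra configuration at load time.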