Commit 9e45474 (parent 80adea0) by kevinkrahn: Update README.md
## Sentence embeddings for English and Ancient Greek
The HLM model architecture is based on [Heidelberg-Boston @ SIGTYP 2024 Shared Task: Enhancing Low-Resource Language Analysis With Character-Aware Hierarchical Transformers](https://aclanthology.org/2024.sigtyp-1.16/), but uses a simpler design with rotary embeddings in place of DeBERTa as the base architecture. This architecture produces superior results to vanilla BERT for low-resource languages such as Ancient Greek. The model is trained to produce sentence embeddings using the method described in [Sentence Embedding Models for Ancient Greek Using Multilingual Knowledge Distillation](https://aclanthology.org/2023.alp-1.2/).
This model was distilled from `BAAI/bge-base-en-v1.5` for embedding English and Ancient Greek text.
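The core idea of this kind of multilingual knowledge distillation is to train the student so that its embeddings of English sentences match the teacher's, while its embeddings of the Ancient Greek translations are pulled to the same points in the shared space. The following is a minimal sketch of that training objective using toy NumPy arrays (the dimensions and values are illustrative, not the actual models):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension; the real teacher uses a much larger one

# Toy stand-ins for the real embeddings:
#   teacher_en  - frozen teacher embeddings of English sentences
#   student_en  - student embeddings of the same English sentences
#   student_grc - student embeddings of their Ancient Greek translations
teacher_en = rng.normal(size=(4, dim))
student_en = teacher_en + 0.1 * rng.normal(size=(4, dim))
student_grc = teacher_en + 0.2 * rng.normal(size=(4, dim))

def distillation_loss(teacher, student_src, student_tgt):
    # MSE pulls the student toward the teacher for the source language,
    # and pulls the translations toward the same teacher embeddings,
    # so both languages end up in one shared vector space.
    return float(np.mean((teacher - student_src) ** 2)
                 + np.mean((teacher - student_tgt) ** 2))

loss = distillation_loss(teacher_en, student_en, student_grc)
```

Minimizing this loss over a parallel English/Ancient Greek corpus is what lets a single student model embed both languages into the teacher's space.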