Update README.md
README.md CHANGED
@@ -13,8 +13,10 @@ We introduce Meltemi, the first Greek Large Language Model (LLM) trained by the
 Meltemi is built on top of [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1), extending its capabilities for Greek through continual pretraining on a large corpus of high-quality and locally relevant Greek texts. We present Meltemi-7B-v1, as well as an instruction fine-tuned version [Meltemi-7B-Instruct-v1](https://huggingface.co/ilsp/Meltemi-7B-Instruct-v1).
 
 
-
-
+
+# 🚨 NEWER VERSION AVAILABLE
+## **This model has been superseded by a newer version (v1.5) [here](https://huggingface.co/ilsp/Meltemi-7B-v1.5)**
+
 
 
 ![image/png](https://miro.medium.com/v2/resize:fit:720/format:webp/1*IaE7RJk6JffW8og-MOnYCA.png)
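The added notice points readers to the newer Meltemi-7B-v1.5 on the Hugging Face Hub. As a minimal, illustrative sketch (not part of this commit), the superseding model can be loaded with the standard `transformers` API; the model ID is taken from the link in the added heading, and the prompt and generation settings below are placeholders:

```python
# Illustrative only: load the superseding model referenced in the README notice.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ilsp/Meltemi-7B-v1.5"  # model ID from the link in the added heading

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Meltemi targets Greek, so a short Greek prompt is a natural smoke test.
inputs = tokenizer("Η Αθήνα είναι", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern applies to the instruction-tuned Meltemi-7B-Instruct-v1 mentioned in the unchanged description, substituting its model ID.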