asafaya committed on
Commit 7cead29 · verified · 1 Parent(s): a038db6

Update README.md

Files changed (1):
  1. README.md +2 -2
README.md CHANGED

@@ -15,9 +15,9 @@ widget:
 
 # Kanarya-2B: Turkish Language Model
 
-<img src="https://asafaya.me/images/kanarya.webp" alt="Kanarya Logo" style="width:300px;"/>
+<img src="https://asafaya.me/images/kanarya.webp" alt="Kanarya Logo" style="width:400px;"/>
 
-Kanarya is a pre-trained Turkish GPT-J 2B model. Released as part of [Turkish Data Depository](https://tdd.ai/) efforts, the Kanarya family has two versions (Kanarya-2B, Kanarya-0.7B). Kanarya-2B is the larger version and Kanarya-0.7B is the smaller version. Both models are trained on a large-scale Turkish text corpus, filtered from OSCAR and mC4 datasets. The training data is collected from various sources, including news, articles, and websites, to create a diverse and high-quality dataset. The models are trained using a JAX/Flax implementation of the [GPT-J](https://github.com/kingoflolz/mesh-transformer-jax) architecture.
+**Kanarya** is a pre-trained Turkish GPT-J 2B model. Released as part of [Turkish Data Depository](https://tdd.ai/) efforts, the Kanarya family has two versions (Kanarya-2B, Kanarya-0.7B). Kanarya-2B is the larger version and Kanarya-0.7B is the smaller version. Both models are trained on a large-scale Turkish text corpus, filtered from OSCAR and mC4 datasets. The training data is collected from various sources, including news, articles, and websites, to create a diverse and high-quality dataset. The models are trained using a JAX/Flax implementation of the [GPT-J](https://github.com/kingoflolz/mesh-transformer-jax) architecture.
 
 ## Model Details
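The model card edited above describes a GPT-J-style causal language model, so it can be loaded with the standard `transformers` auto classes. A minimal sketch, assuming the checkpoint is published on the Hugging Face Hub under the repo id `asafaya/kanarya-2b` (an assumption; substitute the actual repo id):

```python
# Minimal sketch: load a Kanarya checkpoint as a causal LM and sample a completion.
# The Hub id "asafaya/kanarya-2b" is an assumption; adjust to the actual repo.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "asafaya/kanarya-2b"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Benim adım"  # Turkish: "My name is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling (`do_sample=True`) gives varied continuations; for deterministic output, drop it in favor of greedy decoding.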