Update README.md
README.md CHANGED
@@ -40,7 +40,7 @@ GPT-2/GPT-3.
 
 ## Training data
 
-GPT-J 6B was pretrained on the [Pile](pile.eleuther.ai), a large scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it's finetuned on
+GPT-J 6B was pretrained on the [Pile](pile.eleuther.ai), a large scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it's finetuned on our Japanese storytelling dataset. Check our blog post for more details.
 
 ### How to use
 