language:
- ja
---

# Japanese Stable LM Base Gamma 7B

## Model Description

This is a 7B-parameter decoder-only language model with a focus on maximizing Japanese language modeling performance and Japanese downstream task performance.
We conducted continued pretraining using Japanese data on the English language model, [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1), to transfer the model's knowledge and capabilities to Japanese.

*If you are looking for an instruction-following model, check [Japanese Stable LM Instruct Gamma 7B](https://huggingface.co/stabilityai/japanese-stablelm-instruct-gamma7b)*.
## Usage
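The usage snippet is truncated in this excerpt; the only line visible is the final `print(tokenizer.decode(tokens[0], skip_special_tokens=True))`. Below is a minimal sketch of the standard `transformers` loading-and-generation pattern that such a snippet typically follows. The model id `stabilityai/japanese-stablelm-base-gamma-7b`, the prompt, and the sampling parameters are assumptions not shown in this excerpt; verify them against the model repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id inferred from the model name; not shown in this excerpt.
model_name = "stabilityai/japanese-stablelm-base-gamma-7b"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
model.eval()

# Example Japanese prompt ("To accelerate scientific research with AI, ...").
prompt = "AI で科学研究を加速するには、"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    tokens = model.generate(
        input_ids,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.75,
        top_p=0.95,
    )

# This final line is the one visible in the truncated snippet.
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```

Note that loading a 7B-parameter model in full precision requires substantial memory; `torch_dtype="auto"` lets `transformers` pick the dtype stored in the checkpoint.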
## Model Details

* **Developed by**: [Stability AI](https://stability.ai/)
* **Model type**: `Japanese Stable LM Base Gamma 7B` model is an auto-regressive language model based on the transformer decoder architecture.
* **Language(s)**: Japanese
* **License**: This model is licensed under [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0).
* **Contact**: For questions and comments about the model, please email `[email protected]`