Update README.md
README.md CHANGED

@@ -24,7 +24,7 @@ datasets:
 
 `Stable LM 2 12B` is a 12.1 billion parameter decoder-only language model pre-trained on 2 trillion tokens of diverse multilingual and code datasets for two epochs.
 
-Please note: For commercial use, please refer to https://stability.ai/
+Please note: For commercial use, please refer to https://stability.ai/license.
 
 ## Usage
 
@@ -85,7 +85,7 @@ print(tokenizer.decode(tokens[0], skip_special_tokens=True))
 * **Language(s)**: English
 * **Paper**: [Stable LM 2 Technical Report](https://arxiv.org/abs/2402.17834)
 * **Library**: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
-* **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-12b/blob/main/LICENSE).
+* **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-12b/blob/main/LICENSE.md).
 * **Commercial License**: to use this model commercially, please refer to https://stability.ai/membership
 * **Contact**: For questions and comments about the model, please email `[email protected]`