sippycoder committed 9cafadd
Parent(s): 883ff9f
Update README.md

README.md CHANGED
@@ -13,7 +13,7 @@ language:
 
 ## What about Nucleus-22B-token-500B?
 
-* **It performs well compared to similar-size open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1) etc.), thanks to being trained on
+* **It performs well compared to similar-size open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1) etc.), thanks to being trained on 500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
 * **It is made available under an MIT license**.
 * **It is trained by a small team of four passionate for Open Source**
 