Upload README.md with huggingface_hub
README.md CHANGED
@@ -9,8 +9,8 @@ datasets:
 - allenai/dolma
 ---
 
-# Gemstone-
+# Gemstone-1536x50
 
-Gemstone-
+Gemstone-1536x50 is part of the [Gemstone Suite of Models](https://huggingface.co/collections/tomg-group-umd/gemstone-models-679408ee3f19f1d4d00e8b10), a set of models trained with varying widths and depths.
 
 ## Training
 We train using [litgpt](https://github.com/Lightning-AI/litgpt) and [AxoNN](https://github.com/axonn-ai/litgpt) on AMD MI250X GPUs on [Frontier](https://www.olcf.ornl.gov/olcf-resources/compute-systems/frontier/) at Oak Ridge National Laboratory with a global batch size of 2048.
@@ -19,7 +19,7 @@ We train using [litgpt](https://github.com/Lightning-AI/litgpt) and [AxoNN](http
 Training and validation data are taken from non-overlapping subsets of [dolma](https://huggingface.co/datasets/allenai/dolma); as such, it is _not_ an instruction model.
 This model is trained for 350 billion tokens; we upload checkpoints every 2 billion tokens (477 steps).
 
-## Using Gemstone-
+## Using Gemstone-1536x50
 The Gemstones are based on the [gemma-2b](https://huggingface.co/google/gemma-2b) architecture and use [modeling_gemma.py](https://github.com/huggingface/transformers/blob/main/src/transformers/models/gemma/modeling_gemma.py) to run with the transformers library.
 
 ## Licence
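For a quick sanity check of the figures in the Training section: a global batch size of 2048 together with a cadence of 2 billion tokens per 477-step checkpoint interval implies roughly 4.2M tokens per optimizer step. The sketch below backs out the implied sequence length, assuming each step processes `global_batch_size × sequence_length` tokens (an assumption; the card does not state this explicitly):

```python
# Sanity-check the training figures quoted in the card.
# Assumption (not stated in the README): each optimizer step processes
# global_batch_size * sequence_length tokens.
global_batch_size = 2048        # "global batch size of 2048"
steps_per_checkpoint = 477      # "every 2 billion tokens (477 steps)"
tokens_per_checkpoint = 2e9

tokens_per_step = tokens_per_checkpoint / steps_per_checkpoint
print(f"tokens per step:         {tokens_per_step:>12,.0f}")   # ~4,192,872

implied_seq_len = tokens_per_step / global_batch_size
print(f"implied sequence length: {implied_seq_len:>12,.0f}")   # ~2047, i.e. ~2048

total_steps = 350e9 / tokens_per_step  # 350B-token run -> ~83,475 steps
print(f"approx. total steps:     {total_steps:>12,.0f}")
```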
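Since the card says the Gemstones run through the stock Gemma code path in transformers, a minimal loading sketch looks like the following. The repo id and the checkpoint branch name are assumptions (inferred from the model and collection names, not stated in the diff), so verify them against the repository on the Hub:

```python
# Minimal sketch, not an official snippet: load a Gemstone with transformers.
# Assumptions (not in the card): the repo id below, and that intermediate
# checkpoints are exposed as git revisions of the same repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "tomg-group-umd/Gemstone-1536x50"  # assumed from the collection name

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

# To pin an intermediate checkpoint (uploaded every 2B tokens), pass a
# revision; the branch naming scheme here is a guess -- check the repo's refs:
# model = AutoModelForCausalLM.from_pretrained(repo_id, revision="step_00000477")

# This is a base model trained on dolma, not instruction-tuned, so prompt it
# for plain continuation rather than chat.
inputs = tokenizer("The Frontier supercomputer at Oak Ridge", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`revision=` is the standard transformers mechanism for selecting a specific git ref of a Hub repo, which is how per-checkpoint branches, if present, would be addressed.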