---
datasets:
- allenai/dolma
language:
- en
library_name: transformers
license: apache-2.0
tags:
- causal-lm
---
## Model Details
### Training
Models were trained with [litgpt](https://github.com/Lightning-AI/litgpt) and [AxoNN](https://github.com/axonn-ai/axonn) on AMD MI250 GPUs.
### Data
Training and validation data are drawn from non-overlapping subsets of [dolma](https://huggingface.co/datasets/allenai/dolma).
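A minimal sketch of how non-overlapping train/validation subsets can be produced; the function name, fraction, and seed below are illustrative assumptions, not the actual preprocessing used for this model.

```python
import random

def disjoint_split(items, val_fraction=0.05, seed=0):
    """Shuffle documents and split them into non-overlapping
    train/validation subsets (hypothetical helper for illustration)."""
    items = list(items)
    rng = random.Random(seed)  # fixed seed for a reproducible split
    rng.shuffle(items)
    n_val = max(1, int(len(items) * val_fraction))
    # validation takes the first n_val shuffled items; train takes the rest,
    # so the two subsets are disjoint by construction
    return items[n_val:], items[:n_val]

docs = [f"doc-{i}" for i in range(1000)]
train, val = disjoint_split(docs)
assert not set(train) & set(val)  # subsets do not overlap
```

Because the split partitions a single shuffled list, every document lands in exactly one subset, which is what "non-overlapping" guarantees here.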