---
datasets:
- cerebras/SlimPajama-627B
tags:
- ggml
---
# BTLM-3B GGML

Bittensor Language Model (BTLM-3B-8k-base) is a 3 billion parameter language model with an 8k context length, trained on 627B tokens of SlimPajama.
> This is only a conversion of the model to GGML format; the architecture is not yet implemented in ggml, so the model cannot be run for inference yet. Stay tuned!
Ref: https://huggingface.co/cerebras/btlm-3b-8k-base

GGML issue: https://github.com/ggerganov/ggml/issues/427
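
Until ggml support lands, the converted GGML file can still be downloaded from the Hub, for example with `huggingface_hub`. This is a minimal sketch: the `repo_id` and `filename` below are placeholders, not confirmed names; check this repository's "Files and versions" tab for the actual GGML file name.

```python
# Minimal sketch: fetch the converted GGML weights from the Hugging Face Hub.
# NOTE: repo_id and filename are placeholder assumptions; replace them with the
# actual repository id and GGML file name listed in "Files and versions".
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="your-username/btlm-3b-ggml",       # placeholder repo id
    filename="btlm-3b-8k-base-ggml-f16.bin",    # placeholder file name
)
print(f"GGML file downloaded to: {local_path}")
```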