# STD-BPE-CEREBRAS

A standard CerebrasGPT-111M model using a pretrained Byte-Pair Encoding (BPE) tokenizer. This model serves as a baseline for understanding how pretrained tokenizers perform on Danish text.
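A minimal usage sketch is shown below. It assumes the checkpoint loads with the standard `transformers` auto classes (as Cerebras-GPT checkpoints generally do); the repository id is taken from this page, and the Danish prompt is an illustrative placeholder.

```python
# Sketch: load the model and tokenizer and generate a short Danish
# continuation. Assumes standard AutoModel/AutoTokenizer compatibility.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meelu/STD-BPE-CEREBRAS"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a Danish sentence and generate a short continuation.
inputs = tokenizer("København er hovedstaden i", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```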

Model size: 111M params · Tensor type: FP16 · Format: Safetensors
