bloomz-176B-GPTQ / tokenizer_config.json
{"unk_token": "<unk>", "eos_token": "</s>", "bos_token": "<s>", "pad_token": "<pad>", "name_or_path": "bigscience/tokenizer", "special_tokens_map_file": null, "tokenizer_class":"BloomTokenizerFast", "padding_side":"left"}