plotly_gpt_neo_1_3B / special_tokens_map.json
Commit b6c0a87 by lagodw: full sample tokenizer
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": {"content": "<UNK>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, "sep_token": "<SEP>", "pad_token": "<PAD>"}