agentic-Transformer / tokenizer.json
{
  "tokenizer_type": "bert",
  "vocab_size": 30522,
  "pad_token_id": 0,
  "cls_token_id": 101,
  "sep_token_id": 102,
  "unk_token_id": 100,
  "mask_token_id": 103,
  "max_len": 512,
  "truncation": true,
  "padding": "max_length"
}
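
Below is a minimal sketch of how this config could be consumed with the Hugging Face transformers library, assuming it is meant to pair with the standard bert-base-uncased WordPiece vocabulary (30,522 entries, with [PAD]=0, [UNK]=100, [CLS]=101, [SEP]=102, [MASK]=103). The file name "tokenizer.json" and the sample sentence are placeholders, not part of the original file.

# Sketch: load the config above and apply its settings with the
# `transformers` library. Assumes the config targets bert-base-uncased.
import json

from transformers import BertTokenizerFast

# Read the config shown above.
with open("tokenizer.json") as f:
    config = json.load(f)

# Load the reference BERT tokenizer (assumption: bert-base-uncased).
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Sanity-check that the config's special-token IDs and vocab size
# match the loaded vocabulary.
for name in (
    "pad_token_id",
    "cls_token_id",
    "sep_token_id",
    "unk_token_id",
    "mask_token_id",
):
    assert getattr(tokenizer, name) == config[name], f"{name} mismatch"
assert tokenizer.vocab_size == config["vocab_size"]

# Encode a sample using the config's truncation/padding policy.
encoded = tokenizer(
    "Hello, agentic Transformer!",
    truncation=config["truncation"],
    padding=config["padding"],   # "max_length" pads every sequence to max_len
    max_length=config["max_len"],
)
print(len(encoded["input_ids"]))  # 512

Because "padding" is set to "max_length" rather than the default dynamic padding, every encoded sequence is padded out to 512 tokens, which trades memory for fixed-shape batches.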