xu-song · add more tokenizers (f4973d4)
{
  "model_max_length": 8192,
  "tokenizer_class": "QWenTokenizer",
  "auto_map": {
    "AutoTokenizer": [
      "tokenization_qwen.QWenTokenizer",
      null
    ]
  }
}
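For context, this `tokenizer_config.json` points `AutoTokenizer` at a custom tokenizer class shipped inside the model repository: the `auto_map` entry is a `[slow, fast]` pair, where `"tokenization_qwen.QWenTokenizer"` names class `QWenTokenizer` in the repo file `tokenization_qwen.py`, and `null` means no fast (Rust-backed) tokenizer is provided. A minimal sketch parsing these fields (the config is inlined here for illustration):

```python
import json

# The tokenizer_config.json above, inlined as a string for a self-contained example.
config = json.loads("""
{
  "model_max_length": 8192,
  "tokenizer_class": "QWenTokenizer",
  "auto_map": {
    "AutoTokenizer": [
      "tokenization_qwen.QWenTokenizer",
      null
    ]
  }
}
""")

# auto_map maps the AutoTokenizer entry point to a [slow, fast] class pair.
slow, fast = config["auto_map"]["AutoTokenizer"]

# "tokenization_qwen.QWenTokenizer" -> module file tokenization_qwen.py, class QWenTokenizer.
module, cls = slow.rsplit(".", 1)

print(module)                       # tokenization_qwen
print(cls)                          # QWenTokenizer
print(fast)                         # None (no fast tokenizer shipped)
print(config["model_max_length"])   # 8192
```

Because the tokenizer class lives in the repo rather than in the `transformers` library itself, loading it requires opting into remote code execution, e.g. `AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)`.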