LLaMA-65B-HF / tokenizer_config.json
LLaMA: convert_llama_weights_to_hf.py as of c0f99b4d2ec73090595914dde4c16da207e21d73
{
  "bos_token": "",
  "eos_token": "",
  "model_max_length": 1000000000000000019884624838656,
  "tokenizer_class": "LlamaTokenizer",
  "unk_token": ""
}
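A minimal sketch of how this config can be inspected with the standard `json` module. Note that the `model_max_length` value is `int(1e30)`, the sentinel `transformers` writes when no practical length limit was recorded at conversion time; the empty `bos_token`/`eos_token`/`unk_token` strings are what this revision of `convert_llama_weights_to_hf.py` produced.

```python
import json

# The raw tokenizer_config.json content from this file.
config_text = (
    '{"bos_token": "", "eos_token": "", '
    '"model_max_length": 1000000000000000019884624838656, '
    '"tokenizer_class": "LlamaTokenizer", "unk_token": ""}'
)

config = json.loads(config_text)

# The class name tells transformers which tokenizer implementation to load.
print(config["tokenizer_class"])  # LlamaTokenizer

# model_max_length equals int(1e30): the "effectively unlimited" sentinel,
# so no maximum sequence length was baked into the config.
print(config["model_max_length"] == int(1e30))  # True
```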