Is there no BOS token?

#6
by viktoroo - opened

There have been a number of studies suggesting that a BOS token is very important for Llama models. SmolLM models are architecturally based on Llama, but do they also borrow the addition of a BOS token from Llama?

It seems that, by default, the tokenizer does not add the token. Is this a bug? Should we add it manually when using SmolLM models?

from transformers import AutoTokenizer

model_name = 'HuggingFaceTB/SmolLM2-135M'
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.encode('test', add_special_tokens=True)

This outputs [2129], with no BOS token. By contrast, when the Llama 3 tokenizer is used,

from transformers import AutoTokenizer

model_name = 'meta-llama/Meta-Llama-3-8B'
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.encode('test', add_special_tokens=True)

the output is [128000, 1985]: the BOS token (128000) is prepended.
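In the meantime, one hedged workaround (assuming manual BOS prepending is actually desired for your use case, and that the tokenizer exposes a `bos_token_id` — verify this for your checkpoint) is to prepend the BOS id yourself when the tokenizer does not add it. The `ensure_bos` helper below is a hypothetical name, not part of transformers:

```python
def ensure_bos(ids, bos_id):
    """Prepend bos_id to a token-id list unless it is already first
    or the tokenizer does not define one (bos_id is None)."""
    if bos_id is None or (ids and ids[0] == bos_id):
        return ids
    return [bos_id] + ids


# Usage sketch (requires downloading the model's tokenizer):
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained('HuggingFaceTB/SmolLM2-135M')
# ids = tokenizer.encode('test', add_special_tokens=True)
# ids = ensure_bos(ids, tokenizer.bos_token_id)
```

Note that the helper is idempotent, so it is safe to call even on tokenizers (like Llama 3's) that already add the BOS token.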
