Update transformers snippet

#36 opened by pcuenq (HF staff, Meta Llama org)

Not adding special tokens, because processor.apply_chat_template already adds them; otherwise we'd end up with two BOS tokens.

Thanks to @Narsil for identifying the issue and proposing this workaround.
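For context, here is a minimal sketch of the pattern this change implies, assuming the Llama 3.2 Vision checkpoint and a placeholder image URL (the actual model-card snippet may differ): apply_chat_template already emits the BOS token in the text, so the processor call passes add_special_tokens=False to avoid a second one.

```python
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, MllamaForConditionalGeneration

# Assumed checkpoint for illustration; substitute the repo this PR targets.
model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"

model = MllamaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

url = "https://example.com/image.jpg"  # placeholder image URL
image = Image.open(requests.get(url, stream=True).raw)

messages = [
    {"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": "Describe this image in one sentence."},
    ]}
]

# The templated text already starts with the BOS token, so the processor
# must not prepend another one: hence add_special_tokens=False.
input_text = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(
    image, input_text, add_special_tokens=False, return_tensors="pt"
).to(model.device)

output = model.generate(**inputs, max_new_tokens=30)
print(processor.decode(output[0]))
```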

Reply from a Meta Llama org member:

Hi! Thanks for this PR. Can we change the processor so that it does not add bos_token if the first token is already bos_token? I think many people will easily forget to use the add_special_tokens=False option.
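A rough sketch of the guard being suggested, as a standalone hypothetical helper (the real change would live inside the processor or tokenizer, not in user code): only prepend the BOS token when the text does not already start with it.

```python
def add_bos_if_missing(text: str, bos_token: str = "<|begin_of_text|>") -> str:
    """Prepend bos_token only when the text does not already start with it.

    Hypothetical helper illustrating the suggestion above, assuming the
    Llama 3 BOS token string; not actual transformers code.
    """
    if text.startswith(bos_token):
        return text
    return bos_token + text


# Text produced by apply_chat_template already carries BOS,
# so the helper leaves it untouched instead of duplicating it.
templated = "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\nHi<|eot_id|>"
assert add_bos_if_missing(templated) == templated
```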

Sanyam changed pull request status to merged
