
[7b-it-GGUF] mismatch in special tokens definition

#2 opened by dvappco

The number of defined special tokens differs between what was initially expected and what is currently being used. I'm unsure whether this discrepancy could cause any issues during model inference?

llm_load_vocab: mismatch in special tokens definition ( 544/256128 vs 388/256128 )

I'm not aware of any issues yet.
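For context, the warning means llama.cpp's own count of special-looking tokens disagrees with the number of tokens the GGUF vocab metadata explicitly declares as special; loading still proceeds. A purely illustrative Python sketch of that kind of check (this is not llama.cpp's real code — the angle-bracket heuristic and the toy vocab below are made up for the example; only the CONTROL type code 3 follows the GGUF `token_type` convention):

```python
# Illustrative sketch only -- NOT llama.cpp's actual implementation.
# Idea: compare the number of tokens declared special in the vocab
# metadata against a heuristic count; on disagreement, print a warning
# shaped like the "mismatch in special tokens definition" line above.

CONTROL = 3  # GGUF tokenizer.ggml.token_type value for control tokens


def count_declared_special(token_types):
    # Tokens explicitly typed CONTROL in the vocab metadata.
    return sum(1 for t in token_types if t == CONTROL)


def count_heuristic_special(tokens):
    # Hypothetical heuristic for this example: "<...>"-style tokens.
    return sum(1 for tok in tokens if tok.startswith("<") and tok.endswith(">"))


def check_special_tokens(tokens, token_types):
    heuristic = count_heuristic_special(tokens)
    declared = count_declared_special(token_types)
    n = len(tokens)
    if heuristic != declared:
        print(f"llm_load_vocab: mismatch in special tokens definition "
              f"( {heuristic}/{n} vs {declared}/{n} )")
    return heuristic, declared


# Toy vocab: "<bos>" is declared CONTROL, but "<extra_0>" looks special
# while being typed NORMAL (1) -- so the counts disagree, 2 vs 1.
tokens = ["<bos>", "hello", "<extra_0>", "world"]
token_types = [CONTROL, 1, 1, 1]
check_special_tokens(tokens, token_types)
```

In the real loader the mismatch is informational: the declared token types still win, which is why inference is typically unaffected.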
