Chat Template absent

#1
by DeProgrammer - opened

FYI, there's no tokenizer.chat_template in this GGUF, so running the model with llama.cpp won't give good results (unless they've since hard-coded a template for it; I was using LLamaSharp 0.21 and found it fell back to the ChatML template instead). This is probably the right template: https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501/discussions/9#679bdf377a4232b5aa9f337c
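For anyone who ends up with a GGUF that has no tokenizer.chat_template, one workaround is to tell the runner which template to apply instead of letting it fall back to ChatML. Below is a minimal sketch using llama-cpp-python rather than LLamaSharp (just for illustration); the model filename is hypothetical, and "mistral-instruct" is assumed to be one of that library's registered chat formats.

```python
# Minimal sketch (llama-cpp-python, not LLamaSharp): explicitly pick a
# Mistral-style chat format when the GGUF carries no tokenizer.chat_template,
# so the runner doesn't silently fall back to ChatML.
from llama_cpp import Llama

llm = Llama(
    model_path="Mistral-Small-24B-Instruct-2501-Q4_K_M.gguf",  # hypothetical filename
    chat_format="mistral-instruct",  # assumed registered format name; override the template explicitly
    n_ctx=4096,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```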

This is the base model; if you want a model with a chat template, you want the instruct model here:

https://huggingface.co/bartowski/Mistral-Small-24B-Instruct-2501-GGUF

Oh, sorry! I made the fundamental mistake of downloading the wrong model. 🙃

DeProgrammer changed discussion status to closed
