SUS-Chat not generating output
Hi, I tried setting up the model on my local computer. However, it does not perform exactly as it does on https://huggingface.co/spaces/SUSTech/SUS-Chat-34B. Sometimes it produces the expected output, but sometimes it does not output anything at all. Am I doing something incorrect in the code? Is the model I downloaded the same as the one behind the UI? This is the code I used:
For context, I am opening the CSV file to read a value and compare it against 1000; that comparison is not yet implemented in the code shown here and is only there for testing.
- Use a prompt of the form `### Human: [your prompt here]\n\n### Assistant:` for completion (see the sketch after this list).
- The HF online Space runs the model in bfloat16 precision rather than a llama-cpp quantized build, so there may be some differences between the Space and your local environment (we serve this model with vLLM).
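For reference, here is a minimal sketch of that prompt format with a plain `transformers` call in bfloat16. The generation settings and the example prompt are illustrative, not the exact configuration used by the Space:

```python
# Minimal sketch: "### Human: ... ### Assistant:" completion format,
# loaded in bfloat16 with plain transformers (settings are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "SUSTech/SUS-Chat-34B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # the online Space runs in bfloat16, not a quantized build
    device_map="auto",
)

# Single-turn prompt in the format described above.
prompt = "### Human: What is the capital of France?\n\n### Assistant:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens so the prompt is not echoed back.
response = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)
print(response)
```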
Hi, thank you for replying. I used the code provided in the SUSTech/SUS-Chat-34B Model Card. When I run the same prompt in the UI, the output accuracy is 95%, but when run locally it is 75%. I think the main problem is related to the chat template function. What changes do I need to make in this code to fix the issue?
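Roughly, this is how I am building the conversation string locally (a simplified sketch of the formatting I mean, not my exact code; the helper name is my own):

```python
# Simplified sketch of my local prompt assembly; helper name is illustrative.
def build_prompt(history, question):
    """history: list of (human, assistant) turns; question: the new user message."""
    prompt = ""
    for human, assistant in history:
        prompt += f"### Human: {human}\n\n### Assistant: {assistant}\n\n"
    # Leave the final Assistant turn open so the model completes it.
    prompt += f"### Human: {question}\n\n### Assistant:"
    return prompt

print(build_prompt([], "Is 950 less than 1000?"))
```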
Thanks.