Continually getting nonsense output

#3
by nahsor - opened

When prompted with "hello", I get a bunch of garbage output. It happens every time, even after restarting Ollama and the terminal.

[screenshot: image.png, garbled output]

I have the same issue.

Try turning off flash attention.

Disabling flash attention doesn't help. I think it's because the template is broken:

# ollama show --modelfile hf.co/TheDrummer/Cydonia-24B-v2-GGUF:Q8_0
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM hf.co/TheDrummer/Cydonia-24B-v2-GGUF:Q8_0

FROM /root/.ollama/models/blobs/sha256-9a517a14566ec4c21be623bd464e0f27b1387ac780164ed044ac19c89ad08ac1
TEMPLATE {{ .Prompt }}
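
With TEMPLATE {{ .Prompt }} the user text is passed to the model verbatim, so the model never sees the [INST]/[/INST] control tokens it was trained on. Roughly speaking (illustrative only), a "hello" turn renders as

hello

instead of

[INST]hello[/INST]

which is probably why the output degenerates into garbage.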

For comparison, the modelfile for mistral-small:24b-instruct-2501-q8_0 looks like this:

# ollama show --modelfile mistral-small:24b-instruct-2501-q8_0
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM mistral-small:24b-instruct-2501-q8_0

FROM /root/.ollama/models/blobs/sha256-a58ad27c3b12c567ca6a60806696a7689f9cf929f2a56aa7189c5908e60e7222
TEMPLATE """{{- range $index, $_ := .Messages }}
{{- if eq .Role "system" }}[SYSTEM_PROMPT]{{ .Content }}[/SYSTEM_PROMPT]
{{- else if eq .Role "user" }}
{{- if and (le (len (slice $.Messages $index)) 2) $.Tools }}[AVAILABLE_TOOLS]{{ $.Tools }}[/AVAILABLE_TOOLS]
{{- end }}[INST]{{ .Content }}[/INST]
{{- else if eq .Role "assistant" }}
{{- if .Content }}{{ .Content }}
{{- if not (eq (len (slice $.Messages $index)) 1) }}</s>
{{- end }}
{{- else if .ToolCalls }}[TOOL_CALLS][
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
{{- end }}]</s>
{{- end }}
{{- else if eq .Role "tool" }}[TOOL_RESULTS]{"content": {{ .Content }}}[/TOOL_RESULTS]
{{- end }}
{{- end }}"""
SYSTEM You are Mistral Small 3, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris. Your knowledge base was last updated on 2023-10-01. When you're not sure about some information, you say that you don't have the information and don't make up anything. If the user's question is not clear, ambiguous, or does not provide enough context for you to accurately answer the question, you do not try to answer it right away and you rather ask the user to clarify their request (e.g. "What are some good restaurants around me?" => "Where are you?" or "When is the next flight to Tokyo" => "Where do you travel from?")
PARAMETER temperature 0.15

This works:

# ollama show --modelfile Cydonia-24B-v2-Q8_0                 
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM Cydonia-24B-v2-Q8_0:latest

FROM /root/.ollama/models/blobs/sha256-9a517a14566ec4c21be623bd464e0f27b1387ac780164ed044ac19c89ad08ac1
TEMPLATE """{{- range $index, $_ := .Messages }}
{{- if eq .Role "system" }}[SYSTEM_PROMPT]{{ .Content }}[/SYSTEM_PROMPT]
{{- else if eq .Role "user" }}
{{- if and (le (len (slice $.Messages $index)) 2) $.Tools }}[AVAILABLE_TOOLS]{{ $.Tools }}[/AVAILABLE_TOOLS]
{{- end }}[INST]{{ .Content }}[/INST]
{{- else if eq .Role "assistant" }}
{{- if .Content }}{{ .Content }}
{{- if not (eq (len (slice $.Messages $index)) 1) }}</s>
{{- end }}
{{- else if .ToolCalls }}[TOOL_CALLS][
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
{{- end }}]</s>
{{- end }}
{{- else if eq .Role "tool" }}[TOOL_RESULTS]{"content": {{ .Content }}}[/TOOL_RESULTS]
{{- end }}
{{- end }}"""

Any fix for this, @TheDrummer?

@nahsor you need to create a new Modelfile and then create a new custom model based on it. https://github.com/ollama/ollama/blob/main/docs/modelfile.md
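
Something like this should work (untested sketch; the custom model name is just an example):

ollama show --modelfile hf.co/TheDrummer/Cydonia-24B-v2-GGUF:Q8_0 > Modelfile
# edit Modelfile: replace "TEMPLATE {{ .Prompt }}" with the full Mistral template
# posted above; keep the FROM line, or swap it for the hf.co reference suggested
# in the header comment
ollama create Cydonia-24B-v2-Q8_0 -f Modelfile
ollama run Cydonia-24B-v2-Q8_0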

@TheDrummer, in my opinion the best solution would be to publish a tested and working version in the Ollama library, with a model card etc.
