Update README.md
README.md
CHANGED
@@ -15,6 +15,9 @@ library_name: transformers

ZeroXClem/Qwen2.5-7B-Qandora-CySec is an advanced model merge combining Q&A capabilities and cybersecurity expertise using the mergekit framework. This model excels in both general question-answering tasks and specialized cybersecurity domains.

### Quants

GGUF quantizations of ZeroXClem/Qwen2.5-7B-Qandora-CySec can be [found here](https://huggingface.co/models?other=base_model:quantized:ZeroXClem/Qwen2.5-7B-Qandora-CySec).

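To pull one of those GGUF files locally (for the Ollama setup below, for example), the Hugging Face CLI works. This is only a sketch: the repo id and filename are placeholders, so substitute whichever quant repo and file you pick from the list above.

```bash
# Placeholder repo id and filename: replace with the quant repo/file you actually choose.
huggingface-cli download <quant-repo-id> qwen2.5-7b-qandora-cysec-q5_0.gguf --local-dir .
```
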
## Model Components

- **[bunnycore/QandoraExp-7B](https://huggingface.co/bunnycore/QandoraExp-7B)**: Powerful Q&A capabilities

@@ -56,7 +59,70 @@ dtype: bfloat16

2. Cybersecurity Analysis
3. Hybrid Scenarios (general knowledge + cybersecurity)

## Ollama Model Card

The [GGUF quantized versions](https://huggingface.co/models?other=base_model:quantized:ZeroXClem/Qwen2.5-7B-Qandora-CySec) can be used directly in Ollama with the following model card. Simply save it as `Modelfile` in the same directory as the GGUF file.

```Modelfile
FROM ./qwen2.5-7b-qandora-cysec-q5_0.gguf

# sampling parameters (higher temperature is more creative, lower is more coherent)
PARAMETER temperature 0.7
PARAMETER top_p 0.8
PARAMETER repeat_penalty 1.05
PARAMETER top_k 20

TEMPLATE """{{ if .Messages }}
{{- if or .System .Tools }}<|im_start|>system
{{ .System }}
{{- if .Tools }}

# Tools

You are provided with function signatures within <tools></tools> XML tags:
<tools>{{- range .Tools }}
{"type": "function", "function": {{ .Function }}}{{- end }}
</tools>

For each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:
<tool_call>
{"name": <function-name>, "arguments": <args-json-object>}
</tool_call>
{{- end }}<|im_end|>
{{ end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 -}}
{{- if eq .Role "user" }}<|im_start|>user
{{ .Content }}<|im_end|>
{{ else if eq .Role "assistant" }}<|im_start|>assistant
{{ if .Content }}{{ .Content }}
{{- else if .ToolCalls }}<tool_call>
{{ range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
{{ end }}</tool_call>
{{- end }}{{ if not $last }}<|im_end|>
{{ end }}
{{- else if eq .Role "tool" }}<|im_start|>user
<tool_response>
{{ .Content }}
</tool_response><|im_end|>
{{ end }}
{{- if and (ne .Role "assistant") $last }}<|im_start|>assistant
{{ end }}
{{- end }}
{{- else }}
{{- if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ end }}{{ .Response }}{{ if .Response }}<|im_end|>{{ end }}"""

# set the system message
SYSTEM """You are Qwen, merged by ZeroXClem. As such, you are a high quality assistant that excels in general question-answering tasks, code generation, and specialized cybersecurity domains."""
```
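
Once the GGUF file and the `Modelfile` sit in the same directory, the model can be registered and run with Ollama. This is a quick sketch; the tag `qwen2.5-7b-qandora-cysec` is just an example name you can choose freely.

```bash
# Build an Ollama model from the Modelfile in the current directory.
ollama create qwen2.5-7b-qandora-cysec -f Modelfile

# Start chatting with it.
ollama run qwen2.5-7b-qandora-cysec "Explain the difference between XSS and CSRF."
```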

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM