---
base_model: mesolitica/malaysian-mistral-7b-32k-instructions-v2
inference: false
model_creator: mesolitica
model_name: Malaysian Mistral 7B 32k Instructions v2
model_type: mistral
pipeline_tag: text-generation
prompt_template: '<s>[INST] {prompt} [/INST]

  '
quantized_by: prsyahmi
tags:
- finetuned
---
<!-- markdownlint-disable MD041 -->

<!-- header start --><!-- header end -->

# Malaysian Mistral 7B 32k Instructions - GGUF
- Model creator: [mesolitica](https://huggingface.co/mesolitica)
- Original model: [Malaysian Mistral 7B 32k Instructions v2](https://huggingface.co/mesolitica/malaysian-mistral-7b-32k-instructions-v2)

<!-- description start -->
## Introduction
This repo contains GGUF-format model files for [mesolitica's Malaysian Mistral 7B 32k Instructions v2](https://huggingface.co/mesolitica/malaysian-mistral-7b-32k-instructions-v2).

GGUF is the model format used by llama.cpp, which is written in C/C++. Because it depends on very little other software, it is lighter than the average Python application.

<!-- description end -->
30
+
31
+ <!-- prompt-template start -->
32
+ ## Prompt template: Mistral
33
+ ```
34
+ <s>[INST] {prompt} [/INST]
35
+
36
+ ```
37
+
38
+ <!-- prompt-template end -->
39
+
40
+
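The template above can also be applied programmatically before sending text to the model; a minimal sketch (the `format_prompt` helper is illustrative, not part of this repo):

```python
def format_prompt(prompt: str) -> str:
    """Wrap a user prompt in the Mistral instruction template used by this model."""
    return f"<s>[INST] {prompt} [/INST]\n"


# Example usage
print(format_prompt("Apakah ibu negara Malaysia?"))
```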
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quant method | File size |
| ---- | ---- | ---- |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q2_K.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q2_K.gguf) | Q2_K | 2.86 GB |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q3_K_M.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q3_K_M.gguf) | Q3_K_M | 3.27 GB |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q4_K_S.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q4_K_S.gguf) | Q4_K_S | 3.86 GB |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q4_K_M.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q4_K_M.gguf) | Q4_K_M | 4.06 GB |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q5_K_M.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q5_K_M.gguf) | Q5_K_M | 4.77 GB |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q6_K.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q6_K.gguf) | Q6_K | 5.53 GB |
| [malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.fp16.gguf](https://huggingface.co/prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF/blob/main/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.fp16.gguf) | FP16 | 13.5 GB |

<!-- README_GGUF.md-provided-files end -->
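One way to try these files is with llama.cpp's command-line tool; a sketch, assuming llama.cpp has been built locally and `huggingface_hub` is installed (the choice of the Q4_K_M file, the prompt, and the `-n` value are illustrative):

```shell
# Download one quant file from this repo
huggingface-cli download prsyahmi/malaysian-mistral-7b-32k-instructions-v2-ckpt-1700-GGUF \
  malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q4_K_M.gguf --local-dir .

# Run a generation using the Mistral prompt template above
./llama-cli -m malaysian-mistral-7b-32k-instructions-v2-ckpt-1700.Q4_K_M.gguf \
  -p "<s>[INST] Apakah ibu negara Malaysia? [/INST]" -n 256
```

Smaller quants (Q2_K, Q3_K_M) trade answer quality for lower memory use; the FP16 file is unquantized and needs the most RAM.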

## Acknowledgements
Thank you to Husein Zolkepli and the whole [mesolitica](https://huggingface.co/mesolitica) team!

Thanks to their work, we can use and experiment with locally built AI.

<!-- footer end -->

-------

<!-- original-model-card start -->

<!-- original-model-card end -->