pwei07 committed
Commit ae0ff3e
1 parent: a634fa0

Upload folder using huggingface_hub

Files changed (1): README.md ADDED (+42 −0)
---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- /content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained
- /content/drive/MyDrive/llama3_label_rationale_pretrained3
---

# Llama-3-8B-NLI-ties2

Llama-3-8B-NLI-ties2 is a TIES merge of the following models using [mergekit](https://github.com/cg123/mergekit). Both are local checkpoints, so they are listed as paths rather than Hub links:
* `/content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained`
* `/content/drive/MyDrive/llama3_label_rationale_pretrained3`

## 🧩 Configuration

```yaml
models:
  - model: /content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained
    parameters:
      density: 1
      weight: 0.5
  - model: /content/drive/MyDrive/llama3_label_rationale_pretrained3
    parameters:
      density: 1
      weight: 0.5
  # - model: WizardLM/WizardMath-13B-V1.0
  #   parameters:
  #     density: 0.33
  #     weight:
  #       - filter: mlp
  #         value: 0.5
  #       - value: 0
merge_method: ties
base_model: /content/drive/MyDrive/Meta-Llama-3-8B-Instruct
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
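The `merge_method: ties` setting selects TIES-merging (Yadav et al., 2023): for each model, build a task vector (fine-tuned weights minus the base), trim it to its top-`density` fraction by magnitude, elect a sign per parameter, then average only the deltas that agree with the elected sign, re-scaling when `normalize: true`. As a rough illustration of that arithmetic — not mergekit's actual implementation; the function name and toy numbers are made up — here is a pure-Python sketch on flat parameter lists:

```python
# Toy sketch of TIES-merging arithmetic on flat parameter lists.
# Illustrative only -- mergekit operates on real tensors per weight matrix.

def ties_merge(base, models, densities, weights, normalize=True):
    """Merge `models` into `base` with trim / elect-sign / disjoint-merge."""
    n = len(base)
    # 1. Task vectors: each fine-tuned model minus the base.
    deltas = [[m[i] - base[i] for i in range(n)] for m in models]
    # 2. Trim: keep only each model's top-`density` fraction by magnitude.
    trimmed = []
    for d, density in zip(deltas, densities):
        k = max(1, round(density * n))
        cutoff = sorted((abs(x) for x in d), reverse=True)[k - 1]
        trimmed.append([x if abs(x) >= cutoff else 0.0 for x in d])
    # 3. Elect a sign per parameter from the weighted sum of trimmed deltas.
    elected = []
    for i in range(n):
        s = sum(w * t[i] for w, t in zip(weights, trimmed))
        elected.append(1.0 if s >= 0 else -1.0)
    # 4. Disjoint merge: weighted average over deltas agreeing with the sign.
    merged = []
    for i in range(n):
        num = sum(w * t[i] for w, t in zip(weights, trimmed)
                  if t[i] * elected[i] > 0)
        den = (sum(w for w, t in zip(weights, trimmed)
                   if t[i] * elected[i] > 0) if normalize else 1.0)
        merged.append(base[i] + (num / den if den else 0.0))
    return merged

# Two hypothetical fine-tunes of a 4-parameter "base model".
base = [0.0, 1.0, -1.0, 2.0]
m1   = [0.5, 1.5, -1.2, 2.0]   # deltas: +0.5, +0.5, -0.2,  0.0
m2   = [0.3, 0.5, -0.8, 2.4]   # deltas: +0.3, -0.5, +0.2, +0.4
merged = ties_merge(base, [m1, m2], densities=[1.0, 1.0],
                    weights=[0.5, 0.5], normalize=True)
```

With `density: 1`, as in the config above, the trim step keeps every delta, so the merge reduces to sign election plus normalized weighted averaging of the two task vectors.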