MrRobotoAI committed
Commit da484c9 · verified · 1 Parent(s): 6e25cd9

Upload folder using huggingface_hub

README.md CHANGED
@@ -2,6 +2,7 @@
 base_model:
 - MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k
 - Blackroot/Llama-3-LongStory-LORA
+- gradientai/Llama-3-8B-Instruct-Gradient-4194k
 - MrRobotoAI/MrRoboto-BASE-v1-8b-64k
 - Blackroot/Llama-3-LongStory-LORA
 - MrRobotoAI/Llama-3-8B-Uncensored-test8
@@ -24,6 +25,7 @@ This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge m
 ### Models Merged
 
 The following models were included in the merge:
+* [gradientai/Llama-3-8B-Instruct-Gradient-4194k](https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-4194k)
 * [MrRobotoAI/MrRoboto-BASE-v1-8b-64k](https://huggingface.co/MrRobotoAI/MrRoboto-BASE-v1-8b-64k) + [Blackroot/Llama-3-LongStory-LORA](https://huggingface.co/Blackroot/Llama-3-LongStory-LORA)
 * [MrRobotoAI/Llama-3-8B-Uncensored-test8](https://huggingface.co/MrRobotoAI/Llama-3-8B-Uncensored-test8) + [Blackroot/Llama-3-LongStory-LORA](https://huggingface.co/Blackroot/Llama-3-LongStory-LORA)
 
@@ -35,8 +37,8 @@ The following YAML configuration was used to produce this model:
 models:
   - model: MrRobotoAI/MrRoboto-BASE-v1-8b-64k+Blackroot/Llama-3-LongStory-LORA
     parameters:
-      density: 0.5
-      weight: 0.9
+      density: 0.2
+      weight: 0.4
   - model: MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k+Blackroot/Llama-3-LongStory-LORA
     parameters:
       density: 0.5
@@ -45,6 +47,10 @@ models:
     parameters:
       density: 0.5
       weight: 0.9
+  - model: gradientai/Llama-3-8B-Instruct-Gradient-4194k
+    parameters:
+      density: 0.2
+      weight: 0.4
 merge_method: ties
 base_model: MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k+Blackroot/Llama-3-LongStory-LORA
 parameters:
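The substantive change in this commit is to the merge recipe: the MrRoboto-BASE-v1-8b-64k+LongStory endpoint drops from density 0.5 / weight 0.9 to 0.2 / 0.4, and gradientai/Llama-3-8B-Instruct-Gradient-4194k joins the TIES merge at the same 0.2 / 0.4. For intuition about what those two knobs control — `density` is roughly the fraction of each task vector kept after magnitude trimming, `weight` scales its contribution in the sign-elected average — here is a toy NumPy sketch of the TIES idea. It is an illustration only, not mergekit's implementation, and it skips details such as weight normalization; the example reuses two of this config's density/weight pairs.

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    """Toy, per-tensor TIES merge: trim, elect signs, disjoint-mean, add back to base."""
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                              # task vector
        k = max(1, int(round(density * delta.size)))   # keep top-`density` fraction by magnitude
        cutoff = np.sort(np.abs(delta).ravel())[-k]
        trimmed = np.where(np.abs(delta) >= cutoff, delta, 0.0)
        deltas.append(weight * trimmed)                # apply per-model weight
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))             # elected sign per parameter
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    denom = np.maximum(agree.sum(axis=0), 1)           # mean over sign-agreeing models only
    return base + np.where(agree, stacked, 0.0).sum(axis=0) / denom

# Two toy "fine-tunes" merged onto a base, using density/weight pairs from this config.
base = np.zeros(8)
ft_a = base + np.array([ 0.4, -0.1, 0.0, 0.3, 0.0, 0.2, -0.5, 0.1])
ft_b = base + np.array([-0.2,  0.3, 0.1, 0.4, 0.0, 0.0,  0.6, 0.0])
print(ties_merge(base, [ft_a, ft_b], densities=[0.2, 0.5], weights=[0.4, 0.9]))
```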
mergekit_config.yml CHANGED
@@ -1,8 +1,8 @@
 models:
   - model: MrRobotoAI/MrRoboto-BASE-v1-8b-64k+Blackroot/Llama-3-LongStory-LORA
     parameters:
-      density: 0.5
-      weight: 0.9
+      density: 0.2
+      weight: 0.4
   - model: MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k+Blackroot/Llama-3-LongStory-LORA
     parameters:
       density: 0.5
@@ -11,6 +11,10 @@ models:
     parameters:
       density: 0.5
       weight: 0.9
+  - model: gradientai/Llama-3-8B-Instruct-Gradient-4194k
+    parameters:
+      density: 0.2
+      weight: 0.4
 merge_method: ties
 base_model: MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k+Blackroot/Llama-3-LongStory-LORA
 parameters:
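Because mergekit_config.yml ships with the repository, the merge can be re-run from this exact commit. Below is a minimal sketch using mergekit's documented Python entry point; the import paths and `MergeOptions` fields follow mergekit's README and may differ between versions, and the output directory is a placeholder. The CLI equivalent is `mergekit-yaml mergekit_config.yml ./merged --cuda`.

```python
import yaml
import torch

# Import paths follow mergekit's "use as a library" example; they are an
# assumption here and may move between mergekit versions.
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the config that ships with this commit.
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./merged",                   # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        lora_merge_cache="/tmp",  # the "+Blackroot/..." LoRA endpoints are merged here first
        copy_tokenizer=True,
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```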
model-00001-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:54f29aada6dd6f067acc5c0dd9b72090a80f6dfc6725b5fe22ec319cbb01b24e
+oid sha256:898ecff806523e45d7c98fbcc2f1970672db71b9036356c8413ffceec94cbb06
 size 4953586384
model-00002-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9cdfba1abfe545ef05932beb65dcc7e9015688acaf51850c1c5b4652380fd8b9
+oid sha256:c7d8999c7f8b985aa2b8e89843f9642f8f15fae4f381f8c6c920d8d1060108ff
 size 4999819336
model-00003-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:c9daeaa2a9c2b1495784b0ff79df5598deb770c680a46244388b80006bdbe999
+oid sha256:3f02cab3c9a19527bfc112e2fc2cd139697e5df61fd9204cd3b929d87dbb42e0
 size 4915916144
model-00004-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e982af3cbace9a66d6527979233607c631e34b88b5a207afa0792f638302b371
+oid sha256:a30c6f7805c3117167b3cba358a6a412d89c5accbcdfa5dd46a17dcc224c0ce4
 size 1191234472
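For the four weight shards, only the `oid sha256:` pointer values change; the shard sizes are unchanged, consistent with re-running the same 8B merge. To confirm that a local download corresponds to this commit, hash each shard and compare it with the pointer. A stdlib-only sketch is below; the snapshot directory is a placeholder for wherever the repository was downloaded.

```python
import hashlib
from pathlib import Path

# Expected `oid sha256:` values from the LFS pointers in this commit.
EXPECTED = {
    "model-00001-of-00004.safetensors": "898ecff806523e45d7c98fbcc2f1970672db71b9036356c8413ffceec94cbb06",
    "model-00002-of-00004.safetensors": "c7d8999c7f8b985aa2b8e89843f9642f8f15fae4f381f8c6c920d8d1060108ff",
    "model-00003-of-00004.safetensors": "3f02cab3c9a19527bfc112e2fc2cd139697e5df61fd9204cd3b929d87dbb42e0",
    "model-00004-of-00004.safetensors": "a30c6f7805c3117167b3cba358a6a412d89c5accbcdfa5dd46a17dcc224c0ce4",
}

def sha256_of(path: Path, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Stream the file through SHA-256 so multi-GB shards hash in constant memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

snapshot_dir = Path("./snapshot")  # placeholder: local download of this repo revision
for name, expected in EXPECTED.items():
    actual = sha256_of(snapshot_dir / name)
    print(f"{name}: {'OK' if actual == expected else 'MISMATCH'}")
```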