---
base_model:
- anthracite-org/magnum-v3-9b-customgemma2
- nbeerbower/gemma2-gutenberg-9B
- grimjim/Magnolia-v1-Gemma2-8k-9B
- UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3
- BeaverLegacy/Smegmma-Deluxe-9B-v1
- ifable/gemma-2-Ifable-9B
library_name: transformers
tags:
- mergekit
- merge

---

![Image from google images](https://cdn-lfs-us-1.hf.co/repos/18/09/180999b41a1608d2b6cc42a0390d6443b458650f46f9272f446133b029c7c3e1/da5496d25fce344d4251a87cc4dae68b39c80251ebb51f246e3f3f7e94dcdf8c?response-content-disposition=inline%3B+filename*%3DUTF-8%27%27aster.jpg%3B+filename%3D%22aster.jpg%22%3B&response-content-type=image%2Fjpeg&Expires=1729458155&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTcyOTQ1ODE1NX19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy11cy0xLmhmLmNvL3JlcG9zLzE4LzA5LzE4MDk5OWI0MWExNjA4ZDJiNmNjNDJhMDM5MGQ2NDQzYjQ1ODY1MGY0NmY5MjcyZjQ0NjEzM2IwMjljN2MzZTEvZGE1NDk2ZDI1ZmNlMzQ0ZDQyNTFhODdjYzRkYWU2OGIzOWM4MDI1MWViYjUxZjI0NmUzZjNmN2U5NGRjZGY4Yz9yZXNwb25zZS1jb250ZW50LWRpc3Bvc2l0aW9uPSomcmVzcG9uc2UtY29udGVudC10eXBlPSoifV19&Signature=LR4Qtaxn8KGxx0sYfP4YqVziM38FcYTAyz0FLB7-PFEG9ffiQVQzNSp0d0sBH1CHEOxWF-A8-yyRxau9hUKnXeChYwS5aud8SzpyiU-F0qR9pDkz2dP5MIeU28BuTb4h1GIa2PumTNAte74G5-komB23YS0V1YRcfXhhd8vphG0HKjq24aJW6f2cDqUQ%7E6i9BsYvgzkXKWGPHwLPr%7EhjuB%7EI4QKbnryJXpCDMda52n3auwgEHPhQb%7E7BETVjhzTATW2eBBZCRoXIrlxH92sJhknA7LKtSgNFhHEke8FZzosfNS12Sk41e39HJB9DC4dc4KPLRZr5Tbdcz88uq1vmqw__&Key-Pair-Id=K24J24Z295AEI9)

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was built in two stages: I first used the SLERP merge method to create an intermediate model,
then applied the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using that SLERP result as the base.
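For intuition, SLERP interpolates along the arc between two weight tensors rather than along the straight line, which preserves the magnitude of the blend better than plain averaging. A minimal sketch (the `slerp` helper here is hypothetical and simplified; mergekit's actual implementation handles per-tensor edge cases):

```python
import numpy as np

# Simplified SLERP between two flattened weight tensors.
# Hypothetical helper, not mergekit's real code.
def slerp(a, b, t):
    a_n = a / np.linalg.norm(a)
    b_n = b / np.linalg.norm(b)
    # Angle between the two weight directions.
    omega = np.arccos(np.clip(np.dot(a_n, b_n), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        # Near-parallel tensors: fall back to linear interpolation.
        return (1 - t) * a + t * b
    return (np.sin((1 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)
```

At `t=0` this returns the first model's tensor, at `t=1` the second's, and in between it sweeps along the arc connecting them.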

The idea was to make a nice and smart base model and add in a few pinches of spice.

For some reason mergekit wouldn't let me use any other merge method: it threw ModelReference errors
about my intermediate model for every method except Model Stock. I'll see if I can fix that and
upload my intended task-arithmetic version as a v2.

Of my roughly 700 merges, this is the only one I think uses something novel or interesting
enough in its creation to merit an upload.

Named after the **aster**, a purple-violet star-shaped perennial flower. It's pretty and has a huge family, much like this model.

### Models Merged

The following models were included in the merge:
* [anthracite-org/magnum-v3-9b-customgemma2](https://huggingface.co/anthracite-org/magnum-v3-9b-customgemma2)
* [nbeerbower/gemma2-gutenberg-9B](https://huggingface.co/nbeerbower/gemma2-gutenberg-9B)
* [grimjim/Magnolia-v1-Gemma2-8k-9B](https://huggingface.co/grimjim/Magnolia-v1-Gemma2-8k-9B)
* [UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3](https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3)
* [BeaverLegacy/Smegmma-Deluxe-9B-v1](https://huggingface.co/BeaverLegacy/Smegmma-Deluxe-9B-v1)
* [ifable/gemma-2-Ifable-9B](https://huggingface.co/ifable/gemma-2-Ifable-9B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# THIS YAML CONFIGURATION WAS USED TO CREATE THE INTERMEDIATE MODEL.
# slices:
#   - sources:
#     - model: anthracite-org/magnum-v3-9b-customgemma2
#       layer_range: [0, 42]
#     - model: nbeerbower/gemma2-gutenberg-9B
#       layer_range: [0, 42]
# merge_method: slerp
# base_model: nbeerbower/gemma2-gutenberg-9B
# parameters:
#   t:
#     - filter: self_attn
#       value: [0.2, 0.5, 0.4, 0.7, 1]
#     - filter: mlp
#       value: [1, 0.5, 0.3, 0.4, 0.2]
#     - value: 0.5
# dtype: float16

# THIS YAML CONFIGURATION WAS USED TO CREATE ASTER. The E: model is the intermediate
# model created in the previous config.
models:
  - model: E:/models/mergekit/output/intermediate/
  - model: BeaverLegacy/Smegmma-Deluxe-9B-v1
    parameters:
      weight: 0.3
  - model: ifable/gemma-2-Ifable-9B
    parameters:
      weight: 0.3
  - model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3
    parameters:
      weight: 0.15
  - model: grimjim/Magnolia-v1-Gemma2-8k-9B
    parameters:
      weight: 0.25
merge_method: model_stock
base_model: E:/models/mergekit/output/intermediate/
dtype: float16
```
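The commented SLERP block above gives `t` as a short list per filter, e.g. `[0.2, 0.5, 0.4, 0.7, 1]` for `self_attn`. My understanding (an assumption about mergekit internals, not verified against its source) is that such a list is expanded into one interpolation value per layer: the listed values act as evenly spaced anchors across the layer range, with intermediate layers filled in by linear interpolation. A sketch of that expansion:

```python
import numpy as np

# Hypothetical sketch of expanding a short gradient list like
# t: [0.2, 0.5, 0.4, 0.7, 1] into one value per layer.
# Assumption about mergekit's behavior, not its actual code.
def expand_gradient(values, num_layers):
    # Place the anchor values evenly across the layer indices.
    anchors = np.linspace(0, num_layers - 1, num=len(values))
    # Linearly interpolate to get a t for every layer.
    return np.interp(np.arange(num_layers), anchors, values)

t_self_attn = expand_gradient([0.2, 0.5, 0.4, 0.7, 1.0], 42)
```

Under that reading, the first layers of the intermediate model lean toward gutenberg's attention (`t` near 0.2 keeps the base) while the last layers lean toward magnum (`t` near 1), and the MLP schedule runs the opposite way.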

Alright, now back to smashing models together and seeing what happens...