---
base_model:
- testmoto/gemma-2-9b-lora-aozora
- google/gemma-2-9b
- testmoto/gemma-2-9b-platypus-02
- testmoto/gemma-2-9b-synthetic_coding
- testmoto/gemma-2-9b-lora-0
library_name: transformers
tags:
- mergekit
- merge

---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [google/gemma-2-9b](https://huggingface.co/google/gemma-2-9b) as the base model.
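Briefly, DARE randomly drops a fraction of each fine-tuned model's parameter deltas (the `density` values in the configuration below control how much is kept) and rescales the surviving entries, while TIES resolves sign conflicts between the remaining deltas before adding the weighted result back onto the base model. The snippet below is a minimal, illustrative sketch of that idea only; it is not mergekit's implementation, and the helper names are made up for this example.

```python
# Illustrative sketch of DARE-TIES on per-tensor "task vectors" (fine-tuned - base).
# NOT mergekit's implementation; function names and simplifications are assumptions.
import torch

def dare_prune(delta: torch.Tensor, density: float) -> torch.Tensor:
    """DARE: randomly keep `density` of the delta entries, rescale survivors by 1/density."""
    mask = torch.rand_like(delta) < density
    return delta * mask / density

def ties_merge(deltas: list[torch.Tensor], weights: list[float]) -> torch.Tensor:
    """TIES-style sign election: keep only entries agreeing with the weighted majority sign."""
    stacked = torch.stack([w * d for w, d in zip(weights, deltas)])
    majority_sign = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == majority_sign
    return (stacked * agree).sum(dim=0)

# Toy example: merge two fine-tuned tensors onto a shared base tensor.
base = torch.zeros(4, 4)
ft_a = base + 0.1 * torch.randn(4, 4)
ft_b = base + 0.1 * torch.randn(4, 4)

deltas = [dare_prune(ft - base, density=0.53) for ft in (ft_a, ft_b)]
merged = base + ties_merge(deltas, weights=[0.6, 0.4])
```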

### Models Merged

The following models were included in the merge:
* [testmoto/gemma-2-9b-lora-aozora](https://huggingface.co/testmoto/gemma-2-9b-lora-aozora)
* [testmoto/gemma-2-9b-platypus-02](https://huggingface.co/testmoto/gemma-2-9b-platypus-02)
* `./fused_model` (local model path)
* [testmoto/gemma-2-9b-synthetic_coding](https://huggingface.co/testmoto/gemma-2-9b-synthetic_coding)
* [testmoto/gemma-2-9b-lora-0](https://huggingface.co/testmoto/gemma-2-9b-lora-0)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: google/gemma-2-9b
    # No parameters necessary for base model
  - model: ./fused_model
    parameters:
      density: 0.53
      weight: 0.40
  - model: testmoto/gemma-2-9b-lora-0
    parameters:
      density: 0.53
      weight: 0.20
  - model: testmoto/gemma-2-9b-platypus-02
    parameters:
      density: 0.53
      weight: 0.15
  - model: testmoto/gemma-2-9b-lora-aozora
    parameters:
      density: 0.53
      weight: 0.15
  - model: testmoto/gemma-2-9b-synthetic_coding
    parameters:
      density: 0.53
      weight: 0.10
merge_method: dare_ties
base_model: google/gemma-2-9b
parameters:
  int8_mask: true
dtype: bfloat16

```
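The merge can be reproduced by saving this configuration to a file (e.g. `config.yaml`) and running it through mergekit's `mergekit-yaml` command-line tool. Once the merged weights are available, they load like any other Gemma-2 checkpoint with Transformers. The example below is a hedged sketch: `./merged_model` is a placeholder path for wherever the merge output is stored.

```python
# Example of loading the merged checkpoint with Transformers.
# "./merged_model" is a placeholder; point it at the actual merge output or a Hub repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged_model"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```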