---
base_model:
  - flammenai/Mahou-1.3-mistral-nemo-12B
  - nbeerbower/mistral-nemo-gutenberg-12B-v3
library_name: transformers
tags:
  - mergekit
  - merge
---

6bpw quant of: https://huggingface.co/nbeerbower/Mahou-Gutenberg-Nemo-12B
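
A minimal, untested loading sketch: the bits-per-weight figure suggests an ExLlamaV2 (EXL2) quantization, so this assumes the `exllamav2` library is installed. The local model path, prompt, and token budget are placeholders, not values from this card.

```python
# Sketch of loading an EXL2 quant with exllamav2 (assumption: this repo is an
# ExLlamaV2 quantization, as the bpw figure implies). Paths are placeholders.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "path/to/Mahou-Gutenberg-Nemo-12B-6bpw"  # local download of this repo

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy alloc, filled during autosplit load
model.load_autosplit(cache)               # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Once upon a time,", max_new_tokens=128))
```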

# Mahou-Gutenberg-Nemo-12B

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the della_linear merge method, with flammenai/Mahou-1.3-mistral-nemo-12B as the base.

### Models Merged

The following models were included in the merge:

- nbeerbower/mistral-nemo-gutenberg-12B-v3

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: flammenai/Mahou-1.3-mistral-nemo-12B
    parameters:
      weight: 0.5
      density: 0.8
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v3
    parameters:
      weight: 0.5
      density: 0.8
merge_method: della_linear
base_model: flammenai/Mahou-1.3-mistral-nemo-12B
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
tokenizer_source: union
```
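
The merge itself (before quantization) can presumably be reproduced by feeding this YAML to mergekit's `mergekit-yaml` entry point. A minimal sketch, assuming mergekit is installed, the config above is saved as `config.yaml`, and a CUDA device is available; the output directory name is illustrative.

```python
# Sketch of reproducing the unquantized merge via mergekit's CLI entry point.
# Assumes mergekit is installed and the YAML above is saved as config.yaml.
import subprocess

subprocess.run(
    ["mergekit-yaml", "config.yaml", "./Mahou-Gutenberg-Nemo-12B", "--cuda"],
    check=True,
)
```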