---
base_model:
  - bamec66557/MISCHIEVOUS-12B-Mix_0.4v
  - Infermatic/MN-12B-Inferor-v0.1
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.
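For intuition, spherical linear interpolation (SLERP) blends two weight tensors along the arc between them rather than along a straight line, which preserves their magnitude better than plain averaging. A minimal NumPy sketch of the formula (illustration only; mergekit's implementation handles per-tensor normalization and edge cases differently):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between vectors v0 and v1 at fraction t."""
    # Normalized copies, used only to measure the angle between the vectors
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Nearly colinear vectors: fall back to linear interpolation
    if abs(dot) > 0.9995:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)          # angle between the two vectors
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1
```

At `t = 0` this returns the first model's weights, at `t = 1` the second's; intermediate values move along the arc between them.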

### Models Merged

The following models were included in the merge:

- bamec66557/MISCHIEVOUS-12B-Mix_0.4v
- Infermatic/MN-12B-Inferor-v0.1

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: bamec66557/MISCHIEVOUS-12B-Mix_0.4v
        layer_range: [0, 20]
      - model: Infermatic/MN-12B-Inferor-v0.1
        layer_range: [0, 20]
    parameters:
      t:
        - value: 0.8

  - sources:
      - model: bamec66557/MISCHIEVOUS-12B-Mix_0.4v
        layer_range: [20, 40]
      - model: Infermatic/MN-12B-Inferor-v0.1
        layer_range: [20, 40]
    parameters:
      t:
        - value: 1.0
        - filter: self_attn
          value: [0.8, 0.9, 1.0, 1.1, 1.2]

merge_method: slerp  # Preserve merge method

base_model: bamec66557/MISCHIEVOUS-12B-Mix_0.4v  # Base model

dtype: bfloat16  # Data type for fast merges

# Additional options
regularization:
  - method: weight_clipping
    clip_range: [-0.1, 0.1]

postprocessing:
  - operation: gaussian_smoothing
    sigma: 1.5  # Gaussian smoothing intensity
  - operation: smoothing
    parameters:
      adaptive: true
      range: [0.8, 1.2]  # Adaptively adjust
      kernel_size: 5  # Larger kernel size smooths over a wider range
  - operation: normalize  # Normalise after merge
```
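As a rough illustration of how the two slices above assign interpolation factors: layers 0–19 use a flat `t = 0.8`, while in layers 20–39 self-attention tensors follow the graded `[0.8, 0.9, 1.0, 1.1, 1.2]` schedule and all other tensors use `t = 1.0`. The sketch below captures that intent only; the function name and the layer-to-schedule mapping are assumptions, not mergekit's exact gradient-resolution logic:

```python
def interpolation_factor(layer: int, tensor_name: str) -> float:
    """Sketch of the t each layer/tensor gets under the config above."""
    if layer < 20:
        return 0.8                       # first slice: flat t for all tensors
    schedule = [0.8, 0.9, 1.0, 1.1, 1.2]
    if "self_attn" in tensor_name:
        # second slice: graded schedule spread across layers 20-39
        idx = min((layer - 20) * len(schedule) // 20, len(schedule) - 1)
        return schedule[idx]
    return 1.0                           # second slice default for other tensors
```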