---
base_model:
  - kromcomp/L3.1-Adumbralv1-12B
  - kromcomp/L3.1-Punicav1-12B
  - kromcomp/L3.1-Eleusis.exv5-12B
  - kromcomp/L3.1-Fragrantv1-12B
library_name: transformers
tags:
  - mergekit
  - merge
---

# flagrant

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [kromcomp/L3.1-Fragrantv1-12B](https://huggingface.co/kromcomp/L3.1-Fragrantv1-12B) as the base.
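
To give a sense of what `dare_ties` does with the `density`, `weight`, and `lambda` values in the configuration below, here is a rough, single-tensor sketch in NumPy. It illustrates the general idea of DARE (drop-and-rescale of task vectors) followed by TIES-style sign consensus; it is not mergekit's exact implementation, and in particular the per-model application of `lambda` here is an assumption made for illustration.

```python
import numpy as np

def dare_ties_merge(base, finetuned, densities, weights, lambdas, seed=0):
    """Simplified, single-tensor illustration of DARE + TIES merging.

    base      : np.ndarray, the base model's parameter tensor.
    finetuned : list of np.ndarray, one tensor per donor model.
    densities, weights, lambdas : per-model floats, as in the YAML config.
    """
    rng = np.random.default_rng(seed)
    deltas = []
    for ft, density, weight, lam in zip(finetuned, densities, weights, lambdas):
        delta = ft - base                            # task vector vs. the base
        mask = rng.random(delta.shape) < density     # DARE: keep ~density of entries at random
        delta = np.where(mask, delta / density, 0.0) # ...and rescale the survivors
        deltas.append(weight * lam * delta)          # per-model weight and lambda scaling (assumed placement)
    stacked = np.stack(deltas)
    # TIES sign election: keep only contributions that agree with the dominant sign per parameter.
    elected_sign = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected_sign
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    # normalize: 0.0 in the config, so the agreeing deltas are summed without renormalizing weights.
    return base + merged_delta
```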

### Models Merged

The following models were included in the merge:

* [kromcomp/L3.1-Eleusis.exv5-12B](https://huggingface.co/kromcomp/L3.1-Eleusis.exv5-12B)
* [kromcomp/L3.1-Punicav1-12B](https://huggingface.co/kromcomp/L3.1-Punicav1-12B)
* [kromcomp/L3.1-Adumbralv1-12B](https://huggingface.co/kromcomp/L3.1-Adumbralv1-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: kromcomp/L3.1-Fragrantv1-12B
chat_template: llama3
dtype: float32
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 50]
    model: kromcomp/L3.1-Eleusis.exv5-12B
    parameters:
      density: 0.5
      lambda: 0.7
      weight: 0.25
  - layer_range: [0, 50]
    model: kromcomp/L3.1-Punicav1-12B
    parameters:
      density: 0.5
      lambda: 1.0
      weight: 0.25
  - layer_range: [0, 50]
    model: kromcomp/L3.1-Adumbralv1-12B
    parameters:
      density: 0.5
      lambda: 0.7
      weight: 0.25
  - layer_range: [0, 50]
    model: kromcomp/L3.1-Fragrantv1-12B
    parameters:
      density: 0.5
      lambda: 1.0
      weight: 0.25
tokenizer: {}
```
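
Once merged, the model can be loaded like any other Llama-3.1-architecture checkpoint with `transformers`. A minimal sketch follows, assuming the merged weights are published under the repo id `kromcomp/L3.1-Flagrantv1-12B` (adjust to the actual repository name); the `llama3` chat template declared in the config is picked up by `apply_chat_template`.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id for this merge; replace with the actual published repository.
model_id = "kromcomp/L3.1-Flagrantv1-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was done in float32; bf16 is a common inference choice
    device_map="auto",           # requires `accelerate`
)

messages = [{"role": "user", "content": "Write a short scene set in a rain-soaked city."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```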