---
base_model:
  - DavidAU/MN-Dark-Planet-TITAN-12B
  - Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
  - elinas/Chronos-Gold-12B-1.0
  - TheDrummer/Rocinante-12B-v1
  - win10/Mistral-Nemo-abliterated-Nemo-Pro-v2
  - Gryphe/Pantheon-RP-1.5-12b-Nemo
  - nothingiisreal/MN-12B-Celeste-V1.9
library_name: transformers
tags:
  - mergekit
  - merge
---

# Avalon_8

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DELLA](https://arxiv.org/abs/2406.11617) merge method, with [win10/Mistral-Nemo-abliterated-Nemo-Pro-v2](https://huggingface.co/win10/Mistral-Nemo-abliterated-Nemo-Pro-v2) as the base model.
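DELLA prunes each fine-tuned model's parameter deltas with magnitude-aware random sampling (the paper calls this step MAGPRUNE), rescales the surviving entries, and adds the weighted sum of pruned deltas back onto the base model. The snippet below is a minimal NumPy sketch of that idea for flat parameter vectors, not mergekit's actual implementation; the function names `magprune` and `della_merge` are illustrative, and the knobs mirror `density`, `epsilon`, `weight`, and `lambda` from the configuration further down.

```python
import numpy as np

def magprune(delta, density, epsilon, rng):
    """Magnitude-aware stochastic pruning of a task delta (illustrative).

    Keeps roughly `density` of the entries, giving large-magnitude
    entries a lower drop probability, then rescales survivors so the
    expected value of the delta is preserved.
    """
    p_mean = 1.0 - density                      # average drop probability
    order = np.argsort(-np.abs(delta))          # indices, largest |delta| first
    ranks = np.empty(delta.size, dtype=np.int64)
    ranks[order] = np.arange(delta.size)
    # Spread drop probabilities over [p_mean - epsilon, p_mean + epsilon]:
    # rank 0 (largest magnitude) gets the smallest drop probability.
    p = p_mean - epsilon + 2.0 * epsilon * ranks / max(delta.size - 1, 1)
    keep = rng.random(delta.size) >= p
    return np.where(keep, delta / (1.0 - p), 0.0)  # rescale kept entries

def della_merge(base, tuned_models, weights,
                density=0.8, epsilon=0.10, lam=1.0, seed=0):
    """Combine fine-tuned parameter vectors into `base` (illustrative)."""
    rng = np.random.default_rng(seed)
    merged = np.zeros_like(base)
    for tuned, w in zip(tuned_models, weights):
        merged += w * magprune(tuned - base, density, epsilon, rng)
    return base + lam * merged
```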

### Models Merged

The following models were included in the merge:

- [DavidAU/MN-Dark-Planet-TITAN-12B](https://huggingface.co/DavidAU/MN-Dark-Planet-TITAN-12B)
- [Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24](https://huggingface.co/Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24)
- [elinas/Chronos-Gold-12B-1.0](https://huggingface.co/elinas/Chronos-Gold-12B-1.0)
- [TheDrummer/Rocinante-12B-v1](https://huggingface.co/TheDrummer/Rocinante-12B-v1)
- [Gryphe/Pantheon-RP-1.5-12b-Nemo](https://huggingface.co/Gryphe/Pantheon-RP-1.5-12b-Nemo)
- [nothingiisreal/MN-12B-Celeste-V1.9](https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: win10/Mistral-Nemo-abliterated-Nemo-Pro-v2
    parameters:
      density: 0.11
      weight: 0.1

  - model: DavidAU/MN-Dark-Planet-TITAN-12B
    parameters:
      density: 0.8
      weight: 1.0

  - model: nothingiisreal/MN-12B-Celeste-V1.9
    parameters:
      density: 0.8
      weight: 1.0

  - model: elinas/Chronos-Gold-12B-1.0
    parameters:
      density: 0.8
      weight: 1.0

  - model: TheDrummer/Rocinante-12B-v1
    parameters:
      density: 0.8
      weight: 1.0

  - model: Gryphe/Pantheon-RP-1.5-12b-Nemo
    parameters:
      density: 0.8
      weight: 1.0

  - model: Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
    parameters:
      density: 0.8
      weight: 1.0

merge_method: della
base_model: win10/Mistral-Nemo-abliterated-Nemo-Pro-v2

parameters:
  epsilon: 0.10
  lambda: 1.00
  int8_mask: true

dtype: float16
```
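In this configuration, `density` is the fraction of each model's delta parameters retained, `weight` scales each model's contribution, `epsilon` sets the spread of per-parameter drop probabilities around `1 - density`, and `lambda` scales the final merged delta. The merge can be reproduced by saving the YAML to a file and running mergekit's CLI, e.g. `mergekit-yaml config.yaml ./Avalon_8`.

Once merged (or downloaded), the model loads like any other `transformers` causal LM. Below is a minimal usage sketch; the repo id `Aleteian/Avalon_8` is an assumption based on this card's location, and `torch_dtype=torch.float16` mirrors the `dtype` in the config.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aleteian/Avalon_8"  # assumed repo id for this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's dtype
    device_map="auto",
)

prompt = "Tell me a short story about a knight of Avalon."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```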