---
base_model:
  - MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k
  - Blackroot/Llama-3-LongStory-LORA
  - gradientai/Llama-3-8B-Instruct-Gradient-4194k
  - MrRobotoAI/MrRoboto-BASE-v1-8b-64k
  - Blackroot/Llama-3-LongStory-LORA
  - MrRobotoAI/Llama-3-8B-Uncensored-test8
  - Blackroot/Llama-3-LongStory-LORA
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k + Blackroot/Llama-3-LongStory-LORA as the base.
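
TIES works per parameter tensor: each fine-tuned model's task vector (its delta from the base) is trimmed to its highest-magnitude entries according to `density`, a sign is elected per parameter, and only the deltas that agree with that sign are combined, scaled by `weight`. The following is a minimal, illustrative sketch of that idea for a single tensor; it is not mergekit's actual implementation, and the function name and averaging details are this card's own simplification.

```python
import torch

def ties_merge_tensor(base, tuned, densities, weights):
    """Illustrative single-tensor TIES merge: trim, elect sign, disjoint merge.
    A sketch of the concept only, not mergekit's code."""
    deltas = []
    for t, d, w in zip(tuned, densities, weights):
        delta = t - base                                    # task vector
        k = max(1, int(d * delta.numel()))                  # keep top-k entries by magnitude ("trim")
        thresh = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        delta = torch.where(delta.abs() >= thresh, delta, torch.zeros_like(delta))
        deltas.append(w * delta)
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))                   # "elect" a sign per parameter
    agree = torch.sign(stacked) == sign                     # keep only deltas agreeing with it
    merged = (stacked * agree).sum(dim=0)
    count = agree.sum(dim=0).clamp(min=1)                   # average over agreeing models
    return base + merged / count

# e.g. merged = ties_merge_tensor(base_w, [w1, w2, w3, w4], [0.2, 0.5, 0.5, 0.2], [0.4, 0.9, 0.9, 0.4])
```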

### Models Merged

The following models were included in the merge:

* MrRobotoAI/MrRoboto-BASE-v1-8b-64k + Blackroot/Llama-3-LongStory-LORA
* MrRobotoAI/Llama-3-8B-Uncensored-test8 + Blackroot/Llama-3-LongStory-LORA
* gradientai/Llama-3-8B-Instruct-Gradient-4194k

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: MrRobotoAI/MrRoboto-BASE-v1-8b-64k+Blackroot/Llama-3-LongStory-LORA
    parameters:
      density: 0.2
      weight: 0.4
  - model: MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k+Blackroot/Llama-3-LongStory-LORA
    parameters:
      density: 0.5
      weight: 0.9
  - model: MrRobotoAI/Llama-3-8B-Uncensored-test8+Blackroot/Llama-3-LongStory-LORA
    parameters:
      density: 0.5
      weight: 0.9
  - model: gradientai/Llama-3-8B-Instruct-Gradient-4194k
    parameters:
      density: 0.2
      weight: 0.4
merge_method: ties
base_model: MrRobotoAI/MrRoboto-BASE-v2.1-8b-64k+Blackroot/Llama-3-LongStory-LORA
parameters:
  int8_mask: true
  rescale: true
  normalize: false
dtype: bfloat16
```
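
The merge can be reproduced by saving the configuration above to a file and running it through mergekit's `mergekit-yaml` entry point. For inference, loading the result with 🤗 Transformers should look roughly like the sketch below; the repository id is a placeholder for wherever this merge is hosted.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "MrRobotoAI/<this-merge>"  # placeholder: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Write the opening scene of a long adventure story."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```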