---
license: apache-2.0
tags:
  - merge
---

# NeuralPipe-7B-ties

This model is a TIES merge of the following models, made with mergekit:

- [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
- [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)

## ⚡ Quantized models

Thanks to [TheBloke](https://huggingface.co/TheBloke) for the quantized models:

## 🧩 Configuration

```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: OpenPipe/mistral-ft-optimized-1218
    parameters:
      density: 0.5
      weight: 0.5
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
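In a TIES merge, each fine-tuned model contributes a "task vector" (its delta from the base model). The `density` parameter keeps only the highest-magnitude fraction of each delta, sign conflicts between models are resolved by weighted majority vote, and the agreeing deltas are averaged back onto the base. A minimal NumPy sketch of that arithmetic on toy tensors (an illustration of the idea, not mergekit's actual implementation):

```python
import numpy as np

def trim(delta, density):
    """Keep only the top `density` fraction of entries by magnitude."""
    k = int(np.ceil(density * delta.size))
    if k == 0:
        return np.zeros_like(delta)
    threshold = np.sort(np.abs(delta).ravel())[-k]
    return np.where(np.abs(delta) >= threshold, delta, 0.0)

def ties_merge(base, finetuned, densities, weights):
    # 1. Task vectors: each fine-tune's delta from the base, trimmed by density.
    deltas = [trim(ft - base, d) for ft, d in zip(finetuned, densities)]
    stacked = np.stack([w * d for w, d in zip(weights, deltas)])
    # 2. Elect a sign per parameter by weighted majority.
    sign = np.sign(stacked.sum(axis=0))
    # 3. Keep only contributions that agree with the elected sign,
    #    then take their weighted mean (cf. `normalize: true`).
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    num = np.where(agree, stacked, 0.0).sum(axis=0)
    wcol = np.array(weights).reshape(-1, *([1] * base.ndim))
    den = np.where(agree, wcol, 0.0).sum(axis=0)
    return base + np.where(den > 0, num / den, 0.0)

base = np.zeros(4)
ft_a = np.array([1.0, -1.0, 2.0, 0.0])
ft_b = np.array([1.0, 1.0, -2.0, 0.0])
merged = ties_merge(base, [ft_a, ft_b], densities=[1.0, 1.0], weights=[0.5, 0.3])
# Where the two fine-tunes disagree in sign, only the majority side survives.
```

mergekit applies this same logic per-tensor across all 7B parameters; `int8_mask` just stores the intermediate sign/agreement masks in int8 to cut memory during the merge.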