MidnightMiqu

Overview

This is a 103B frankenmerge of sophosympatheia/Midnight-Miqu-70B-v1.5 with itself. Please see that model card for details and usage instructions. Because it is based on Miqu, it supports a 32K context window.

Quantizations

Licence and usage restrictions

152334H/miqu-1-70b-sf was based on a leaked version of one of Mistral's models. All miqu-derived models, including this merge, are suitable only for personal use. Mistral has been cool about it so far, but be aware that by downloading this merge you assume whatever legal risk is inherent in acquiring and using a model based on leaked weights. This merge comes with no warranties or guarantees of any kind, but you probably already knew that. I am not a lawyer and I do not profess to know what we have gotten ourselves into here. You should consult with a lawyer before using any Hugging Face model beyond private use... but definitely don't use this one for that!

Merge Details

Merge Method

This model was merged using the passthrough merge method.

Models Merged

The following models were included in the merge:

sophosympatheia/Midnight-Miqu-70B-v1.5

Configuration

The following YAML configuration was used to produce this model:

slices:
  - sources:
      - model: sophosympatheia/Midnight-Miqu-70B-v1.5
        layer_range: [0, 40] # 40 layers
  - sources:
      - model: sophosympatheia/Midnight-Miqu-70B-v1.5
        layer_range: [20, 60] # 40 layers
  - sources:
      - model: sophosympatheia/Midnight-Miqu-70B-v1.5
        layer_range: [40, 80] # 40 layers
merge_method: passthrough
dtype: float16
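As a sanity check on the configuration above, the snippet below computes how the three overlapping slices of the 80-layer base model stack into a single deeper model under a passthrough merge. This is a minimal sketch, assuming mergekit's half-open [start, end) convention for layer_range; the slice boundaries are taken directly from the YAML.

```python
# Layer ranges from the merge config, interpreted as half-open [start, end).
slices = [(0, 40), (20, 60), (40, 80)]

# Passthrough simply concatenates the sliced layer stacks.
layers_per_slice = [end - start for start, end in slices]
total_layers = sum(layers_per_slice)

print(layers_per_slice)  # [40, 40, 40]
print(total_layers)      # 120 layers, versus 80 in the base 70B model
```

Stacking 120 of the base model's 80 layers (with layers 20-59 duplicated) is what grows the parameter count from 70B to roughly 103B.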

Model tree for FluffyKaeloky/Midnight-Miqu-103B-v1.5
