merge

This is a merge of pre-trained language models created using mergekit.

Merge Method

This model was merged using the TIES merge method, with unsloth/Mistral-Small-24B-Base-2501 as the base model.
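
TIES (TrIm, Elect Sign & Merge) reduces interference between fine-tuned models by sparsifying each model's task vector (its delta from the base weights), electing a per-parameter sign, and summing only the contributions that agree with that sign. The sketch below is an illustrative, simplified per-tensor version of that procedure, not mergekit's actual implementation; the function name, arguments, and thresholding details are assumptions for demonstration only.

import torch

def ties_merge_tensor(base, finetuned, weights, density):
    """Illustrative per-tensor TIES merge: trim, elect sign, disjoint merge.

    base:      weight tensor from the base model
    finetuned: list of corresponding tensors from the fine-tuned models
    weights:   per-model weights (0.5 each in the config below)
    density:   fraction of task-vector entries kept (0.5 in the config below)
    """
    deltas = []
    for ft in finetuned:
        delta = ft - base                          # task vector
        k = max(1, int(density * delta.numel()))   # number of entries to keep
        # keep only the top-k entries by magnitude, zero the rest ("trim")
        threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        deltas.append(torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta)))

    weighted = [w * d for w, d in zip(weights, deltas)]
    elected_sign = torch.sign(sum(weighted))       # elect a sign per parameter

    merged = torch.zeros_like(base)
    for d in weighted:
        # disjoint merge: only contributions matching the elected sign are summed
        merged += torch.where(torch.sign(d) == elected_sign, d, torch.zeros_like(d))

    # with `normalize: false` (as in the config below) the weighted sum is added as-is
    return base + merged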

Models Merged

The following models were included in the merge:

- SicariusSicariiStuff/Redemption_Wind_24B
- cognitivecomputations/Dolphin3.0-R1-Mistral-24B

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: unsloth/Mistral-Small-24B-Base-2501
    #no parameters necessary for base model
  - model: SicariusSicariiStuff/Redemption_Wind_24B
    parameters:
      density: 0.5
      weight: 0.5
  - model: cognitivecomputations/Dolphin3.0-R1-Mistral-24B
    parameters:
      density: 0.5
      weight: 0.5

merge_method: ties
base_model: unsloth/Mistral-Small-24B-Base-2501
parameters:
  normalize: false
  int8_mask: true
dtype: float16
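
The merge itself can be reproduced by saving the YAML above to a file and running it through mergekit (for example via its mergekit-yaml command-line entry point). Once merged, the result loads like any other Mistral-family checkpoint. The snippet below is a minimal usage sketch, assuming the model is published as Triangle104/Mistral-Redemption-Arc and that the available hardware can hold a 24B-parameter model in float16.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Triangle104/Mistral-Redemption-Arc"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's dtype: float16
    device_map="auto",          # requires accelerate; spreads layers across available GPUs
)

prompt = "Explain what a TIES model merge does in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))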

Open LLM Leaderboard Evaluation Results

Detailed results can be found on the Open LLM Leaderboard.

Metric                Value
Avg.                  32.79
IFEval (0-shot)       40.29
BBH (3-shot)          46.28
MATH Lvl 5 (4-shot)   41.01
GPQA (0-shot)         12.98
MuSR (0-shot)         17.17
MMLU-PRO (5-shot)     39.00