# evo_exp-point-1-3-ties

evo_exp-point-1-3-ties is a TIES merge of the following models, created with [mergekit](https://github.com/arcee-ai/mergekit):

* [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210)
* [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B)

## 🧩 Configuration

```yaml
models:
  - model: openchat/openchat-3.5-1210
    parameters:
      density: 0.5
      weight: 0.5
  - model: meta-math/MetaMath-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
dtype: float16
```
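To illustrate what the `ties` method in the config above does, here is a toy numpy sketch of TIES-style merging on plain arrays (trim each task vector to the top `density` fraction by magnitude, elect a per-parameter sign, then average the agreeing deltas when `normalize` is true). The function name and exact tie-breaking details are illustrative assumptions, not mergekit's implementation:

```python
import numpy as np

def ties_merge(base, tasks, density=0.5, weights=None, normalize=True):
    """Toy TIES merge: trim, elect sign, merge task vectors into `base`."""
    if weights is None:
        weights = [1.0] * len(tasks)
    deltas = []
    for task, w in zip(tasks, weights):
        d = (task - base) * w
        # Trim: keep only the top `density` fraction of entries by magnitude.
        k = int(round(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k] if k > 0 else np.inf
        deltas.append(np.where(np.abs(d) >= thresh, d, 0.0))
    deltas = np.stack(deltas)
    # Elect sign: per-parameter sign of the magnitude-weighted sum.
    sign = np.sign(deltas.sum(axis=0))
    # Keep only the deltas that agree with the elected sign.
    agree = np.where(np.sign(deltas) == sign, deltas, 0.0)
    merged = agree.sum(axis=0)
    if normalize:
        # Average over the models that actually contributed to each entry.
        count = (agree != 0.0).sum(axis=0)
        merged = np.where(count > 0, merged / np.maximum(count, 1), 0.0)
    return base + merged
```

With two conflicting toy "models", entries where the trimmed deltas disagree in sign cancel out, while agreeing entries are kept and averaged.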
The merged model has 7.24B parameters, stored as FP16 safetensors.