MiniusLight-24B

12B - 24B-v1 - 24B-v1.01

Cover image: Origin Content (Click Here)

What is this?

Another simple SLERP merge of two Mistral "Small" models from well-known HuggingFace users: TheDrummer/Cydonia-24B-v2 and PocketDoc/Dans-PersonalityEngine-V1.2.0-24b.

This version has slightly lower eval scores than v1, but I'm keeping it anyway. I can't say whether it's actually better than v1 in practice, so I decided to name it v1.01.

Overall, a nice model to try, if you want to give it a shot. :)
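
For context, the SLERP merge mentioned above interpolates the two models' weights along an arc between them rather than along a straight line. Below is a minimal, illustrative Python sketch of the per-tensor idea (single t value, flattened tensors); it is not mergekit's actual implementation, which handles more edge cases and per-layer settings.

    import numpy as np

    def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float) -> np.ndarray:
        """Spherical interpolation between two same-shaped weight tensors."""
        a, b = w_a.flatten(), w_b.flatten()
        a_unit = a / np.linalg.norm(a)
        b_unit = b / np.linalg.norm(b)
        # Angle between the two weight vectors
        omega = np.arccos(np.clip(np.dot(a_unit, b_unit), -1.0, 1.0))
        if np.isclose(np.sin(omega), 0.0):
            # Nearly parallel vectors: fall back to plain linear interpolation
            return (1.0 - t) * w_a + t * w_b
        out = (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)
        return out.reshape(w_a.shape)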

Other information

Chat Template? ChatML, of course!
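
Assuming the tokenizer shipped with this repo carries that ChatML template (an assumption based on the statement above), prompts can be rendered with transformers' standard chat-template API. A small usage sketch:

    from transformers import AutoTokenizer

    # Repo id taken from this model card; the rendered format depends on the
    # chat template actually shipped in the repo's tokenizer config.
    tokenizer = AutoTokenizer.from_pretrained("DoppelReflEx/MiniusLight-24B-v1.01")

    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ]

    # Produces ChatML-style text, e.g. <|im_start|>user ... <|im_end|> blocks.
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    print(prompt)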

Merge Method

Detailed YAML Config
  models:
    - model: TheDrummer/Cydonia-24B-v2
    - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
  merge_method: slerp
  base_model: TheDrummer/Cydonia-24B-v2
  parameters:
    t: [0.1, 0.2, 0.4, 0.6, 0.6, 0.4, 0.2, 0.1]
  dtype: bfloat16
  tokenizer_source: base

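Here t is the SLERP interpolation factor (0 = base model, 1 = the other model); given as a list, mergekit spreads the values across layer blocks, so the blend leans more toward Dans-PersonalityEngine in the middle layers and stays closer to Cydonia at the ends. With mergekit installed, a config like this can typically be run with the mergekit-yaml command, e.g. mergekit-yaml config.yml ./output-model (the file and output names here are only placeholders).
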
Model size: 23.6B params (Safetensors, BF16)
