# Progenitor-V2.3-70B

I am experimenting with some of the DELLA merge method parameters.
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the DELLA merge method, with nbeerbower/Llama-3.1-Nemotron-lorablated-70B as the base model.
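At a high level, DELLA prunes each model's task vector (its delta from the base) by magnitude-based sampling: low-magnitude entries are dropped with higher probability, survivors are rescaled to preserve the expected value, and the weighted sum of pruned deltas is scaled and added back to the base. The sketch below is an illustrative simplification of that drop-and-rescale step, not mergekit's implementation; the exact probability schedule (and the role of `window_size`) differs in the real code.

```python
import torch

def della_prune(delta: torch.Tensor, density: float = 0.7, epsilon: float = 0.15) -> torch.Tensor:
    """Illustrative DELLA-style pruning of one task vector (model minus base).

    Entries are dropped stochastically, with drop probability decreasing as
    |delta| grows; kept entries are rescaled so the expected value is
    preserved (the effect of `rescale: 1`).
    """
    base_drop = 1.0 - density  # mean drop probability (0.3 with density 0.7)
    flat = delta.flatten()
    # Normalized magnitude ranks: 0 for the smallest |delta|, 1 for the largest.
    ranks = flat.abs().argsort().argsort().float() / max(flat.numel() - 1, 1)
    # Spread drop probabilities over a band of width epsilon around base_drop.
    drop_p = (base_drop + epsilon / 2 - epsilon * ranks).clamp(0.0, 1.0)
    keep = torch.bernoulli(1.0 - drop_p)
    return (flat * keep / (1.0 - drop_p).clamp_min(1e-8)).view_as(delta)

# Conceptually, the merge is then:
#   base + lambda * sum(weight_i * della_prune(delta_i))
# with lambda = 1.1 and weight_i = 0.20 for the five models listed below.
```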
### Models Merged
The following models were included in the merge:
- EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
- Sao10K/L3.1-70B-Hanami-x1
- Sao10K/70B-L3.3-Cirrus-x1
- TheDrummer/Anubis-70B-v1
- SicariusSicariiStuff/Negative_LLAMA_70B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Sao10K/L3.1-70B-Hanami-x1
    parameters:
      weight: 0.20
  - model: Sao10K/70B-L3.3-Cirrus-x1
    parameters:
      weight: 0.20
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
    parameters:
      weight: 0.20
  - model: TheDrummer/Anubis-70B-v1
    parameters:
      weight: 0.20
  - model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
    parameters:
      weight: 0.20
merge_method: della
base_model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
parameters:
  density: 0.7
  epsilon: 0.15
  lambda: 1.1
  rescale: 1
  window_size: 0.14
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: union
```
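As a minimal sketch of reproducing the merge, assuming the YAML above is saved as `config.yaml` and mergekit is installed: this follows mergekit's documented Python entry points, but the exact option set can vary between versions. The CLI equivalent is `mergekit-yaml config.yaml <out_path>`.

```python
import torch
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the configuration shown above (saved locally as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Progenitor-V2.3-LLaMa-70B",  # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=True,  # lower peak memory while reading shards
    ),
)
```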
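For local use, the merged model loads like any Llama-based checkpoint via transformers. The prompt below is just a placeholder; note that a 70B model in bfloat16 needs roughly 140 GB of accelerator memory, so quantization or offloading may be required.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TareksLab/Progenitor-V2.3-LLaMa-70B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the config above
    device_map="auto",           # requires accelerate; spreads layers across devices
)

messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```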