---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- /content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained
- /content/drive/MyDrive/llama3_label_rationale_pretrained3
---

# Llama-3-8B-NLI-ties2

Llama-3-8B-NLI-ties2 is a merge of the following models, built with [mergekit](https://github.com/cg123/mergekit):

* [/content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained](https://huggingface.co//content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained)
* [/content/drive/MyDrive/llama3_label_rationale_pretrained3](https://huggingface.co//content/drive/MyDrive/llama3_label_rationale_pretrained3)

## 🧩 Configuration

```yaml
models:
  - model: /content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained
    parameters:
      density: 1
      weight: 0.5
  - model: /content/drive/MyDrive/llama3_label_rationale_pretrained3
    parameters:
      density: 1
      weight: 0.5
  # - model: WizardLM/WizardMath-13B-V1.0
  #   parameters:
  #     density: 0.33
  #     weight:
  #       - filter: mlp
  #         value: 0.5
  #       - value: 0
merge_method: ties
base_model: /content/drive/MyDrive/Meta-Llama-3-8B-Instruct
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
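To give an intuition for what the `ties` merge method in the configuration above does, here is a toy NumPy sketch of the TIES procedure (task vectors → trim by `density` → per-parameter sign election → merge the agreeing deltas). This is an illustration only, not mergekit's actual implementation; the `ties_merge` helper and the small vectors are made up for the example, and with `density: 1` (as in this config) the trim step keeps everything.

```python
import numpy as np

def ties_merge(base, finetuned, weights, density=1.0):
    """Toy TIES merge over flat parameter vectors (illustration only)."""
    # 1. Task vectors: each fine-tune's weighted delta from the base.
    deltas = [w * (ft - base) for ft, w in zip(finetuned, weights)]
    # 2. Trim: keep only the top `density` fraction of entries by
    #    magnitude. With density=1.0, as in this card's config,
    #    nothing is dropped.
    trimmed = []
    for d in deltas:
        k = int(np.ceil(density * d.size))
        thresh = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # 3. Elect a sign per parameter from the summed trimmed deltas.
    sign = np.sign(sum(trimmed))
    # 4. Merge: average only deltas that agree with the elected sign,
    #    a rough stand-in for mergekit's `normalize: true` behavior.
    stacked = np.stack(trimmed)
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    num = (stacked * agree).sum(axis=0)
    denom = np.maximum(agree.sum(axis=0), 1)
    return base + num / denom

# Two hypothetical fine-tunes of a zero base, merged at weight 0.5 each.
base = np.zeros(4)
ft_a = np.array([0.4, -0.2, 0.1, 0.0])
ft_b = np.array([0.2,  0.3, 0.1, 0.0])
merged = ties_merge(base, [ft_a, ft_b], weights=[0.5, 0.5])
```

Note how the second parameter, where the two fine-tunes disagree in sign, keeps only the delta matching the elected (majority-magnitude) sign instead of letting the two updates cancel.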