---
license: apache-2.0
tags:
- merge
- mergekit
- Ahmad0067/llama-3-8b-Instruct-Referral_Synth_data_Phase_1_and_2_corect_unsloth_merged
- Ahmad0067/llama-3-8b-Instruct-Bloodwork_Specialist_Synth_data_Phase_1_and_2_corect_unsloth_merged
- Ahmad0067/llama-3-8b-Instruct-Prescriptin_Synth_data_Phase_1_and_2_corect_unsloth_merged
---
# llama-3-8b-Instruct-TIES_merged-ref-blood-pres
llama-3-8b-Instruct-TIES_merged-ref-blood-pres is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):
* [Ahmad0067/llama-3-8b-Instruct-Referral_Synth_data_Phase_1_and_2_corect_unsloth_merged](https://huggingface.co/Ahmad0067/llama-3-8b-Instruct-Referral_Synth_data_Phase_1_and_2_corect_unsloth_merged)
* [Ahmad0067/llama-3-8b-Instruct-Bloodwork_Specialist_Synth_data_Phase_1_and_2_corect_unsloth_merged](https://huggingface.co/Ahmad0067/llama-3-8b-Instruct-Bloodwork_Specialist_Synth_data_Phase_1_and_2_corect_unsloth_merged)
* [Ahmad0067/llama-3-8b-Instruct-Prescriptin_Synth_data_Phase_1_and_2_corect_unsloth_merged](https://huggingface.co/Ahmad0067/llama-3-8b-Instruct-Prescriptin_Synth_data_Phase_1_and_2_corect_unsloth_merged)
## 🧩 Configuration
```yaml
models:
  - model: Ahmad0067/llama-3-8b-Instruct-Referral_Synth_data_Phase_1_and_2_corect_unsloth_merged
    parameters:
      density: 0.33
      weight: 0.33
  - model: Ahmad0067/llama-3-8b-Instruct-Bloodwork_Specialist_Synth_data_Phase_1_and_2_corect_unsloth_merged
    parameters:
      density: 0.33
      weight: 0.33
  - model: Ahmad0067/llama-3-8b-Instruct-Prescriptin_Synth_data_Phase_1_and_2_corect_unsloth_merged
    parameters:
      density: 0.34
      weight: 0.34
merge_method: ties
base_model: unsloth/llama-3-8b-Instruct
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
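As a quick sanity check (not part of the original card), the per-model `weight` and `density` values in the configuration can be verified to sum to 1.0, so the three task vectors are combined as a convex blend (0.33 + 0.33 + 0.34). With `normalize: true` mergekit rescales the weights anyway, but keeping them summing to 1 makes the intended blend ratios easy to read. A minimal sketch:

```python
# Mirror the merge parameters from the YAML config above as plain data.
# The short keys ("Referral", ...) are illustrative labels, not model IDs.
models = {
    "Referral": {"density": 0.33, "weight": 0.33},
    "Bloodwork_Specialist": {"density": 0.33, "weight": 0.33},
    "Prescriptin": {"density": 0.34, "weight": 0.34},
}

total_weight = sum(m["weight"] for m in models.values())
total_density = sum(m["density"] for m in models.values())

# Both should be ~1.0; use a tolerance to absorb float rounding.
assert abs(total_weight - 1.0) < 1e-6
assert abs(total_density - 1.0) < 1e-6
```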