---
tags:
- llama
- merge
---
Healed Llama-3-15B

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f74b6e6389380c77562762/FwYXt2h_FdmlL0Z6qYufz.png)

# Llama-3-15B-EtherealMaid-t0.0001

## A merge of the following models using a custom NearSwap(t0.0001) algorithm:

* [v000000/HaloMaidRP-v1.33-15B-L3](https://huggingface.co/v000000/HaloMaidRP-v1.33-15B-L3)
* [ZeusLabs/L3-Aethora-15B-V2](https://huggingface.co/ZeusLabs/L3-Aethora-15B-V2)

With [v000000/HaloMaidRP-v1.33-15B-L3](https://huggingface.co/v000000/HaloMaidRP-v1.33-15B-L3) as the base model.

## Thanks to mradermacher for the quants!

* [GGUF](https://huggingface.co/mradermacher/L3-15B-EtherealMaid-t0.0001-GGUF)
* [GGUF imatrix](https://huggingface.co/mradermacher/L3-15B-EtherealMaid-t0.0001-i1-GGUF)
* [Q5_K_M GGUF](https://huggingface.co/v000000/L3-15B-EtherealMaid-t0.0001-Q5_K_M-GGUF)

## NearSwap algorithm

```python
# Fixed
import numpy as np

def lerp(a, b, t):
    # Standard linear interpolation between tensors a and b.
    return a * (1 - t) + b * t

def nearswap(v0, v1, t):
    # Per-element weight t / |v0 - v1|, clamped to [0, 1]: elements of v0 that
    # are already close to v1 (difference < t) are pulled strongly toward v1,
    # while elements that differ more stay nearly unchanged.
    lweight = np.abs(v0 - v1)
    with np.errstate(divide='ignore', invalid='ignore'):
        lweight = np.where(lweight != 0, t / lweight, 1.0)
    lweight = np.nan_to_num(lweight, nan=1.0, posinf=1.0, neginf=1.0)
    np.clip(lweight, a_min=0.0, a_max=1.0, out=lweight)
    return lerp(v0, v1, lweight)
```

Credit to Alchemonaut for the original NearSwap algorithm.

## Samplers

```
I found success with:

temperature 0.9-1.2
min_p 0.08
tfs 0.97
smoothing_factor 0.3
smoothing_curve 1.1

Nymeria preset (more coherent):

temp 0.9
top_k 30
top_p 0.75
min_p 0.2
rep_pen 1.1
smooth_factor 0.25
smooth_curve 1
```

## Prompt Template

```bash
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{output}<|eot_id|>
```
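Below is a minimal generation sketch, not taken from the original card, showing how the prompt template and the Nymeria-style sampler preset could be wired together with llama-cpp-python and one of the GGUF quants. The model file name, context size, and prompts are placeholders, and the tfs/smoothing samplers are left out since they are not assumed to be exposed through this API.

```python
# Rough usage sketch (assumptions: llama-cpp-python is installed and the
# Q5_K_M GGUF has been downloaded locally; the file name is a placeholder).
from llama_cpp import Llama

PROMPT_TEMPLATE = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    "{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
    "{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
)

llm = Llama(model_path="L3-15B-EtherealMaid-t0.0001-Q5_K_M.gguf", n_ctx=8192)

prompt = PROMPT_TEMPLATE.format(
    system_prompt="You are a creative roleplay assistant.",
    input="Introduce your character in two sentences.",
)

# Nymeria-style preset from the Samplers section (smoothing not applied here).
out = llm(
    prompt,
    max_tokens=256,
    temperature=0.9,
    top_k=30,
    top_p=0.75,
    min_p=0.2,
    repeat_penalty=1.1,
    stop=["<|eot_id|>"],
)
print(out["choices"][0]["text"])
```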
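For reference, here is a rough sketch of how a NearSwap merge like this one could be reproduced with the `nearswap` function above, using HaloMaidRP as the base (`v0`) and Aethora-V2 as the secondary model (`v1`). It is an illustration under stated assumptions, not the exact script used for this release; loading both 15B models in full precision like this needs a lot of RAM, so a real run would more likely stream the checkpoint shards.

```python
# Illustrative only: applies nearswap(t=0.0001) tensor-by-tensor and writes the
# result back into the base model. Loading and dtype handling are assumptions.
import torch
from transformers import AutoModelForCausalLM

T = 0.0001

base = AutoModelForCausalLM.from_pretrained(
    "v000000/HaloMaidRP-v1.33-15B-L3", torch_dtype=torch.float32
)
other = AutoModelForCausalLM.from_pretrained(
    "ZeusLabs/L3-Aethora-15B-V2", torch_dtype=torch.float32
)

other_state = other.state_dict()
merged_state = {}
for name, v0 in base.state_dict().items():
    v1 = other_state[name]
    # nearswap() is the NumPy function defined earlier in this card.
    merged = nearswap(v0.numpy(), v1.numpy(), T)
    merged_state[name] = torch.from_numpy(merged)

base.load_state_dict(merged_state)
base.save_pretrained("L3-15B-EtherealMaid-t0.0001")
```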