# Dark-Planet-EOP-r256-LoRA
This is a LoRA extracted from a language model. It was extracted using mergekit.
## LoRA Details
This LoRA adapter was extracted from DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power and uses NousResearch/Meta-Llama-3-8B as a base.
### Parameters
The following command was used to extract this LoRA adapter:
```sh
/usr/local/bin/mergekit-extract-lora --out-path=loras/Dark-Planet-EOP-r256-LoRA --model=DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power --base-model=NousResearch/Meta-Llama-3-8B --no-lazy-unpickle --max-rank=256 --gpu-rich -v --embed-lora --skip-undecomposable
```
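Below is a minimal sketch of how an adapter extracted this way could be applied on top of the base model with `transformers` and `peft`. The adapter path/repo id used here is an assumption; substitute wherever this LoRA is actually stored.

```python
# Sketch: load NousResearch/Meta-Llama-3-8B and attach the extracted LoRA adapter.
# Assumes `transformers`, `peft`, and `accelerate` are installed; the adapter id
# below is a placeholder, not a confirmed repository name.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Meta-Llama-3-8B"
adapter_id = "loras/Dark-Planet-EOP-r256-LoRA"  # assumed local path or Hub repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype="auto", device_map="auto"
)

# Attach the LoRA weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```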