---
base_model:
- gradientai/Llama-3-70B-Instruct-Gradient-262k
- meta-llama/Meta-Llama-3-70B-Instruct
library_name: transformers
tags:
- mergekit
- peft
---
# Untitled LoRA Model (1)

This is a LoRA adapter extracted from a language model. It was extracted using [mergekit](https://github.com/arcee-ai/mergekit).

## LoRA Details

This LoRA adapter was extracted from [meta-llama/Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) and uses [gradientai/Llama-3-70B-Instruct-Gradient-262k](https://huggingface.co/gradientai/Llama-3-70B-Instruct-Gradient-262k) as a base.

### Parameters

The following command was used to extract this LoRA adapter:

```sh
mergekit-extract-lora gradientai/Llama-3-70B-Instruct-Gradient-262k meta-llama/Meta-Llama-3-70B-Instruct OUTPUT_PATH --rank=32
```
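## Usage

A minimal usage sketch for applying the extracted adapter on top of the base model with `transformers` and `peft`. The repository id of this adapter is not stated above, so `ADAPTER_PATH` below is a placeholder you would replace with the adapter's local path or Hub id.

```python
# Sketch: load the Gradient-262k base model, then attach this extracted
# rank-32 LoRA adapter with PEFT. ADAPTER_PATH is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "gradientai/Llama-3-70B-Instruct-Gradient-262k"
adapter_path = "ADAPTER_PATH"  # placeholder: local path or Hub id of this LoRA

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Apply the LoRA weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_path)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```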