---
base_model: []
library_name: transformers
tags:
- mergekit
- peft
---

# Untitled LoRA Model (1)

This is a LoRA extracted from a language model. It was extracted using [mergekit](https://github.com/arcee-ai/mergekit).

## LoRA Details

This LoRA adapter was extracted from /home/ubuntu/models/magnum-v2-12b and uses /home/ubuntu/models/Mistral-Nemo-Base-2407 as a base.

### Parameters

The following command was used to extract this LoRA adapter:

```sh
mergekit-extract-lora /home/ubuntu/models/magnum-v2-12b /home/ubuntu/models/Mistral-Nemo-Base-2407 OUTPUT_PATH --rank=32 --save-module=lm_head --save-module=embed_tokens
```
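
The extracted adapter can be applied on top of the base model with PEFT. The snippet below is a minimal sketch, not part of the original extraction workflow: it assumes the adapter directory produced by the command above (shown here as `OUTPUT_PATH`) and the local base-model path are available, and that `peft` and `transformers` are installed.

```python
# Minimal sketch: load the extracted LoRA onto the base model with PEFT.
# Paths mirror the extraction command above; adjust them for your setup.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "/home/ubuntu/models/Mistral-Nemo-Base-2407"  # base model used during extraction
adapter_path = "OUTPUT_PATH"                               # directory written by mergekit-extract-lora

tokenizer = AutoTokenizer.from_pretrained(base_path)
base_model = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype="auto")

# Attach the LoRA adapter; merge_and_unload() folds the adapter weights
# back into the base model so it can be used like a regular model.
model = PeftModel.from_pretrained(base_model, adapter_path)
model = model.merge_and_unload()
```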