LoRA Adapters

This repository contains the LoRA adapter weights produced by fine-tuning Bronsn/gemma-9b-luganda-pretrained for Luganda–English translation.

Details

  • Base Model: Bronsn/gemma-9b-luganda-pretrained (itself derived from google/gemma-2-9b)
  • Contains only the LoRA adapter weights; the base model weights are not included
  • Compatible with the Hugging Face PEFT library

Configuration

from peft import LoraConfig

peft_config = LoraConfig(
    r=128,                       # LoRA rank
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj",
                    "embed_tokens", "lm_head"],
    lora_alpha=32,               # scaling factor (effective scale = lora_alpha / r)
    bias="none",                 # bias parameters are not trained
)
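Since the adapters are PEFT-compatible, they can be attached to the base model with `PeftModel.from_pretrained`. The sketch below is a minimal example, not part of this card: it assumes the model IDs shown above, requires the `transformers` and `peft` packages, and needs network access plus enough memory for a 9B-parameter model.

```python
# Sketch: load the base model and attach the LoRA adapters from this repo.
# Model IDs are assumed from this card; downloading requires network access
# and substantial memory for a 9B-parameter model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "Bronsn/gemma-9b-luganda-pretrained"
adapter_id = "Bronsn/luganda-english-translation-lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

# Apply the LoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(model, adapter_id)

# Optionally fold the adapters into the base weights for faster inference:
# model = model.merge_and_unload()
```

Merging with `merge_and_unload()` removes the adapter indirection at inference time, at the cost of no longer being able to swap adapters.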