Unable to find target modules for PEFT fine-tuning
I have been trying to fine-tune Codestral for my use case. I tried to use PEFT with LoRA, but it throws an error stating that I need to specify "target_modules". It turns out that the usual target modules like ["q_proj", "k_proj"] are not present in the model. I tried some of the modules that are inside the model, but that throws an error saying they are not supported.
Can someone help me find the proper target_modules?
Thanks.
Mamba-Codestral-7B-v0.1 is a model with "model_type" set to "mamba2" (https://huggingface.co/mistralai/Mamba-Codestral-7B-v0.1/blob/main/config.json).
Therefore, it doesn't use the same target modules as LLaMA-based models. Instead, its LoRA-targetable modules include ["embeddings", "in_proj", "out_proj"].
Here's an example of how to fine-tune it: https://huggingface.co/docs/transformers/model_doc/mamba2#usage
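More generally, you can list a loaded model's leaf module names to see exactly which strings PEFT can match `target_modules` against. A sketch (the `Toy` class below is just a stand-in for whatever checkpoint you've loaded):

```python
import torch.nn as nn

def leaf_module_names(model: nn.Module) -> list[str]:
    """Collect the distinct final path components of all submodule names.
    These are the strings PEFT matches `target_modules` entries against."""
    return sorted({name.split(".")[-1] for name, _ in model.named_modules() if name})

# Example on a toy module (a stand-in for your loaded Codestral model):
class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.in_proj = nn.Linear(8, 8)
        self.out_proj = nn.Linear(8, 8)

print(leaf_module_names(Toy()))  # → ['in_proj', 'out_proj']
```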