model / adapter_config.json
shashikanth-a
(Trained with Unsloth)
530139e verified
{
  "lora_parameters": {
    "rank": 4,
    "alpha": 8,
    "dropout": 0.1,
    "scale": 4.0
  },
  "num_layers": 4
}
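As a rough illustration of how parameters like these are commonly consumed, here is a minimal NumPy sketch of a LoRA forward pass, `y = x @ W + scale * (dropout(x) @ A) @ B`, using the rank, dropout, and scale values from the file. The layer dimensions, initialization, and the exact placement of dropout are assumptions for illustration, not read from this config or from any specific library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Values taken from the adapter_config.json above.
rank, scale, dropout_p = 4, 4.0, 0.1

# Hypothetical layer size; not part of the config.
d_in, d_out = 16, 16

W = rng.normal(size=(d_in, d_out))        # frozen base weight
A = rng.normal(size=(d_in, rank)) * 0.01  # trainable down-projection
B = np.zeros((rank, d_out))               # trainable up-projection (zero init)

def lora_forward(x, training=False):
    """Base linear layer plus a scaled low-rank update."""
    h = x
    if training and dropout_p > 0:
        # Inverted dropout applied to the adapter input only.
        mask = rng.random(x.shape) >= dropout_p
        h = x * mask / (1.0 - dropout_p)
    return x @ W + scale * (h @ A) @ B

x = rng.normal(size=(2, d_in))
y = lora_forward(x)
```

With `B` initialized to zero, the adapter contributes nothing before training, so the output initially equals the base layer's output; training then updates only `A` and `B` (roughly `2 * rank * d` parameters per adapted matrix instead of `d * d`).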