llava-v1.6-mistral-7b-hf_CB / lorra_config.json
{
  "target_layers": "16",
  "transform_layers": "14,15,16",
  "lorra_alpha": 5.0,
  "trainsets": null,
  "valsets": null,
  "full_layers": false
}
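For reference, these fields follow the LoRRA (Low-Rank Representation Adaptation) convention used in circuit-breaker style fine-tuning: `target_layers` typically names the layers whose hidden representations the loss targets, `transform_layers` the layers that receive LoRA adapters, and `lorra_alpha` the weight on the representation-control loss. Below is a minimal sketch of loading and parsing this file with only the Python standard library; the `LorraConfig` dataclass and `parse_layers` helper are hypothetical names for illustration, not part of this repo.

```python
# Hypothetical loader for lorra_config.json; field interpretations are
# assumptions based on the common LoRRA / circuit-breakers setup.
from __future__ import annotations

import json
from dataclasses import dataclass


@dataclass
class LorraConfig:
    target_layers: list[int]     # layers where the representation loss is computed (assumed)
    transform_layers: list[int]  # layers that receive LoRA adapters (assumed)
    lorra_alpha: float           # weight on the representation-control loss (assumed)
    trainsets: list | None       # null in this config
    valsets: list | None         # null in this config
    full_layers: bool            # whether to adapt every layer instead of a subset (assumed)


def parse_layers(spec: str) -> list[int]:
    """Parse a comma-separated layer string such as "14,15,16"."""
    return [int(x) for x in spec.split(",") if x.strip()]


with open("lorra_config.json") as f:
    raw = json.load(f)

config = LorraConfig(
    target_layers=parse_layers(raw["target_layers"]),
    transform_layers=parse_layers(raw["transform_layers"]),
    lorra_alpha=raw["lorra_alpha"],
    trainsets=raw["trainsets"],
    valsets=raw["valsets"],
    full_layers=raw["full_layers"],
)
print(config)
# LorraConfig(target_layers=[16], transform_layers=[14, 15, 16], lorra_alpha=5.0, ...)
```

Note that the layer fields are stored as comma-separated strings rather than JSON arrays, so a loader has to split and cast them, as `parse_layers` does above.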