Ggg / adapters_adapter_config.json
{
  "adapter_path": "adapters",
  "lora_layers": 8,
  "lora_parameters": {
    "rank": 16,
    "alpha": 16,
    "dropout": 0.0,
    "scale": 1.0
  }
}
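This config resembles the adapter configuration used by MLX-style LoRA fine-tuning: `lora_layers` is the number of transformer layers that receive adapters, and `lora_parameters` holds the per-adapter hyperparameters. A minimal sketch of reading the file and computing the standard LoRA scaling factor (the low-rank update is conventionally multiplied by `alpha / rank`; the exact formula used by any given library is an assumption here):

```python
import json

# Parse the config shown above (inlined here so the sketch is self-contained;
# in practice you would read it from adapters/adapter_config.json or similar).
config = json.loads("""
{
  "adapter_path": "adapters",
  "lora_layers": 8,
  "lora_parameters": {"rank": 16, "alpha": 16, "dropout": 0.0, "scale": 1.0}
}
""")

params = config["lora_parameters"]

# Conventional LoRA scaling: the update B @ A is multiplied by alpha / rank,
# optionally times an extra user-supplied scale.
effective_scale = params["scale"] * params["alpha"] / params["rank"]

print(config["lora_layers"])  # 8 layers get adapters
print(effective_scale)        # 1.0 here, since alpha == rank and scale == 1.0
```

With `rank == alpha`, the scaling factor is 1.0, a common default that keeps the magnitude of the adapter update independent of the chosen rank.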