regular-sae / config_layer_9.json
{
"layer": 9,
"model_type": "GatedSAE",
"n_batches": 2000,
"l1_coefficient": 2.5,
"projection_up": 16,
"batch_size": 64,
"learning_rate": 0.001,
"test_loss": 58.03027763366699,
"reconstruction_error": 28.543010902404784,
"l0_loss": 2.2675048828125,
"dead_neurons_percentage": 84.30989583333334
}
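
For reference, a minimal sketch of loading this config and deriving the SAE dictionary width from it. The repo id, the base model's hidden size (d_model), and the variable names below are assumptions for illustration only, not values stored in this file or defined by this repository's code.

import json
from huggingface_hub import hf_hub_download

# Fetch the config from the Hub (repo id assumed for illustration).
config_path = hf_hub_download(
    repo_id="charlieoneill/regular-sae",
    filename="config_layer_9.json",
)
with open(config_path) as f:
    cfg = json.load(f)

# Dictionary width = expansion factor * base-model hidden size.
# d_model = 768 is an assumed value; it is not part of this config.
d_model = 768
d_sae = cfg["projection_up"] * d_model

print(f"Layer {cfg['layer']}: {cfg['model_type']} with {d_sae} latents, "
      f"l1_coefficient={cfg['l1_coefficient']}, lr={cfg['learning_rate']}")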