H16-dh16 / config.json
jqhoogland
Upload final model (step 75000) and all checkpoints at 2024-10-17T23:50:46.182691
b3fe974 verified
{
  "architectures": [
    "HFHookedTransformer"
  ],
  "hidden_size": 256,
  "num_attention_heads": 16,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
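Note that `HFHookedTransformer` is a custom architecture, so `transformers.AutoModel` will not load it without the repository's accompanying model code. The fields themselves are plain JSON, though; a minimal sketch of inspecting them (the inline copy of the config below is reproduced from the file above, and the derived per-head dimension is an assumption based on the standard convention that `hidden_size` is split evenly across heads, which matches the `dh16` in the repo name):

```python
import json

# Reproduced inline from the config.json above, for a self-contained sketch.
config_text = """
{
  "architectures": ["HFHookedTransformer"],
  "hidden_size": 256,
  "num_attention_heads": 16,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
"""

config = json.loads(config_text)

# Conventionally, each attention head gets hidden_size / num_attention_heads dims.
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(head_dim)  # → 16, consistent with the "dh16" in the repo name "H16-dh16"
```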