dm128 / config.json
Commit 79a3a01: Upload final model (step 75000) and all checkpoints at 2024-10-18T04:43:26.989679
{
  "architectures": [
    "HFHookedTransformer"
  ],
  "hidden_size": 128,
  "num_attention_heads": 8,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
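
A minimal sketch of inspecting this configuration programmatically, assuming config.json has been downloaded locally; the repository id and the custom HFHookedTransformer model code are not part of this file and are not assumed here.

import json

# Load the config shown above (local path is an assumption; adjust as needed).
with open("config.json") as f:
    config = json.load(f)

# Per-head dimension implied by the fields above: 128 / 8 = 16.
head_dim = config["hidden_size"] // config["num_attention_heads"]

print(config["architectures"][0])   # HFHookedTransformer
print(config["num_hidden_layers"])  # 2
print(head_dim)                     # 16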