AshtonIsNotHere committed
Commit 581b14d
1 Parent(s): ba9e738

Change max_position_embeddings to original value

max_position_embeddings in this config was set to 16384, which appears to be an error (it is identical to hidden_size) and should be 131072. Both the [original model config](https://huggingface.co/meta-llama/Llama-3.1-405B/blob/main/config.json) and the [AWQ-quantized config](https://huggingface.co/hugging-quants/Meta-Llama-3.1-405B-Instruct-AWQ-INT4/blob/main/config.json) set max_position_embeddings to 131072.
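
As a quick sanity check, the upstream value can be read directly with transformers' `AutoConfig`. This is a minimal sketch, assuming `transformers` is installed and that you have access to the gated `meta-llama/Llama-3.1-405B` repo (license accepted and a Hugging Face token configured):

```python
# Minimal sketch: verify max_position_embeddings against the upstream config.
# Assumes `transformers` is installed and access to the gated
# meta-llama/Llama-3.1-405B repo (accept the license, then authenticate
# with `huggingface-cli login` or the HF_TOKEN environment variable).
from transformers import AutoConfig

upstream = AutoConfig.from_pretrained("meta-llama/Llama-3.1-405B")

print(upstream.hidden_size)              # 16384
print(upstream.max_position_embeddings)  # 131072, not 16384
```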

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
@@ -14,7 +14,7 @@
  "hidden_size": 16384,
  "initializer_range": 0.02,
  "intermediate_size": 53248,
- "max_position_embeddings": 16384,
+ "max_position_embeddings": 131072,
  "mlp_bias": false,
  "model_type": "llama",
  "num_attention_heads": 128,