---
library_name: transformers
tags: []
---
### Input and output embedding weights are possibly not tied
Loading the checkpoint emits the warning below, which means `lm_head.weight` was randomly re-initialized rather than restored, i.e. the output head is not tied to the input embeddings:

Some weights of LlamaForCausalLM were not initialized from the model checkpoint at ./llama3-pa/llama3-32kpa-emb-init-weight-tied and are newly initialized: ['lm_head.weight']
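One way to check whether the two matrices actually share storage after loading is to compare their data pointers. A minimal sketch, assuming the checkpoint loads with `AutoModelForCausalLM`:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "./llama3-pa/llama3-32kpa-emb-init-weight-tied"
)

# Tied weights share a single storage, so their data pointers match.
tied = (
    model.lm_head.weight.data_ptr()
    == model.model.embed_tokens.weight.data_ptr()
)
print(f"lm_head tied to embed_tokens: {tied}")
```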
### Code used to re-tie the weights
```python
# Assigning .data shares the values, but rebinding the Parameter on the
# next line is what makes lm_head and embed_tokens one and the same tensor.
source_model.lm_head.weight.data = source_model.model.embed_tokens.weight.data
source_model.lm_head.weight = source_model.model.embed_tokens.weight
# Ask transformers to apply its own tying logic as well; note this is a
# no-op unless config.tie_word_embeddings is True.
source_model.tie_weights()
```
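Since `tie_weights()` only acts when the config requests tying, it may also be worth setting `tie_word_embeddings` before saving, so that `from_pretrained` re-ties on the next load. A short sketch; the output path is hypothetical:

```python
# Confirm both modules now point at the same storage.
assert (
    source_model.lm_head.weight.data_ptr()
    == source_model.model.embed_tokens.weight.data_ptr()
)

# Record the tie in the config so from_pretrained re-ties on load.
source_model.config.tie_word_embeddings = True

# Hypothetical output directory for the re-tied checkpoint.
source_model.save_pretrained("./llama3-pa/llama3-32kpa-weight-tied")
```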