---
library_name: transformers
tags: []
---
### Input and output embedding weights are possibly not tied

Loading this checkpoint produced the warning below, i.e. `lm_head.weight` was newly initialized instead of being tied to the input embedding matrix:

Some weights of LlamaForCausalLM were not initialized from the model checkpoint at ./llama3-pa/llama3-32kpa-emb-init-weight-tied and are newly initialized: ['lm_head.weight']
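
A quick way to check whether the two matrices are actually shared after loading (a minimal sketch; the checkpoint path is the one from the warning above):

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "./llama3-pa/llama3-32kpa-emb-init-weight-tied"
)

# Strict check: both names resolve to the very same Parameter object.
print(model.lm_head.weight is model.model.embed_tokens.weight)
# Weaker check: the values match even if the tensors are separate objects.
print(torch.equal(model.lm_head.weight, model.model.embed_tokens.weight))
```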

### Fix used

The weights were re-tied with:
```python
# Copy the values, then share the Parameter object itself so the two
# matrices stay in sync; tie_weights() registers the tie with transformers.
source_model.lm_head.weight.data = source_model.model.embed_tokens.weight.data
source_model.lm_head.weight = source_model.model.embed_tokens.weight
source_model.tie_weights()
```
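
Note that `tie_weights()` only shares the Parameters in memory; whether the tie is restored on a later `from_pretrained` is governed by the config flag. A hedged sketch of persisting it (the output path is hypothetical):

```python
# Assumption: we want the tie to survive a save/reload round trip.
# from_pretrained re-ties on load only when this flag is set.
source_model.config.tie_word_embeddings = True
source_model.tie_weights()
source_model.save_pretrained("./llama3-32kpa-emb-weight-tied")  # hypothetical path
```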