Commit History
Add gmask token id (36b7f2d)
Update slim checkpoint (6461061)
Update code for slim (63ce1ba)
Drop icetk dependency (72985e8)
Fix decode method for torch tensor (23ad39b)
Fix position ids expand (f82b180)
Fix generate (fb23542)
Fix attention mask for prefix prompt (08bc851)
No padding for chat function (4b7ffbf)
Fix attention_mask and position_ids (373fd6b)
Fix encode method (e22cddf)
Fix batch input (e1494f2)
Implement batch generation (cc96a22)
Fix position id for training (11c270c)
Add support for loading quantized model (2e1be30)
Use dynamic dtype for prompts (c949d03)
Fix backward for quantization (0cfae21)
Implement gradient checkpointing (aea6cef)
Fix bugs (0564795)
Add pad_token_id in config.json (2200e2b)
Change padding side (db22499)
Set ignore_index for CrossEntropyLoss (5c64357)
Support batch training (8127ab6)
Merge branch 'main' into dev_pt (fbda120)
Add p-tuning v2 (812f43f)
Fix context length in get_position_ids (096f3de)
Close CPU fusion on Mac (4a9b711)
Fix Chinese punctuation (d2bbc82)
Add English (2449bdc)
Fix typo in tokenization_chatglm.py (1b54948, songxxzp)