Commit 44707b3 · Update README.md
Parent(s): 5b0b434

README.md CHANGED (@@ -4,5 +4,32 @@ license: afl-3.0)
## Model description

MathGLM-10B is fine-tuned from GLM-10B on a dataset augmented with multi-step arithmetic operations and math problems described in text. It achieves performance comparable to GPT-4 on a 5,000-sample Chinese math problem test set.
## How to use

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the GLM tokenizer and model from the Hugging Face Hub; the custom
# model code requires trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained("BAAI/glm-10b-chinese", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained("BAAI/glm-10b-chinese", trust_remote_code=True)
model = model.half().cuda()  # fp16 inference on GPU

# GLM generates text to fill the [MASK] span. The Chinese prompt reads:
# "The triumphal arch stands beside the old castle in Milan, Italy. It was
#  built in 1807 to commemorate [MASK]; the arch is 25 m high, topped with
#  bronze statues of two warriors and an ancient chariot."
inputs = tokenizer("凯旋门位于意大利米兰市古城堡旁。1807年为纪念[MASK]而建,门高25米,顶上矗立两武士青铜古兵车铸像。", return_tensors="pt")
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=512)
inputs = {key: value.cuda() for key, value in inputs.items()}
outputs = model.generate(**inputs, max_length=512, eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))
```
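Since MathGLM-10B targets multi-step arithmetic, the same generation pipeline can be pointed at a numeric prompt. Below is a minimal sketch that reuses the `tokenizer` and `model` objects loaded above; the prompt string and the placement of the answer in the `[MASK]` span are illustrative assumptions, not a documented interface:

```python
# Hedged sketch: reuse the tokenizer/model from the snippet above for an
# arithmetic prompt. GLM-style infilling puts the answer in the [MASK] span;
# the exact prompt format MathGLM expects is an assumption here.
inputs = tokenizer("355 * 52 + 77 = [MASK]", return_tensors="pt")
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=64)
inputs = {key: value.cuda() for key, value in inputs.items()}
outputs = model.generate(**inputs, max_length=64, eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))
```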
## Citation
Please cite our paper if you find this code useful for your research:

```bibtex
@article{yang2023gpt,
  title={GPT Can Solve Mathematical Problems Without a Calculator},
  author={Yang, Zhen and Ding, Ming and Lv, Qingsong and Jiang, Zhihuan and He, Zehai and Guo, Yuyi and Bai, Jinfeng and Tang, Jie},
  journal={arXiv preprint arXiv:2309.03241},
  year={2023}
}
```