---
license: apache-2.0
---
# Model Description
Erya4FT is based on Erya and further fine-tuned on our dataset [RUCAIBox/Erya-dataset](https://huggingface.co/datasets/RUCAIBox/Erya-dataset), enhancing its ability to translate ancient Chinese into Modern Chinese.
# Example
```python
from transformers import BertTokenizer, CPTForConditionalGeneration

# Load the Erya tokenizer and the fine-tuned Erya4FT checkpoint.
tokenizer = BertTokenizer.from_pretrained("RUCAIBox/Erya")
model = CPTForConditionalGeneration.from_pretrained("RUCAIBox/Erya4FT")

# Tokenize an ancient-Chinese sentence; the model does not use token_type_ids, so drop them.
inputs = tokenizer("安世字子孺,少以父任为郎。", return_tensors="pt")
inputs.pop("token_type_ids")

# Generate and decode the Modern Chinese translation.
pred_ids = model.generate(max_new_tokens=256, **inputs)
print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))
```
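Several sentences can also be translated in one call by padding the batch. The sketch below reuses the same tokenizer and checkpoint as above; the second input sentence is only an illustrative classical-Chinese line, not part of our dataset.

```python
from transformers import BertTokenizer, CPTForConditionalGeneration

tokenizer = BertTokenizer.from_pretrained("RUCAIBox/Erya")
model = CPTForConditionalGeneration.from_pretrained("RUCAIBox/Erya4FT")

# Pad the batch so sentences of different lengths can be generated together.
sentences = [
    "安世字子孺,少以父任为郎。",
    "吾尝终日而思矣,不如须臾之所学也。",  # illustrative example sentence
]
inputs = tokenizer(sentences, return_tensors="pt", padding=True)
inputs.pop("token_type_ids")

pred_ids = model.generate(max_new_tokens=256, **inputs)
for src, tgt in zip(sentences, tokenizer.batch_decode(pred_ids, skip_special_tokens=True)):
    print(src, "->", tgt)
```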