---
license: apache-2.0
---
## Model Description
Erya4FT is based on [Erya](https://huggingface.co/RUCAIBox/Erya) and further fine-tuned on our dataset [RUCAIBox/Erya-dataset](https://huggingface.co/datasets/RUCAIBox/Erya-dataset), enhancing its ability to translate ancient Chinese into modern Chinese.
## Example
```python
from transformers import BertTokenizer, CPTForConditionalGeneration

# Load the Erya tokenizer and the fine-tuned Erya4FT model.
tokenizer = BertTokenizer.from_pretrained("RUCAIBox/Erya")
model = CPTForConditionalGeneration.from_pretrained("RUCAIBox/Erya4FT")

# Tokenize an ancient Chinese sentence; drop token_type_ids, which generate() does not use.
input_ids = tokenizer("安世字子孺,少以父任为郎。", return_tensors="pt")
input_ids.pop("token_type_ids")

# Generate and decode the modern Chinese translation.
pred_ids = model.generate(max_new_tokens=256, **input_ids)
print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))
```
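
Several sentences can also be translated in one call by letting the tokenizer pad a batch. This is a minimal sketch reusing the `tokenizer` and `model` objects from the example above; the second input sentence is only an illustrative classical Chinese line, not part of the original example.

```python
# Batched translation sketch, assuming the tokenizer and model loaded above.
sentences = [
    "安世字子孺,少以父任为郎。",
    "广结发与匈奴大小七十余战。",  # illustrative additional input
]

# Pad the batch to a common length and drop token_type_ids, as in the single-sentence example.
batch = tokenizer(sentences, return_tensors="pt", padding=True)
batch.pop("token_type_ids")

pred_ids = model.generate(max_new_tokens=256, **batch)
for src, tgt in zip(sentences, tokenizer.batch_decode(pred_ids, skip_special_tokens=True)):
    print(src, "->", tgt)
```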