---
license: apache-2.0
language:
- zh
pipeline_tag: text-generation
tags:
- not-for-all-audiences
---

The training data comes from [h-corpus-2023](https://huggingface.co/datasets/a686d380/h-corpus-2023).

The base model is [Qwen1.5-0.5B](https://huggingface.co/Qwen/Qwen1.5-0.5B).

This model is intended for academic research or personal study only; commercial use is strictly prohibited.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = 'Monor/Qwen1.5-0.5B-h-world'

# Load the tokenizer and model from the Hub
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto", trust_remote_code=True).eval()

# Tokenize the prompt and move it to the model's device
inputs = tokenizer('小青突然来到了我的房间', return_tensors='pt')
inputs = inputs.to(model.device)

# Generate a continuation and decode it back to text
pred = model.generate(**inputs, max_new_tokens=32, num_beams=4, do_sample=True)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
```

Output: 小青突然来到了我的房间,吓了我一跳。她见我睡着了,就脱了衣服钻进了被窝里,然后搂着我睡着了。
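
If you prefer the high-level `pipeline` API from `transformers`, the sketch below should behave equivalently. The repo id and prompt are taken from the example above; the generation parameters here are illustrative, not the author's recommended settings.

```python
from transformers import pipeline

# Minimal sketch using the text-generation pipeline (assumed equivalent to the example above)
generator = pipeline(
    'text-generation',
    model='Monor/Qwen1.5-0.5B-h-world',
    device_map='auto',
    trust_remote_code=True,
)

# Sample a short continuation of the same prompt
result = generator('小青突然来到了我的房间', max_new_tokens=32, do_sample=True)
print(result[0]['generated_text'])
```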