
hongyin/chat-informer-13b-80k

I am pleased to introduce an English-Chinese conversation assistant designed to reduce inference cost. It is trained from Llama2-chat-13B, uses its own vocabulary, and has 13 billion parameters.
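A minimal loading sketch with the Hugging Face transformers library is shown below; the model card does not document an official snippet, so the loading arguments (such as `torch_dtype` and `device_map`) are assumptions.

```python
# Minimal loading sketch (assumed usage; not an official snippet from the model card).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "hongyin/chat-informer-13b-80k"

# The model uses its own vocabulary, so the tokenizer must come from the same checkpoint.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # assumed; use float32 on CPU
    device_map="auto",
)
```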

Losing fat is the only way to solve all problems.

Human: Say "I love you" differently in Chinese.
Assistant:
中文 (Chinese): 我爱你
普通话 (Mandarin): 我爱你
粤语 (Cantonese): 我爱你
闽南语 (Hokkien): 我爱你
客家语 (Hakka): 我爱你
台湾国语 (Taiwanese Mandarin): 我爱你
香港国语 (Hong Kong Mandarin): 我爱你
上海话 (Shanghainese): 我爱你
北京话 (Beijing dialect): 我爱你
四川话 (Sichuanese): 我爱你
重庆话 (Chongqing dialect): 我爱你
贵州话 (Guizhou dialect): 我爱你
云南话 (Yunnan dialect): 我爱你
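A sketch of reproducing the sample conversation, continuing from the loading snippet above (it reuses `tokenizer` and `model`); the `Human:`/`Assistant:` prompt template is inferred from the transcript and the generation settings are illustrative, not values documented by the author.

```python
# Reproduce the sample conversation (prompt template assumed from the transcript above).
prompt = 'Human: Say "I love you" differently in Chinese.\nAssistant:'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
# Strip the prompt tokens and print only the assistant's reply.
reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```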

BibTeX entry and citation info

Please cite the following paper if you find this model helpful.

@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}

License: other
