
hongyin/awareness-en-zh-0.8b-instruct

This is an English-Chinese bilingual autoregressive language model based on Bloom, with 0.8B parameters. The training process consists of two parts: (1) next-token prediction (NTP); (2) instruction tuning. Note that although the model has learned many NLP downstream tasks (excluding multi-turn dialogue), it is not yet able to act as a chatbot. The model is best used as raw material for further fine-tuning ("alchemy").
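The sketch below shows one way to load and prompt the checkpoint with the Hugging Face transformers library. It assumes the standard AutoTokenizer/AutoModelForCausalLM loading path works for this Bloom-based model; the prompt and generation settings are illustrative only, not recommendations from the model author.

```python
# Minimal sketch: loading and prompting the checkpoint with transformers.
# Assumes the standard auto classes resolve this Bloom-based model; the
# prompt and max_new_tokens value are illustrative, not tuned settings.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "hongyin/awareness-en-zh-0.8b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Translate to Chinese: Hello, world."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```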

Bibtex entry and citation info

Please cite the following if you find this model helpful.

@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}

License: other
