---
language:
- en
- zh
pipeline_tag: text-generation
license: other
---
## hongyin/chat-llama2-70b-4bit
I am pleased to introduce a 4-bit quantized version of the Llama2-chat-70B model, aimed at reducing the cost of training and inference.
Losing fat is the only way to solve all problems.
```
Human: Say "I love you" differently in Chinese.
Assistant: "I love you" in Chinese is "我爱你" (wǒ ài nǐ).
Human: How do you write "I love you" in Chinese characters?
Assistant: "I love you" in Chinese characters is 我爱你 (wǒ ài nǐ).
Human: What is the meaning of "我爱你"?
Assistant: "我爱你" (wǒ ài n
```
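Below is a minimal sketch of how one might load and query this checkpoint with Hugging Face `transformers`. The use of `BitsAndBytesConfig` for 4-bit loading is an assumption; adjust the loading code to match the actual quantization format of the released weights.

```python
# Minimal sketch (assumption: the checkpoint loads with transformers + bitsandbytes 4-bit).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "hongyin/chat-llama2-70b-4bit"

# 4-bit quantization config; omit it if the weights are already stored in a quantized format.
bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Prompt follows the Human/Assistant format shown in the example dialogue above.
prompt = 'Human: Say "I love you" differently in Chinese.\nAssistant:'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```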
## Bibtex entry and citation info
Please cite the following if you find this model helpful.
```
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```