Update README.md
README.md
---
license: mit
---

# ChatGLM-6B Mirror

ChatGLM-6B is an open-source, bilingual (Chinese and English) conversational language model based on the General Language Model (GLM) architecture, with 6.2 billion parameters. Combined with model quantization, it can be deployed locally on consumer-grade graphics cards, requiring as little as 6 GB of VRAM at the INT4 quantization level. ChatGLM-6B uses technology similar to ChatGPT and is optimized for Chinese Q&A and dialogue. Trained on roughly 1T tokens of Chinese and English text, and further refined with supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback, the 6.2-billion-parameter ChatGLM-6B can already generate answers that align fairly well with human preferences.

## Usage

```python
from modelscope import snapshot_download

# Download the model weights from the ModelScope hub (returns the local cache path)
model_dir = snapshot_download('Genius-Society/chatglm_6b')
```
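
The snippet above only fetches the weights. Below is a minimal inference sketch, not an official example: it assumes the mirrored checkpoint keeps the upstream ChatGLM-6B remote-code interface (`AutoModel` with `trust_remote_code=True` exposing a `chat()` helper) and that a CUDA GPU with enough memory is available; the INT4 quantization line is likewise an assumption carried over from the upstream repo.

```python
from modelscope import snapshot_download
from transformers import AutoModel, AutoTokenizer

# Assumption: the mirror ships the same custom modeling code as upstream ChatGLM-6B,
# so trust_remote_code=True exposes the chat() helper.
model_dir = snapshot_download('Genius-Society/chatglm_6b')
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
# Optional (assumption, mirrors the upstream API): INT4 quantization for ~6 GB cards:
# model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).quantize(4).half().cuda()
model = model.eval()

response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```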

## Maintenance

```bash
git clone [email protected]:Genius-Society/chatglm_6b
cd chatglm_6b
```

## Reference

[1] <a href="https://www.modelscope.cn/models/ZhipuAI/ChatGLM-6B">ChatGLM-6B</a>