Commit 965d854 (parent: 5365f6a) by TurboPascal: Update README.md
README.md CHANGED
@@ -8,9 +8,9 @@ pipeline_tag: text-generation
 Llama-zh-base is an open-source project that offers a complete training pipeline for building Chinese large language models, ranging from dataset preparation to tokenization, pre-training, prompt tuning, and the reinforcement learning technique RLHF.
 This is the Llama-zh-base model trained from scratch using the Chinese pretrain corpus in this project. The amount of parameters is about 0.8B.
 
-Usage
+A Llama model pretrained from scratch on 33 GB of Chinese corpus, intended to provide a usable small-to-mid-sized base model. The embedding layer and tokenizer were rebuilt. The model has not yet been instruction-tuned; the parameter count is roughly 0.8B.
 
-[Repo Links](https://github.com/enze5088/Chatterbox/blob/main/docs/model/llama-zh-base.md)
+Project GitHub link: [Repo Links](https://github.com/enze5088/Chatterbox/blob/main/docs/model/llama-zh-base.md)
 
 ## Introduction
 
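The updated description pegs the model at roughly 0.8B parameters. As a rough sanity check on how a Llama-style architecture reaches that scale, the count can be estimated from the standard per-layer parameter breakdown; every config value in this sketch is a hypothetical assumption chosen for illustration, not the project's published configuration.

```python
# Rough Llama-style parameter-count estimate.
# All config values below are hypothetical assumptions for illustration;
# they are NOT the actual Llama-zh-base configuration.
def llama_param_estimate(vocab: int, hidden: int, layers: int, intermediate: int) -> int:
    embed = vocab * hidden            # input token embeddings
    attn = 4 * hidden * hidden        # Q, K, V, O projections
    mlp = 3 * hidden * intermediate   # gate, up, down projections
    norms = 2 * hidden                # two RMSNorm weight vectors per layer
    per_layer = attn + mlp + norms
    final_norm = hidden               # RMSNorm before the LM head
    lm_head = vocab * hidden          # untied output projection
    return embed + layers * per_layer + final_norm + lm_head

# A hypothetical config that lands near the stated scale.
total = llama_param_estimate(vocab=64000, hidden=1536, layers=22, intermediate=4096)
print(f"~{total / 1e9:.2f}B parameters")  # → ~0.82B parameters
```

A rebuilt Chinese tokenizer typically enlarges the vocabulary, and the two `vocab * hidden` terms show why that noticeably moves the total at this model size.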