Update README.md
README.md
@@ -1,6 +1,13 @@
-
-
-
+## 🎉 News
+- [2024-10-11] [New article | PreparedLLM: A "pre-pretraining" framework for efficiently training domain-specific large language models](https://mp.weixin.qq.com/s/ugJQ9tbp6Y87xA3TOWteqw).
+- [2024-08-31] The paper [PreparedLLM: Effective Pre-pretraining Framework for Domain-specific Large Language Models](https://www.tandfonline.com/doi/full/10.1080/20964471.2024.2396159) was accepted by the journal *Big Earth Data*.
+- [2024-08-31] Released the [Chinese-Mistral-7B-Instruct-v0.2](https://huggingface.co/itpossible/Chinese-Mistral-7B-Instruct-v0.2) chat model, with greatly improved language understanding and support for multi-turn dialogue.
+- [2024-06-30] Released the [JiuZhou-Instruct-v0.2](https://huggingface.co/itpossible/JiuZhou-Instruct-v0.2) chat model, with greatly improved language understanding and support for multi-turn dialogue.
+- [2024-04-04] Released [Chinese-Mistral-7B-Instruct-v0.1](https://huggingface.co/itpossible/Chinese-Mistral-7B-Instruct-v0.1).
+- [2024-03-31] Released the [Chinese-Mistral-7B-v0.1](https://huggingface.co/itpossible/Chinese-Mistral-7B) base model.
+
+
+
 JiuZhou, a powerful bilingual LLM with 7 billion parameters developed by the Tsinghua research team.<br>
 Use case: We are using a widely circulated and interesting question.<br>
 Question: 9.11 and 9.9 - which is bigger?<br>