🎉 News
- [2024-10-11] New paper | PreparedLLM: a "pre-pretraining" framework for efficiently training domain-specific large language models.
- [2024-08-31] The paper "PreparedLLM: Effective Pre-pretraining Framework for Domain-specific Large Language Models" has been accepted by the journal Big Earth Data.
- [2024-08-31] Released the Chinese-Mistral-7B-Instruct-v0.2 chat model, with greatly improved language understanding and support for multi-turn dialogue.
- [2024-06-30] Released the JiuZhou-Instruct-v0.2 chat model, with greatly improved language understanding and support for multi-turn dialogue.
- [2024-04-04] Released Chinese-Mistral-7B-Instruct-v0.1.
- [2024-03-31] Released the JiuZhou-base base model.
JiuZhou is a powerful bilingual LLM with 7 billion parameters, developed by the Tsinghua research team.
Use case: we test the models with a widely circulated and interesting question (a reproduction sketch follows the results below).
Question: 9.11 and 9.9 - which is bigger?
· JiuZhou answers correctly.
· ChatGPT, Gemini, Moonshot AI, Qianwen, Mixtral 8x7B, and Llama 3 all answer incorrectly.
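The snippet below is a minimal sketch of how one might reproduce this test with the Hugging Face transformers library. The model ID `itpossible/JiuZhou-Instruct-v0.2` is an assumption used for illustration; substitute the actual checkpoint name published for the JiuZhou-Instruct release, and note that it presumes the tokenizer ships with a chat template.

```python
# Minimal reproduction sketch (assumed model ID; adjust to the released checkpoint).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "itpossible/JiuZhou-Instruct-v0.2"  # assumption, not confirmed by this README

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# The widely circulated test question from the use case above.
messages = [{"role": "user", "content": "9.11 and 9.9 - which is bigger?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Greedy decoding so the comparison is deterministic across runs.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same script can be pointed at other instruct models (e.g. Chinese-Mistral-7B-Instruct-v0.2) by changing `model_id`, which makes it easy to compare answers across the models listed above.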