Update README.md
README.md CHANGED

@@ -7,6 +7,6 @@ license: apache-2.0
 tags:
 - llm
 ---
-A Llama version for
+A Llama version for Nanbeige2-16B-Chat, which can be loaded with LlamaForCausalLM.
 
 Nanbeige-16B is a 16 billion parameter language model developed by Nanbeige LLM Lab. It was pre-trained on 2.5T tokens. The training data includes a large amount of high-quality internet corpus, various books, code, and more. It has achieved good results on various authoritative evaluation datasets.
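Since the added line states that this converted checkpoint can be loaded with LlamaForCausalLM, a minimal loading sketch with the Hugging Face transformers library might look like the following. The repository id, dtype, device placement, and the use of the tokenizer's chat template are assumptions for illustration, not details taken from the commit:

```python
import torch
from transformers import AutoTokenizer, LlamaForCausalLM

# Assumed repository id -- substitute the actual model path for this repo.
model_id = "Nanbeige/Nanbeige2-16B-Chat-Llama"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 16B parameters; reduced precision keeps memory use manageable
    device_map="auto",           # requires the accelerate package
)

# Assumes the tokenizer ships a chat template for this chat-tuned model.
messages = [{"role": "user", "content": "Hello, who are you?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True))
```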