maxwellyu committed on
Commit a9eda3a · verified · 1 Parent(s): 87598a5

Update README.md

Files changed (1):
  1. README.md +4 -4
README.md CHANGED
@@ -6,7 +6,7 @@ library_name: transformers
 
 license: other
 license_name: tencent-license
-license_link: https://huggingface.co/tencent/Hunyuan-7B-Instruct/blob/main/LICENSE.txt
+license_link: https://huggingface.co/tencent/Hunyuan-7B-Instruct-0124/blob/main/LICENSE.txt
 ---
 
 <p align="center">
@@ -14,12 +14,12 @@ license_link: https://huggingface.co/tencent/Hunyuan-7B-Instruct/blob/main/LICEN
 </p><p></p>
 
 <p align="center">
-&nbsp<a href="https://github.com/Tencent/Tencent-Hunyuan-7B"><b>GITHUB</b></a>&nbsp&nbsp
+&nbsp<a href="https://github.com/Tencent/Tencent-Hunyuan-7B-0124"><b>GITHUB</b></a>&nbsp&nbsp
 
 
 ## Model Introduction
 
-The 7B models released by Hunyuan this time: [Hunyuan-7B-Pretrain](https://huggingface.co/tencent/Hunyuan-7B-Pretrain) and [Hunyuan-7B-Instruct](https://huggingface.co/tencent/Hunyuan-7B-Instruct) , use better data allocation and training, have strong performance, and have achieved a good balance between computing and performance. It stands out from many large-scale language models and is currently one of the strongest Chinese 7B Dense models.
+The 7B models released by Hunyuan this time: [Hunyuan-7B-Pretrain](https://huggingface.co/tencent/Hunyuan-7B-Pretrain-0124) and [Hunyuan-7B-Instruct](https://huggingface.co/tencent/Hunyuan-7B-Instruct-0124) , use better data allocation and training, have strong performance, and have achieved a good balance between computing and performance. It stands out from many large-scale language models and is currently one of the strongest Chinese 7B Dense models.
 
 ### Introduction to Technical Advantages
 
@@ -36,7 +36,7 @@ The 7B models released by Hunyuan this time: [Hunyuan-7B-Pretrain](https://huggi
 &nbsp;
 
 ## Related News
-* 2025.1.24 We have open-sourced **Hunyuan-7B-Pretrain** , **Hunyuan-7B-Instruct** on Hugging Face.
+* 2025.1.24 We have open-sourced **Hunyuan-7B-Pretrain-0124** , **Hunyuan-7B-Instruct-0124** on Hugging Face.
 <br>
 
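The substance of this commit is a repository rename: each Hugging Face and GitHub link in the model card gains a `-0124` suffix. A minimal sketch of that mapping in Python (the `renamed` helper is hypothetical and purely illustrative; the repo IDs are taken from the diff above):

```python
# Sketch of the rename this commit applies to the README's model links.
# The repo IDs come from the diff; the helper is illustrative, not a real API.

OLD_REPOS = [
    "tencent/Hunyuan-7B-Pretrain",
    "tencent/Hunyuan-7B-Instruct",
]

def renamed(repo_id: str, suffix: str = "-0124") -> str:
    """Append the snapshot suffix used by the updated model card links."""
    return repo_id + suffix

NEW_REPOS = [renamed(r) for r in OLD_REPOS]
print(NEW_REPOS)
# → ['tencent/Hunyuan-7B-Pretrain-0124', 'tencent/Hunyuan-7B-Instruct-0124']
```

The same suffix is applied to the GitHub link (`Tencent-Hunyuan-7B` → `Tencent-Hunyuan-7B-0124`) and to the `license_link` in the YAML front matter.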