Kaguya-19 committed
Commit 6b4f3f1 (parent: 8c02c7a)

Update README.md

Files changed (1): README.md (+27, -21)

This commit renames the embedding model from MiniCPM-R to RankCPM-E and its companion re-ranker from MiniCPM-RR to RankCPM-R throughout the model card, and adds YAML metadata at the top. The updated README content follows.

---
language:
- zh
- en
base_model: openbmb/MiniCPM-2B-sft-bf16
---

## RankCPM-E

**RankCPM-E** 是面壁智能与清华大学自然语言处理实验室(THUNLP)共同开发的中英双语言文本嵌入模型,有如下特点:

- 出色的中文、英文检索能力。
- 出色的中英跨语言检索能力。

RankCPM-E 基于 [MiniCPM-2B-sft-bf16](https://huggingface.co/openbmb/MiniCPM-2B-sft-bf16) 训练,结构上采取双向注意力和 Weighted Mean Pooling [1]。采取多阶段训练方式,共使用包括开源数据、机造数据、闭源数据在内的约 600 万条训练数据。

欢迎关注 RAG 套件系列:

- 检索模型:[RankCPM-E](https://huggingface.co/openbmb/RankCPM-E)
- 重排模型:[RankCPM-R](https://huggingface.co/openbmb/RankCPM-R)
- 面向 RAG 场景的 LoRA 插件:[MiniCPM3-RAG-LoRA](https://huggingface.co/openbmb/MiniCPM3-RAG-LoRA)

**RankCPM-E** is a bilingual & cross-lingual text embedding model developed by ModelBest Inc. and THUNLP, featuring:

- Exceptional Chinese and English retrieval capabilities.
- Outstanding cross-lingual retrieval capabilities between Chinese and English.

RankCPM-E is trained on top of [MiniCPM-2B-sft-bf16](https://huggingface.co/openbmb/MiniCPM-2B-sft-bf16) and incorporates bidirectional attention and Weighted Mean Pooling [1] in its architecture. The model underwent multi-stage training on approximately 6 million examples, including open-source, synthetic, and proprietary data.

We also invite you to explore the RAG toolkit series:

- Retrieval Model: [RankCPM-E](https://huggingface.co/openbmb/RankCPM-E)
- Re-ranking Model: [RankCPM-R](https://huggingface.co/openbmb/RankCPM-R)
- LoRA Plugin for RAG scenarios: [MiniCPM3-RAG-LoRA](https://huggingface.co/openbmb/MiniCPM3-RAG-LoRA)

[1] Muennighoff, N. (2022). SGPT: GPT Sentence Embeddings for Semantic Search. arXiv preprint arXiv:2202.08904.
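
For reference, the Weighted Mean Pooling of [1] (SGPT) builds a text embedding as a position-weighted average of the final-layer hidden states, so that later tokens contribute more. The formula below is the definition given in [1], shown here only as background:

$$
w_i = \frac{i}{\sum_{j=1}^{S} j}, \qquad v = \sum_{i=1}^{S} w_i \, h_i
$$

where \\(S\\) is the sequence length, \\(h_i\\) the hidden state of token \\(i\\), and \\(v\\) the resulting embedding.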
 
本模型支持 query 侧指令,格式如下:

RankCPM-E supports query-side instructions in the following format:

```
Instruction: {{ instruction }} Query: {{ query }}
```

For example, the instruction can be "Given a claim about climate change, retrieve documents that support or refute it."
 
也可以不提供指令,即采取如下格式:

RankCPM-E also works in instruction-free mode, using the following format:

```
Query: {{ query }}
```
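
As a small illustration (this helper is not part of the model card; the function name and the example queries are placeholders), both input formats can be assembled like this:

```python
# Hypothetical helper (not from the model card): compose the model input string
# in the two formats described above, with or without a query-side instruction.
def format_query(query: str, instruction: str = "") -> str:
    if instruction:
        return f"Instruction: {instruction} Query: {query}"
    return f"Query: {query}"

print(format_query("什么是文本嵌入?"))
print(format_query("The recent warming trend is slower than most climate models have forecast.",
                   "Given a claim about climate change, retrieve documents that support or refute it."))
```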
 
```python
from transformers import AutoModel, AutoTokenizer
import torch
import torch.nn.functional as F

model_name = "openbmb/RankCPM-E"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True, attn_implementation="flash_attention_2", torch_dtype=torch.float16).to("cuda")
model.eval()
```
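
The card's full example continues past this point, but the diff omits the unchanged lines. Below is a rough, illustrative sketch of how the model and tokenizer loaded above might be used to embed and score texts; the `encode` helper, the pooling details, and the sample query and passages are assumptions, not the card's original code:

```python
# Illustrative sketch (assumption, not the card's original example): embed texts
# with position-weighted mean pooling over the last hidden states, then score
# query-passage pairs by the dot product of L2-normalized embeddings.
def encode(texts):
    # If the tokenizer defines no pad token, set one first, e.g.:
    # tokenizer.pad_token = tokenizer.eos_token
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt").to("cuda")
    with torch.no_grad():
        outputs = model(**inputs)
    hidden = outputs.last_hidden_state      # (batch, seq_len, dim); assumes the remote
                                            # code returns standard hidden states
    mask = inputs["attention_mask"]         # (batch, seq_len), 1 for real tokens
    # Position weights 1..seq_len (later tokens weigh more, in the spirit of [1]),
    # zeroed on padding positions and normalized per sequence.
    weights = torch.arange(1, hidden.size(1) + 1, device=hidden.device, dtype=hidden.dtype)
    weights = weights.unsqueeze(0) * mask
    weights = weights / weights.sum(dim=1, keepdim=True)
    embeddings = (hidden * weights.unsqueeze(-1)).sum(dim=1)
    return F.normalize(embeddings, p=2, dim=-1)

queries = ["Query: 中国的首都是哪里?"]          # placeholder inputs, instruction-free format
passages = ["beijing", "shanghai"]
scores = encode(queries) @ encode(passages).T   # higher score = more relevant
print(scores.tolist())
```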
 
### 中文与英文检索结果 CN/EN Retrieval Results

| gte-Qwen2-1.5B-instruct | 71.86 | 58.29 |
| gte-Qwen2-7B-instruct | 76.03 | 60.25 |
| bge-multilingual-gemma2 | 73.73 | 59.24 |
| RankCPM-E | **76.76** | 58.56 |
| RankCPM-E+RankCPM-R | 77.08 | 61.61 |

### 中英跨语言检索结果 CN-EN Cross-lingual Retrieval Results

| gte-multilingual-base (Dense) | 68.2 | 39.46 | 45.86 |
| gte-Qwen2-1.5B-instruct | 68.52 | 49.11 | 45.05 |
| gte-Qwen2-7B-instruct | 68.27 | 49.14 | 49.6 |
| RankCPM-E | **72.95** | **52.65** | **49.95** |
| RankCPM-E+RankCPM-R | 74.33 | 53.21 | 54.12 |

## 许可证 License

- 本仓库中代码依照 [Apache-2.0 协议](https://github.com/OpenBMB/MiniCPM/blob/main/LICENSE)开源。
- RankCPM-E 模型权重的使用则需要遵循 [MiniCPM 模型协议](https://github.com/OpenBMB/MiniCPM/blob/main/MiniCPM%20Model%20License.md)。
- RankCPM-E 模型权重对学术研究完全开放。如需将模型用于商业用途,请填写[此问卷](https://modelbest.feishu.cn/share/base/form/shrcnpV5ZT9EJ6xYjh3Kx0J6v8g)。

* The code in this repo is released under the [Apache-2.0](https://github.com/OpenBMB/MiniCPM/blob/main/LICENSE) License.
* The usage of RankCPM-E model weights must strictly follow the [MiniCPM Model License](https://github.com/OpenBMB/MiniCPM/blob/main/MiniCPM%20Model%20License.md).
* The models and weights of RankCPM-E are completely free for academic research. After filling out a [questionnaire](https://modelbest.feishu.cn/share/base/form/shrcnpV5ZT9EJ6xYjh3Kx0J6v8g) for registration, RankCPM-E weights are also available for free commercial use.