Upload folder using huggingface_hub

- README.md +1 -3
- config_sentence_transformers.json +1 -1
README.md CHANGED

```diff
@@ -1694,9 +1694,7 @@ model-index:
 ---
 
 # Introduction
 
-**This model is built upon [BAAI/bge-en-icl](https://huggingface.co/BAAI/bge-en-icl). The key
-- The inclusion of a default prompt name in the model configuration file. This adjustment is particularly useful when serving the model, as the default prompt will automatically be prepended to each incoming document.
-- Make the last-token pooling for sentence transformers working out-of-box, which is another optimization for model serving. - https://huggingface.co/BAAI/bge-en-icl/discussions/10 @michaelfeil
+**This model is built upon [BAAI/bge-en-icl](https://huggingface.co/BAAI/bge-en-icl). The key difference from the original model is the inclusion of a default prompt name in the model configuration file. This adjustment is particularly useful when serving the model, as the default prompt will automatically be prepended to each incoming document.**
 
 
 <h1 align="center">FlagEmbedding</h1>
```
|
config_sentence_transformers.json CHANGED

```diff
@@ -7,5 +7,5 @@
   "prompts": {
     "query": "<instruct>Given a web search query, retrieve relevant passages that answer the query.\n<query>"
   },
-  "default_prompt_name":
+  "default_prompt_name": null
 }
```
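The config change above controls which prompt, if any, is prepended to documents at encode time. A minimal sketch of that resolution logic follows; `resolve_prompt` is a hypothetical helper written for illustration, not the actual sentence-transformers implementation, and the config dict is trimmed to the two fields the diff touches.

```python
# Sketch of how `prompts` and `default_prompt_name` interact when encoding.
# Assumption: simplified stand-in for the real sentence-transformers logic.

# The shipped config, trimmed to the relevant fields (default_prompt_name
# is null in the JSON above, i.e. None here).
config = {
    "prompts": {
        "query": "<instruct>Given a web search query, retrieve relevant "
                 "passages that answer the query.\n<query>"
    },
    "default_prompt_name": None,
}


def resolve_prompt(config, prompt_name=None):
    """Return the prompt to prepend: an explicit prompt_name wins,
    otherwise fall back to default_prompt_name, otherwise prepend nothing."""
    name = prompt_name if prompt_name is not None else config["default_prompt_name"]
    if name is None:
        return ""
    return config["prompts"][name]


# With default_prompt_name null, nothing is prepended unless the caller
# passes a prompt name explicitly:
assert resolve_prompt(config) == ""
assert resolve_prompt(config, "query").startswith("<instruct>")
```

Setting `"default_prompt_name"` to a prompt key instead of null would make that prompt the automatic prefix for every incoming document, which is the serving convenience described in the README change.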