Update README.md #8
opened by Sushentsev

README.md CHANGED
@@ -42,7 +42,7 @@ These pairs were obtained from various domains and were carefully selected throu
 The embedding model was trained using 512 sequence length, but extrapolates to 8k sequence length (or even longer) thanks to ALiBi.
 This makes our model useful for a range of use cases, especially when processing long documents is needed, including technical question answering and code search.
 
-This model has
+This model has 161 million parameters, which enables fast and memory efficient inference, while delivering impressive performance.
 Additionally, we provide the following embedding models:
 
 - [`jina-embeddings-v2-small-en`](https://huggingface.co/jinaai/jina-embeddings-v2-small-en): 33 million parameters.
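For context on the README text being edited above, here is a minimal usage sketch of how the 512-to-8k extrapolation might be exercised. It assumes the checkpoint ships custom modeling code exposing an `encode()` helper with a `max_length` keyword (as other jina-embeddings-v2 model cards suggest); those names are assumptions, not something this PR confirms.

```python
from transformers import AutoModel

# Sketch, assuming the repo's custom code: trust_remote_code=True is needed
# so that the model class providing encode() is loaded from the hub.
model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en",  # the 161M-parameter model described above
    trust_remote_code=True,
)

# ALiBi lets the model extrapolate beyond its 512-token training length;
# max_length=8192 here mirrors the 8k figure quoted in the README.
docs = [
    "How do I configure retries in the client?",  # technical question answering
    "def connect(host, port): ...",               # code search
]
embeddings = model.encode(docs, max_length=8192)
print(embeddings.shape)  # (2, embedding_dim)
```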