---
license: mit
language:
- en
---

# bge-micro-v2-quant

This is the quantized (INT8) ONNX variant of the [bge-micro-v2](https://huggingface.co/TaylorAI/bge-micro-v2) embeddings model, created with [DeepSparse Optimum](https://github.com/neuralmagic/optimum-deepsparse) for ONNX export/inference and Neural Magic's [Sparsify](https://github.com/neuralmagic/sparsify) for one-shot quantization.
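
The `export` argument on DeepSparse's `SentenceTransformer` wrapper controls whether the checkpoint is (re-)exported to ONNX at load time. As a minimal sketch of the export half of that pipeline, and assuming `export=True` converts the upstream PyTorch checkpoint to ONNX (the INT8 quantization hosted here was applied separately with Sparsify's one-shot flow, which this snippet does not reproduce), the dense model could be exported like this:

```python
from deepsparse.sentence_transformers import SentenceTransformer

# Hypothetical sketch: load the dense upstream checkpoint and export it to ONNX
# at load time. The INT8 weights in this repo were produced afterwards with
# Sparsify's one-shot quantization, which is not shown here.
dense_model = SentenceTransformer('TaylorAI/bge-micro-v2', export=True)

# Sanity check: the exported model should still produce embeddings
print(dense_model.encode(["hello world"]).shape)
```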

Current list of sparse and quantized bge ONNX models:

| Links | Sparsification Method |
| --------------------------------------------------------------------------------------------------- | ---------------------- |
| [zeroshot/bge-large-en-v1.5-sparse](https://huggingface.co/zeroshot/bge-large-en-v1.5-sparse) | Quantization (INT8) & 50% Pruning |
| [zeroshot/bge-large-en-v1.5-quant](https://huggingface.co/zeroshot/bge-large-en-v1.5-quant) | Quantization (INT8) |

To run the model with DeepSparse, first install the Sentence Transformers integration:

```bash
pip install -U deepsparse-nightly[sentence_transformers]
```

```python
from deepsparse.sentence_transformers import SentenceTransformer
model = SentenceTransformer('zeroshot/bge-micro-v2-quant', export=False)

# Sentences we would like to encode
sentences = ['This framework generates embeddings for each input sentence',
             'Sentences are passed as a list of strings.',
             'The quick brown fox jumps over the lazy dog.']

# Sentences are encoded by calling model.encode()
embeddings = model.encode(sentences)

# Print each sentence and the shape of its embedding
for sentence, embedding in zip(sentences, embeddings):
    print("Sentence:", sentence)
    print("Embedding:", embedding.shape)
    print("")
```
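
The embeddings returned by `encode` are plain NumPy arrays, so downstream scoring works the same way as with the dense FP32 model. As a minimal sketch (reusing the model id and example sentences from above, with cosine similarity written directly in NumPy rather than via any particular helper), similarity against a query could be computed like this:

```python
import numpy as np
from deepsparse.sentence_transformers import SentenceTransformer

model = SentenceTransformer('zeroshot/bge-micro-v2-quant', export=False)

query = 'A fast animal leaps over a sleeping one.'
corpus = ['This framework generates embeddings for each input sentence',
          'The quick brown fox jumps over the lazy dog.']

# Encode the query and the candidate sentences into dense vectors
query_embedding = model.encode([query])[0]
corpus_embeddings = model.encode(corpus)

# Cosine similarity: dot product of L2-normalized vectors
def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

for sentence, embedding in zip(corpus, corpus_embeddings):
    print(f"{cosine(query_embedding, embedding):.3f}  {sentence}")
```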

For further details regarding the DeepSparse & Sentence Transformers integration, refer to the [DeepSparse README](https://github.com/neuralmagic/deepsparse/tree/main/src/deepsparse/sentence_transformers).

For general questions on these models and sparsification methods, reach out to the engineering team on our [community Slack](https://join.slack.com/t/discuss-neuralmagic/shared_invite/zt-q1a1cnvo-YBoICSIw3L1dmQpjBeDurQ).

![;)](https://media.giphy.com/media/bYg33GbNbNIVzSrr84/giphy-downsized-large.gif)