## Typesense Public Embedding Models

We maintain a repository of currently supported embedding models, and we welcome contributions from the community. If you have a model that you would like to add to our supported list, you can convert it to the ONNX format and create a Pull Request (PR) to include it.

### Convert a model to ONNX format

#### Converting a Hugging Face Transformers Model

To convert any model from Hugging Face to ONNX format, you can follow the instructions in [this link](https://huggingface.co/docs/transformers/serialization#export-to-onnx) using the `optimum-cli`.
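
For example, assuming `optimum` is installed with its exporter extras, an export typically looks something like the sketch below; the model ID and output directory are placeholders, not a model from this repository:

```bash
# Install the exporter dependencies (assumes a pip-based environment)
pip install "optimum[exporters]"

# Export a Hugging Face model to ONNX.
# The model ID and output folder are example values only.
optimum-cli export onnx --model sentence-transformers/all-MiniLM-L6-v2 all-MiniLM-L6-v2-onnx/
```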

#### Converting a PyTorch Model

If you have a PyTorch model, you can use the `torch.onnx` APIs to convert it to the ONNX format. More information on the conversion process can be found [here](https://pytorch.org/docs/stable/onnx.html).
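
As a rough sketch, a minimal `torch.onnx.export` call might look like the following; the model, input shape, and file name here are invented purely for illustration:

```python
import torch
import torch.nn as nn

# Stand-in model used only for this example; substitute your own module.
model = nn.Sequential(nn.Linear(768, 384), nn.Tanh())
model.eval()

# Dummy input matching the shape the model expects.
dummy_input = torch.randn(1, 768)

# Export to ONNX. dynamic_axes lets the batch dimension vary at inference time.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    opset_version=17,
)
```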

#### Converting a TensorFlow Model

For TensorFlow models, you can utilize the tf2onnx tool to convert them to the ONNX format. Detailed guidance on this conversion can be found [here](https://onnxruntime.ai/docs/tutorials/tf-get-started.html#getting-started-converting-tensorflow-to-onnx).
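
For instance, assuming the model is saved in TensorFlow's SavedModel format and `tf2onnx` is installed, the command-line converter can be invoked roughly as follows; the paths and opset are placeholders:

```bash
# Install the converter (assumes a pip-based environment)
pip install tf2onnx

# Convert a SavedModel directory to ONNX; paths and opset are example values.
python -m tf2onnx.convert --saved-model ./my_saved_model --output model.onnx --opset 17
```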

### Creating model config

Before submitting your ONNX model through a PR, you need to organize the necessary files under a folder with the model's name. Ensure that your model configuration adheres to the following structure:

- **Model File**: The ONNX model file.
- **Vocab File**: The vocabulary file required for the model.
- **Model Config File**: Named `config.json`, this file should contain the following keys (an illustrative example follows the table):

| Key | Description | Optional |
|-----|-------------|----------|
|model_md5| MD5 checksum of model file as string| No |
|vocab_file_name| File name of vocab file| No |
|indexing_prefix| Prefix to be added before embedding documents| Yes |
|query_prefix| Prefix to be added before embedding queries | Yes |
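
As a loose illustration, a `config.json` might look like the sketch below. Every value is a placeholder, and only the keys listed in the table above are shown; make sure your actual config contains all keys required for your model:

```json
{
  "model_md5": "0123456789abcdef0123456789abcdef",
  "vocab_file_name": "vocab.txt",
  "indexing_prefix": "passage: ",
  "query_prefix": "query: "
}
```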

Please make sure that the information in the configuration file is accurate and complete before submitting your PR.

We appreciate your contributions to expand our collection of supported embedding models!