---
license: bsd-2-clause
---
# Utilizing Custom ONNX Models Stored in Hugging Face within HSSM

This guide walks you through using custom ONNX models stored on the Hugging Face Hub within the HSSM (Hierarchical Sequential Sampling Modeling) framework.

## Prerequisites

1. Python 3.8 or later.
2. The HSSM library installed in your Python environment.
3. A pre-trained ONNX model stored on the Hugging Face model hub.

## Step-by-step guide

### Step 1: Import the necessary libraries

```
import pandas as pd
import hssm
import pytensor
import ssms.basic_simulators

pytensor.config.floatX = "float32"
```

### Step 2: Define the HSSM configuration

Define the configuration of your model. Make sure to set `loglik_kind` to `"approx_differentiable"` and to provide the name of the ONNX file in the Hugging Face repository in the `loglik` field.
```
my_hssm = hssm.HSSM(
    data=dataset_lan,
    loglik_kind="approx_differentiable",
    loglik="levy.onnx",
    model="custom",
    model_config={
        "backend": "jax",
        "list_params": ["v", "a", "z", "alpha", "t"],
        "bounds": {
            "v": (-3.0, 3.0),
            "a": (0.3, 3.0),
            "z": (0.1, 0.9),
            "alpha": (1.0, 2.0),
            "t": (1e-3, 2.0),
        },
    },
)
```

This creates an HSSM object `my_hssm` that uses the custom ONNX model `levy.onnx` from the Hugging Face repository. You can then sample from the model:

```
my_hssm.sample(cores=2, draws=500, tune=500, mp_ctx="forkserver")
```
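
HSSM resolves `levy.onnx` from the Hugging Face Hub for you; if you want to fetch the file yourself (for example to inspect it locally), a sketch using `hf_hub_download`, assuming the file lives in the `franklab/HSSM` model repo:

```python
from huggingface_hub import hf_hub_download

# Download levy.onnx from the franklab/HSSM model repo into the local
# Hugging Face cache and return the local file path.
local_path = hf_hub_download(repo_id="franklab/HSSM", filename="levy.onnx")
print(local_path)
```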

# Uploading ONNX Files to a Hugging Face Repository

If your ONNX file is not yet stored in a Hugging Face repository, you can add it by following the steps below:

1. Import the `HfApi` class from `huggingface_hub`:

```
from huggingface_hub import HfApi
```

2. Upload the ONNX file using the `upload_file` method:

```
api = HfApi()
api.upload_file(
    path_or_fileobj="test.onnx",
    path_in_repo="test.onnx",
    repo_id="franklab/HSSM",
    repo_type="model",
    create_pr=True,
)
```
Executing these steps opens a Pull Request (PR) on Hugging Face, which will then be reviewed by a member of our team.

## Creating a Pull Request and a New ONNX Model

1. **Creating a Pull Request on Hugging Face**

   Navigate to the following link: [Hugging Face PR](https://huggingface.co/franklab/HSSM/blob/refs%2Fpr%2F1/test.onnx)

   By doing so, you will **generate a Pull Request on Hugging Face**, which will be reviewed by our team members.

2. **Creating a Custom ONNX Model**

### Establish Network Config and State Dictionary Files in PyTorch

To construct a custom model and save it as an ONNX file, first create a network configuration file and a state dictionary file in PyTorch. Refer to the instructions in the README of the [LANFactory package](LINK_TO_LANFACTORY_PACKAGE).

### Convert the Network Config and State Dictionary Files to ONNX

Once you have generated the network configuration and state dictionary files, you will need to **convert them into ONNX format**.
|