---
license: cc-by-sa-4.0
tags:
- causal-lm
---

# Description

This is a test model exported for my [test script](https://github.com/kazssym/stablelm-study-2).
It was exported from [stabilityai/stablelm-3b-4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t) to ONNX with a [modified](https://github.com/huggingface/optimum/pull/1719) version of Hugging Face Optimum.
The exported model may well have problems.

This model does not include a tokenizer.
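
Because the tokenizer is not bundled, one typical way to try the export is to take the tokenizer from the original StableLM repository and load the ONNX weights through Optimum's ONNX Runtime classes. The snippet below is only a sketch: the local path `./onnx-fp32` is a placeholder for wherever the exported files live, and it assumes they load directly with `ORTModelForCausalLM`.

```python
# Illustrative sketch only; not shipped with this repository.
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForCausalLM

# The tokenizer is taken from the original model repository,
# since this export does not include one.
tokenizer = AutoTokenizer.from_pretrained(
    "stabilityai/stablelm-3b-4e1t", trust_remote_code=True
)

# "./onnx-fp32" is a placeholder for the directory holding the exported ONNX files.
model = ORTModelForCausalLM.from_pretrained("./onnx-fp32")

inputs = tokenizer("The weather today is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```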

# Export command

This model was exported with the following command:

```
optimum-cli export onnx --model stabilityai/stablelm-3b-4e1t --trust-remote-code --device cpu --optimize O1 output/onnx-fp32/
```

Exporting this model requires [Transformers](https://github.com/huggingface/transformers) 4.38 or later.
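
For a quick sanity check before re-running the export, something along these lines (a hedged sketch that only compares version strings with the `packaging` helper) can confirm the installed Transformers is new enough:

```python
# Quick check that the environment meets the Transformers >= 4.38 requirement
# mentioned above before attempting to re-run the export.
from packaging import version
import transformers

required = version.parse("4.38.0")
installed = version.parse(transformers.__version__)
if installed < required:
    raise RuntimeError(
        f"transformers {transformers.__version__} is installed; "
        f"4.38 or later is required to export this model"
    )
print(f"transformers {transformers.__version__} is new enough to export")
```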

# Output from Optimum CLI

```