Commit bc214c2 (parent 2645e4c) by saikatkumardey: Update README.md
It was quantized using [CTranslate2](https://opennmt.net/CTranslate2/guides/transformers.html):

```
ct2-transformers-converter --model MBZUAI/LaMini-Flan-T5-783M --output_dir lamini-flan-t5-783m-int8_float16 --quantization int8_float16
```

# How to use it?

## Clone the model

```
git lfs install
git clone [email protected]:saikatkumardey/lamini-flan-t5-783m_int8_float16
```

## Code example

```python
import ctranslate2
import transformers

model_dir = "lamini-flan-t5-783m_int8_float16"
translator = ctranslate2.Translator(
    model_dir, compute_type="auto", inter_threads=4, intra_threads=4