ericsorides committed on
Commit
2c3ec72
1 Parent(s): 45d3a8b

Create README.md

---
tags:
- text-generation-inference
- whisper
- audio
base_model:
- openai/whisper-medium
---


# Whisper medium with Key-Value-Cache enabled in ONNX fp16 format
- Model creator: [OpenAI](https://huggingface.co/openai)
- Original model: [Whisper Medium](https://huggingface.co/openai/whisper-medium)

<!-- description start -->
## Description

This repo contains the ONNX files for the ONNX conversion of Whisper Medium done by Esperanto Technologies.
The model is in fp16 format and has the key-value cache (KVC) enabled.

<!-- description end -->

## How to download ONNX model and weight files

The easiest way to obtain the model is to clone this whole repo.
Alternatively, you can download the files using the `huggingface-hub` Python library.

```shell
pip3 install "huggingface-hub>=0.17.1"
```

Then you can download any individual model file to the current directory, at high speed, with a command like this:

```shell
huggingface-cli download Esperanto/whisper-medium-kvc-fp16-onnx --local-dir whisper-medium-kvc-fp16-onnx --local-dir-use-symlinks False
```

For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).

## How to run from Python code using ONNXRuntime

This model can easily be run on a CPU using [ONNXRuntime](https://onnxruntime.ai/).

Scripts showing how to run these models will be provided soon.