Update hf_benchmark_example.py
hf_benchmark_example.py (CHANGED, +12 -0)
@@ -1,3 +1,15 @@
+'''
+cmd example
+You need a file called "sample.txt" (the default path) containing text to draw prompt tokens from, or pass --text_file "path/to/text.txt" to point at another text file.
+You can use the attached "sample.txt" file, which contains one of Deci's blog posts, as a prompt.
+
+# Run this and record tokens per second (652 tokens per second on an A10 for DeciLM-6b)
+python time_hf.py --model Deci/DeciLM-6b
+
+# Run this and record tokens per second (136 tokens per second on an A10 for meta-llama/Llama-2-7b-hf); CUDA OOM above batch size 8
+python time_hf.py --model meta-llama/Llama-2-7b-hf --batch_size 8
+'''
+
 import json
 
 import datasets