---
license: mit
---
Original source of the GGUF file: [juanjgit's Orca Mini 3B GGUF Model Card](https://huggingface.co/juanjgit/orca_mini_3B-GGUF)
Original model: [Pankaj Mathur's Orca Mini 3B](https://huggingface.co/psmathur/orca_mini_3b)
# Prompt format
```
prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"
```
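For example, with the system message and question used in the script below substituted into the template, the text sent to the model looks like this:
```
### System:
You are an AI assistant that follows instruction extremely well. Help as much as you can. Give short answers.

### User:
Which is the biggest city of India?

### Response:
```
The model generates its answer as a continuation after the final `### Response:` line.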
# Usage
Ensure that the [ctransformers](https://pypi.org/project/ctransformers/) Python library is installed:
```
pip install ctransformers
```
Here is a self-contained Python script to get you started:
```
from ctransformers import AutoModelForCausalLM

# Download the GGUF file from the Hugging Face Hub and load it for CPU inference.
llm = AutoModelForCausalLM.from_pretrained(
    "zoltanctoth/orca_mini_3B-GGUF", model_file="orca-mini-3b.q4_0.gguf"
)

system = "You are an AI assistant that follows instruction extremely well. Help as much as you can. Give short answers."
instruction = "Which is the biggest city of India?"
prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"

# Stream the generated tokens and print them as they arrive.
for i in llm(prompt, stream=True):
    print(i, end="", flush=True)
print()
```
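If you want to ask several questions with one loaded model, you can wrap the prompt assembly and generation in a small helper. This is a minimal sketch: the `ask` function name is ours, and it uses the non-streaming call, which returns the complete response as a single string.
```
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "zoltanctoth/orca_mini_3B-GGUF", model_file="orca-mini-3b.q4_0.gguf"
)

system = "You are an AI assistant that follows instruction extremely well. Help as much as you can. Give short answers."


def ask(instruction: str) -> str:
    """Build the Orca Mini prompt and return the model's full answer."""
    prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"
    # Without stream=True, the call returns the whole completion at once.
    return llm(prompt)


print(ask("Which is the biggest city of India?"))
print(ask("What is the capital of France?"))
```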