---
license: mit
---

Original source of the GGUF file: [juanjgit's Orca Mini 3B GGUF Model Card](https://huggingface.co/juanjgit/orca_mini_3B-GGUF)

Original model: [Pankaj Mathur's Orca Mini 3B](https://huggingface.co/psmathur/orca_mini_3b).

# Prompt format

```
prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"
```

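For illustration, here is how the template expands once the placeholders are filled in. The system message and question below are hypothetical examples, not part of the model card:

```python
# Illustrative values; substitute your own system message and instruction.
system = "You are a helpful assistant. Give short answers."
instruction = "What is the capital of France?"

# Fill in the Orca Mini prompt template.
prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"
print(prompt)
```

The model completes the text after the final `### Response:` marker, so the prompt should always end with that header.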
# Usage

Ensure that the [ctransformers](https://pypi.org/project/ctransformers/) Python library is installed:

```
pip install ctransformers
```

Here is a self-contained Python script to get you started:

```
from ctransformers import AutoModelForCausalLM

# Download the quantized GGUF model from the Hugging Face Hub and load it.
llm = AutoModelForCausalLM.from_pretrained(
    "zoltanctoth/orca_mini_3B-GGUF", model_file="orca-mini-3b.q4_0.gguf"
)

system = "You are an AI assistant that follows instruction extremely well. Help as much as you can. Give short answers."
instruction = "Which is the biggest city of India?"

prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"

# Stream the response token by token as it is generated.
for i in llm(prompt, stream=True):
    print(i, end="", flush=True)
print()
```