## Installation

```
%%capture
!CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python
```

```
# GPU llama-cpp-python
!CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir --verbose
```

```
%%capture
!pip install huggingface-hub hf-transfer
```
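
After the reinstall it is worth confirming that the wheel was actually built with CUDA support rather than the default CPU-only build. The quick check below is optional and not part of the original card; it assumes a reasonably recent llama-cpp-python that exposes `llama_supports_gpu_offload`:

```
import llama_cpp

# If the second line prints False, the CPU-only wheel is still installed and
# the CMAKE_ARGS reinstall above needs to be repeated.
print(llama_cpp.__version__)
print(llama_cpp.llama_supports_gpu_offload())
```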

```
import os
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"

!huggingface-cli download \
  ruslanmv/Medical-Llama3-8B-GGUF \
  medical-llama3-8b.Q5_K_M.gguf \
  --local-dir . \
  --local-dir-use-symlinks False

MODEL_PATH = "/content/medical-llama3-8b.Q5_K_M.gguf"
```
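
Equivalently, the file can be fetched without shelling out to `huggingface-cli`. This is an optional pure-Python sketch, not part of the original card, using `huggingface_hub.hf_hub_download`; it returns the local path, which can stand in for the hard-coded `MODEL_PATH` above:

```
from huggingface_hub import hf_hub_download

# Fetches the same quantized file; hf_transfer is used automatically because
# HF_HUB_ENABLE_HF_TRANSFER=1 was set above.
MODEL_PATH = hf_hub_download(
    repo_id="ruslanmv/Medical-Llama3-8B-GGUF",
    filename="medical-llama3-8b.Q5_K_M.gguf",
    local_dir=".",
)
print(MODEL_PATH)
```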

## Example of use

```
from llama_cpp import Llama

# Llama-2-style instruction tags used by this card's prompt format
B_INST, E_INST = "<s>[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

DEFAULT_SYSTEM_PROMPT = """\
You are an AI Medical Chatbot Assistant equipped with a wealth of medical knowledge derived from extensive datasets. I aim to provide comprehensive and informative responses to your inquiries. However, please note that while I strive for accuracy and keep answers short, my responses should not replace professional medical advice.
If a question does not make any sense, or is not factually coherent, explain why instead of answering something incorrect. If you don't know the answer to a question, please don't share false information."""

SYSTEM_PROMPT = B_SYS + DEFAULT_SYSTEM_PROMPT + E_SYS

def create_prompt(user_query):
    instruction = f"User asks: {user_query}\n"
    prompt = B_INST + SYSTEM_PROMPT + instruction + E_INST
    return prompt.strip()


user_query = "I'm a 35-year-old male experiencing symptoms like fatigue, increased sensitivity to cold, and dry, itchy skin. Could these be indicative of hypothyroidism?"
prompt = create_prompt(user_query)
print(prompt)

# MODEL_PATH was set in the download step above; n_gpu_layers=-1 offloads all layers to the GPU
llm = Llama(model_path=MODEL_PATH, n_gpu_layers=-1)
result = llm(
    prompt=prompt,
    max_tokens=100,
    echo=False
)
print(result['choices'][0]['text'])
```

The model answers along these lines (the sample is truncated by `max_tokens=100`):

```
Hi, thank you for your query.
Hypothyroidism is characterized by fatigue, sensitivity to cold, weight gain, depression, hair loss and mental dullness. I would suggest that you get a complete blood count with thyroid profile including TSH (thyroid stimulating hormone), free thyroxine level, and anti-thyroglobulin antibodies. These tests will help in establishing the diagnosis of hypothyroidism.
If there is no family history of autoimmune disorders, then it might be due
```
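
The example above builds a Llama-2-style `[INST]`/`<<SYS>>` prompt by hand. llama-cpp-python also exposes an OpenAI-style chat API that applies whatever chat template is embedded in the GGUF metadata. The following is an alternative sketch, not taken from the original card; it assumes the GGUF file ships a chat template and reuses `MODEL_PATH`, `DEFAULT_SYSTEM_PROMPT`, and `user_query` from the example:

```
from llama_cpp import Llama

llm = Llama(model_path=MODEL_PATH, n_gpu_layers=-1)

# Chat-style call: the prompt template comes from the GGUF metadata when the
# file ships one, so no manual [INST] / <<SYS>> bookkeeping is needed.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": DEFAULT_SYSTEM_PROMPT},
        {"role": "user", "content": user_query},
    ],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```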