Update README.md
README.md (CHANGED)
---
license: mit
title: 'Medical Document Assistant APP with LLM RAG framework'
sdk: docker
emoji: 📚
colorFrom: blue
colorTo: red
pinned: false
short_description: Search medical terms among uploaded documents
---

## Necessary resources

### Model (must be downloaded to the local ai_workshop folder)
Llama 2 Model (quantized by TheBloke): https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/blob/main/llama-2-7b-chat.ggmlv3.q8_0.bin
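If you prefer to script the download, the snippet below is a minimal sketch using `hf_hub_download` from the `huggingface_hub` package (assumed to be installed; it is not part of the commands listed in this README). Downloading the file manually from the link above works just as well.

```python
# Optional download helper (assumes the huggingface_hub package is available).
from huggingface_hub import hf_hub_download

# Run this from inside the ai_workshop folder so the model file lands next to model.py.
hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGML",
    filename="llama-2-7b-chat.ggmlv3.q8_0.bin",
    local_dir=".",
)
```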

### License and other references
The code in all scripts is subject to the license of 96harsh52/LLaMa_2_chatbot (https://github.com/96harsh52/LLaMa_2_chatbot)
YouTube instruction: https://www.youtube.com/watch?v=kXuHxI5ZcG0&list=PLrLEqwuz-mRIdQrfeCjeCyFZ-Pl6ffPIN&index=18

Llama 2 HF Model (Original One): https://huggingface.co/meta-llama
Chainlit docs: https://github.com/Chainlit/chainlit

## Create virtual environment

1. Create the virtual env:
>`cd ai_workshop`
>`python -m venv langchain`

2. Activate the virtual env:
>`langchain\Scripts\activate`

*NOTE*: if you see a red warning in the cmd terminal saying "running scripts is disabled on this system", use PowerShell to set up the API server:
1. open PowerShell
> `Set-ExecutionPolicy Unrestricted -Scope Process`
2. activate the virtual env as in the previous steps

3. Install requirements.txt:
> `python -m ensurepip --upgrade`
> `python -m pip install --upgrade setuptools`
> `python -m pip install -r requirements.txt`

## Create local vector storage database

After activating the virtual environment, run `python .\ingest.py`
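For orientation, here is a minimal sketch of the kind of ingestion step `ingest.py` performs; the actual script in this repo is authoritative. It assumes source PDFs under a `data/` folder, the pre-0.1 `langchain` import paths used by the referenced 96harsh52/LLaMa_2_chatbot code, and the `sentence-transformers/all-MiniLM-L6-v2` embedding model (all assumptions, not guarantees about this repo).

```python
# Illustrative sketch only -- the actual ingest.py in this repo is authoritative.
from langchain.document_loaders import DirectoryLoader, PyPDFLoader
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import FAISS

DATA_PATH = "data/"                      # assumed location of the source PDFs
DB_FAISS_PATH = "vectorstore/db_faiss"   # folder the chatbot reads from

# Load every PDF in the data folder
documents = DirectoryLoader(DATA_PATH, glob="*.pdf", loader_cls=PyPDFLoader).load()

# Split into overlapping chunks so retrieval returns focused passages
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(documents)

# Embed the chunks and persist a FAISS index under vectorstore/db_faiss
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
FAISS.from_documents(chunks, embeddings).save_local(DB_FAISS_PATH)
```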

## Set up the Medical chatbot server with Chainlit

After the "vectorstore/db_faiss" database folder has been set up, run `chainlit run .\model.py > logs.txt`
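As a rough guide, a Chainlit entry point like `model.py` typically wires the quantized Llama 2 model to the FAISS store along these lines. This is only a sketch under the same assumptions as above (pre-0.1 langchain imports, ctransformers backend, MiniLM embeddings); the real `model.py` is authoritative, and its prompt handling and parameters may differ.

```python
# Illustrative sketch only -- the actual model.py in this repo is authoritative.
import chainlit as cl
from langchain.chains import RetrievalQA
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.llms import CTransformers
from langchain.vectorstores import FAISS

DB_FAISS_PATH = "vectorstore/db_faiss"

def build_qa_chain():
    # Local quantized Llama 2 served through the ctransformers backend (CPU only)
    llm = CTransformers(
        model="llama-2-7b-chat.ggmlv3.q8_0.bin",
        model_type="llama",
        config={"max_new_tokens": 512, "temperature": 0.5},
    )
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    db = FAISS.load_local(DB_FAISS_PATH, embeddings)
    # "stuff" chain: retrieved chunks are pasted into the prompt as context
    return RetrievalQA.from_chain_type(
        llm=llm,
        chain_type="stuff",
        retriever=db.as_retriever(search_kwargs={"k": 2}),
    )

@cl.on_chat_start
async def start():
    cl.user_session.set("chain", build_qa_chain())
    await cl.Message(content="Hi, ask me about the uploaded medical documents.").send()

@cl.on_message
async def answer(message: cl.Message):
    chain = cl.user_session.get("chain")
    # Run the synchronous chain without blocking Chainlit's event loop
    result = await cl.make_async(chain)({"query": message.content})
    await cl.Message(content=result["result"]).send()
```

Chainlit serves the chat UI on http://localhost:8000 by default; the `> logs.txt` redirection in the command above simply captures the server output in a file.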