---
license: mit
title: 'Medical Document Assistant App with LLM RAG framework'
sdk: docker
emoji: πŸ“š
colorFrom: blue
colorTo: red
pinned: false
short_description: Search medical terms among uploaded documents
app_port: 8080
---
## Necessary resources
### Model must be downloaded to the local ai_workshop folder:
Llama 2 model (quantized GGML version by TheBloke): https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/blob/main/llama-2-7b-chat.ggmlv3.q8_0.bin
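
If you prefer to script the download instead of fetching the file from the browser, a minimal sketch using the `huggingface_hub` client could look like the following. This helper is not part of the repo, and `huggingface_hub` is an assumed extra dependency:

```python
# Sketch: fetch the quantized GGML model into the ai_workshop folder.
# Assumes `pip install huggingface_hub`; this script is not part of the repo.
from huggingface_hub import hf_hub_download

hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGML",
    filename="llama-2-7b-chat.ggmlv3.q8_0.bin",
    local_dir=".",  # run from inside ai_workshop so the file lands there
)
```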
### License and other references
The code in all scripts is subject to the license of 96harsh52/LLaMa_2_chatbot (https://github.com/96harsh52/LLaMa_2_chatbot)
YouTube tutorial (https://www.youtube.com/watch?v=kXuHxI5ZcG0&list=PLrLEqwuz-mRIdQrfeCjeCyFZ-Pl6ffPIN&index=18)
Llama 2 HF model (original): https://huggingface.co/meta-llama
Chainlit docs: https://github.com/Chainlit/chainlit
## Create virtual environment
1. Create the virtual env:
>`cd ai_workshop`
>`python -m venv langchain`
2. Activate the virtual env:
>`langchain\Scripts\activate` (Windows cmd; on Linux/macOS use `source langchain/bin/activate`)
*NOTE: if you see a red warning in the cmd terminal saying "running scripts is disabled on this system", use PowerShell to set up the server:*
1. Open PowerShell
> `Set-ExecutionPolicy Unrestricted -Scope Process`
2. Activate the virtual env as in the previous steps
3. Install requirements.txt (a quick import check is sketched after this list):
> `python -m ensurepip --upgrade`
> `python -m pip install --upgrade setuptools`
> `python -m pip install -r requirements.txt`
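
Once the install finishes, an optional sanity check can confirm the environment is usable. It assumes requirements.txt pulls in chainlit and langchain, which `ingest.py` and `model.py` rely on:

```python
# Optional sanity check: run inside the activated "langchain" venv.
# Assumes requirements.txt installs chainlit and langchain.
from importlib.metadata import version

for pkg in ("chainlit", "langchain"):
    print(pkg, version(pkg))
```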
## Create the local vector storage database
After activating the virtual environment, run `python .\ingest.py` to build the FAISS index under "vectorstore/db_faiss".
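
For orientation, ingest scripts in this kind of RAG setup usually follow the LangChain + FAISS pattern sketched below. This is not the repo's actual `ingest.py`: the `data/` source folder, chunking parameters, and embedding model name are assumptions; only the `vectorstore/db_faiss` output path comes from this README.

```python
# Sketch of a typical ingest script for this kind of RAG app (assumptions noted inline).
from langchain.document_loaders import DirectoryLoader, PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS

DATA_PATH = "data/"                     # assumed folder holding the medical PDFs
DB_FAISS_PATH = "vectorstore/db_faiss"  # matches the path used later by model.py

def create_vector_db():
    # Load every PDF in the data folder.
    documents = DirectoryLoader(DATA_PATH, glob="*.pdf", loader_cls=PyPDFLoader).load()
    # Split into overlapping chunks so retrieved context fits the LLM context window.
    chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(documents)
    # Embed with a small sentence-transformers model (assumed choice) and build the FAISS index.
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    FAISS.from_documents(chunks, embeddings).save_local(DB_FAISS_PATH)

if __name__ == "__main__":
    create_vector_db()
```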
## Set up the medical chatbot server with Chainlit
Once the "vectorstore/db_faiss" database folder has been created, run `chainlit run .\model.py > logs.txt` to start the server (console output is redirected to logs.txt).
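
For reference, a heavily simplified sketch of a Chainlit entry point in the spirit of `model.py` is shown below. It is an assumption-laden outline, not the repo's actual code: the CTransformers settings, retriever parameters, and embedding model are guesses, and only the FAISS path and GGML model filename come from this README.

```python
# Sketch of a minimal Chainlit RAG entry point (assumptions noted inline);
# the real model.py is based on 96harsh52/LLaMa_2_chatbot and may differ.
import chainlit as cl
from langchain.chains import RetrievalQA
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.llms import CTransformers
from langchain.vectorstores import FAISS

DB_FAISS_PATH = "vectorstore/db_faiss"  # built by ingest.py

def build_chain():
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    # Newer LangChain releases may also require allow_dangerous_deserialization=True here.
    db = FAISS.load_local(DB_FAISS_PATH, embeddings)
    llm = CTransformers(
        model="llama-2-7b-chat.ggmlv3.q8_0.bin",  # the quantized model downloaded above
        model_type="llama",
        config={"max_new_tokens": 512, "temperature": 0.5},
    )
    return RetrievalQA.from_chain_type(
        llm=llm,
        chain_type="stuff",
        retriever=db.as_retriever(search_kwargs={"k": 2}),
    )

@cl.on_chat_start
async def start():
    # Build the chain once per session and stash it for later messages.
    cl.user_session.set("chain", build_chain())
    await cl.Message(content="Hi, ask me about the uploaded medical documents.").send()

@cl.on_message
async def answer(message: cl.Message):
    chain = cl.user_session.get("chain")
    # Run the blocking chain call off the event loop.
    result = await cl.make_async(chain.run)(message.content)
    await cl.Message(content=result).send()
```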