---
title: LanggraphAgenticAI
emoji: 🚨
colorFrom: blue
colorTo: red
sdk: streamlit
sdk_version: 1.42.0
app_file: app.py
pinned: false
---
# Project Setup
## Steps
### 1. Open a New Folder
### 2. Create a New Virtual Environment
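One common way, assuming Python 3's built-in `venv` module:

```sh
python3 -m venv venv     # on Windows: python -m venv venv
. venv/bin/activate      # on Windows: venv\Scripts\activate
```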
### 3. Install Requirements
```sh
pip install -r requirements.txt
```
### 4. Define Folder Structure & Commit Code to GitHub
#### a. Create a New Repository
```sh
git init
```
#### b. Add `.gitignore` and Include `venv/`
#### c. Commit One File to Establish Connection with GitHub
#### d. Verify Initial Commit in GitHub
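The steps above can be sketched as follows (the `mktemp` line only keeps the example self-contained; the remote URL is a placeholder):

```sh
cd "$(mktemp -d)"    # scratch folder for this example; run the rest from your project root
git init
printf 'venv/\n.env\n' > .gitignore    # keep the virtual env and secrets out of version control
git add .gitignore
git -c user.name=demo -c user.email=demo@example.com commit -m "initial commit"
# connect your GitHub repository (URL is a placeholder):
# git remote add origin https://github.com/<user>/<repo>.git
# git push -u origin main
```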
### 5. Create `src` Folder and Make it a Package
```sh
mkdir src
touch src/__init__.py
```
The empty `__init__.py` file marks `src/` as a Python package.
### 6. Define Project Structure
Inside `src/`, create a folder `langgraphagenticai/` and add `__init__.py`. Inside it, create:
- `LLMs/` - `__init__.py`
- `Nodes/` - `__init__.py`
- `Graph/` - `__init__.py`
- `State/` - `__init__.py`
- `Tools/` - `__init__.py`
- `Vectorstore/` - `__init__.py`
- `ui/` - `__init__.py`
- Lastly, add `main.py`
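The whole layout can be scaffolded in one go, e.g. (the `mktemp` line only keeps the example self-contained):

```sh
cd "$(mktemp -d)"    # scratch folder for this example; run the rest from your project root
mkdir -p src/langgraphagenticai
touch src/__init__.py src/langgraphagenticai/__init__.py src/langgraphagenticai/main.py
for d in LLMs Nodes Graph State Tools Vectorstore ui; do
    mkdir -p "src/langgraphagenticai/$d"
    touch "src/langgraphagenticai/$d/__init__.py"
done
```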
### 7. Create `app.py` Outside `src/`
This is the main execution entry point.
## Start Writing Code
### 8. LLM Setup
#### a. Inside `LLMs/`, create `qroqllm.py` to load the LLM selected via the Streamlit UI.
### 9. UI Setup
#### a. Inside `ui/`, create:
- `loadui.py`
- `display_result.py`
#### b. Create a Config File
Inside `ui/`, create `uiconfigfile.ini` (plain text key-value pairs).
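For example, `uiconfigfile.ini` might hold the sidebar options (all keys and values below are illustrative, not the repo's actual config):

```ini
[DEFAULT]
PAGE_TITLE = LangGraph: Build Stateful Agentic AI
LLM_OPTIONS = Groq
USECASE_OPTIONS = Basic Chatbot, Chatbot with Tool
GROQ_MODEL_OPTIONS = llama3-8b-8192, gemma2-9b-it
```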
#### c. Create a Config Parser
Inside `ui/`, create `uiconfigfile.py` to read values from `uiconfigfile.ini`.
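A minimal sketch of such a parser using the standard library's `configparser` (method names are illustrative; the demo writes a throwaway ini file so the snippet is self-contained):

```python
import tempfile
from configparser import ConfigParser
from pathlib import Path

class Config:
    """Reads UI options from an ini file (illustrative sketch)."""
    def __init__(self, config_file="uiconfigfile.ini"):
        self.config = ConfigParser()
        self.config.read(config_file)

    def get_page_title(self):
        return self.config["DEFAULT"].get("PAGE_TITLE", "")

    def get_usecase_options(self):
        # comma-separated values become a list for the sidebar selectbox
        raw = self.config["DEFAULT"].get("USECASE_OPTIONS", "")
        return [v.strip() for v in raw.split(",") if v.strip()]

# demo with a temporary ini file
sample = "[DEFAULT]\nPAGE_TITLE = LangGraph Demo\nUSECASE_OPTIONS = Basic Chatbot, Chatbot with Tool\n"
path = Path(tempfile.mkdtemp()) / "uiconfigfile.ini"
path.write_text(sample)
cfg = Config(path)
print(cfg.get_page_title())        # LangGraph Demo
print(cfg.get_usecase_options())   # ['Basic Chatbot', 'Chatbot with Tool']
```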
#### d. Implement `loadui.py`
```python
from src.langgraphagenticai.ui.uiconfigfile import Config
```
- Load sidebar options from the config file
- Store session state
- Load UI components (left & right sections)
### 10. Running the UI
Run the UI to check if it loads correctly:
```sh
streamlit run app.py
```
### 11. Implement `main.py`
#### a. Execution starts here based on user choices.
#### b. Load the LLM in `qroqllm.py`.
#### c. Implement `graph_builder.py` inside `Graph/`.
- Import `State` inside `graph_builder.py`
#### d. Implement `Nodes`
- Start with `BasicChatbotNode`.
```python
from src.langgraphagenticai.nodes.basic_chatbot_node import BasicChatbotNode
```
- Define edges, start, and end functions.
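Shape-wise, a node is just a callable that takes the shared state and returns an updated state. A minimal sketch with the LLM replaced by a stub (all names are illustrative, not the repo's actual API):

```python
from typing import TypedDict

class State(TypedDict):
    messages: list  # conversation history

class BasicChatbotNode:
    """Node wrapper: takes the shared state, returns an updated state."""
    def __init__(self, llm):
        self.llm = llm

    def process(self, state: State) -> State:
        reply = self.llm(state["messages"])                 # model call
        return {"messages": state["messages"] + [reply]}    # append the reply

# stub LLM so the sketch runs without an API key
echo_llm = lambda msgs: f"echo: {msgs[-1]}"
node = BasicChatbotNode(echo_llm)
out = node.process({"messages": ["hello"]})
print(out["messages"])   # ['hello', 'echo: hello']
```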
#### e. Connect Graph in `main.py`
```python
from src.langgraphagenticai.graph.graph_builder import GraphBuilder
```
- Invoke the graph based on use case.
- Compile and stream results.
#### f. Update Display Results
Modify `display_result.py` to update UI results and call it in `main.py`.
```python
from src.langgraphagenticai.ui.streamlitui.display_result import DisplayResultStreamlit
```
### 12. Run the Application
```sh
streamlit run app.py
```
# 🚀 CI/CD Setup: GitHub to Hugging Face
## 🔹 1. Set Up CI/CD Between GitHub and Hugging Face
- **Create** a new Space on Hugging Face.
- A **unique URL** will be generated for your Space.
- Open **`main.yml`** and update the **last line**:
  - Replace it with your Hugging Face Space **unique URL** (the push command authenticates with `$HF_TOKEN`, stored as a GitHub Actions secret).
- **Update the README**:
  - The **first section** of the README must contain the **Space configuration** (the YAML block at the top of this file, which sets `sdk: streamlit`).
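A common pattern for this workflow file is sketched below; the username and Space name are placeholders, and `HF_TOKEN` must be added as a repository secret on GitHub:

```yaml
# .github/workflows/main.yml
name: Sync to Hugging Face Space
on:
  push:
    branches: [main]
jobs:
  sync-to-hub:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history, required for a plain git push
      - name: Push to Hugging Face
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
        run: git push https://<hf-username>:$HF_TOKEN@huggingface.co/spaces/<hf-username>/<space-name> main
```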
## 🔹 2. Deploy on Streamlit
- **Go to the Streamlit Community Cloud site** and add your GitHub repo and branch details.
- Deployment example:
🔗 [LangGraph Streamlit App](https://langgraphe2e-4cg4bxcc3ysrhwrmregluy.streamlit.app/)
---
Your CI/CD pipeline should now be set up for seamless deployment! 🚀✨
---
# Chatbot with Tool Integration
## 🚀 Steps to Implement
### 🔹 1. Add a Search Tool
- Navigate to the `Tools/` folder.
- **Create** a new file: `search_tool.py`.
- **Add Tavily** as a search tool inside this file.
### 🔹 2. Create a Chatbot with Tool Node
- Navigate to the `Nodes/` folder.
- **Create** a new file: `chatbot_with_tool_node.py`.
- Implement the chatbot logic with tool integration inside this file.
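Framework details aside, the node's control flow can be sketched with stubs: the model either answers directly or requests a tool, and the node executes the tool and appends its result. All names below are illustrative, not the repo's actual API:

```python
class ChatbotWithToolNode:
    """If the model asks for a tool, run it and append the result."""
    def __init__(self, llm, tools):
        self.llm = llm
        self.tools = {t.__name__: t for t in tools}

    def process(self, state):
        reply = self.llm(state["messages"])
        if isinstance(reply, dict) and "tool" in reply:           # model requested a tool
            result = self.tools[reply["tool"]](reply["query"])    # execute it
            reply = f"[{reply['tool']}] {result}"
        return {"messages": state["messages"] + [reply]}

def search(query):   # stand-in for the Tavily search tool
    return f"top result for '{query}'"

# stub LLM that always requests the search tool
stub_llm = lambda msgs: {"tool": "search", "query": msgs[-1]}
node = ChatbotWithToolNode(stub_llm, [search])
out = node.process({"messages": ["latest langgraph release"]})
print(out["messages"][-1])   # [search] top result for 'latest langgraph release'
```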
### 🔹 3. Modify Graph Builder
- Open `graph_builder.py`.
- **Edit** the file to include tool functionality.
### 🔹 4. Update the UI
- Modify the following UI components:
  - `loadui.py`
  - `uiconfigfile.ini`
  - `display_result.py`
### 🔹 5. Run the Application
To start the chatbot, run the following command:
```sh
streamlit run app.py
```
🚀 Now your chatbot with tool integration should be live!
## Output Screenshots
Streamlit UI interface showcasing chatbot utilizing tools
