Luigi committed · Commit 5281d1a · 1 Parent(s): 59a2ba9

update readme

Files changed (1)
  1. README.md +110 -17
README.md CHANGED
@@ -10,31 +10,124 @@ pinned: false
  license: mit
  ---

- # RasaBot + Gradio v4 Demo on HuggingFace Spaces (CPU)
-
- This Space runs a tiny Rasa 2.8.3 assistant inside a **Gradio v4** chat UI on a **CPU** plan.
-
- ## Files
-
- - **Dockerfile**
-   Python 3.8 → installs Rasa 2.8.3 + Gradio v4 → `rasa init` → `rasa train` → launches `app.py`.
-
- - **requirements.txt**
-   Pins Rasa and Gradio to v4-compatible versions.
-
- - **app.py**
-   Loads the Rasa model and serves a chat UI via Gradio Blocks.
-
- - **README.md** (this file)
-
- ## Deploy on HuggingFace Spaces
-
- 1. Create a new Space with **Hardware: CPU** and **Environment: Docker**.
- 2. Push these four files to your repo.
- 3. The Space will build, train, and expose the Gradio chat at `/`.
-
- ## Local Testing
-
- ```bash
- docker build -t rasa-gradio-v4 .
- docker run -it --rm -p 7860:7860 rasa-gradio-v4
+ # RasaBot Local Deployment Instructions
+
+ This guide provides instructions for running RasaBot locally on your machine. The setup consists of two distinct microservices:
+
+ * **Rasa Main Service** (requires Python 3.8)
+ * **Classifier Backend** (requires Python 3.10)
+
+ ---
+
+ ## Prerequisites
+
+ * Docker
+ * Docker Compose
+
+ Ensure Docker and Docker Compose are installed and running on your system.
+
+ ---
+
+ ## Directory Structure
+
+ Your directory should resemble the following structure:
+
+ ```text
+ RasaBot/
+ ├── actions/
+ │   └── actions.py
+ ├── app.py
+ ├── build.sh
+ ├── classifier/
+ │   ├── classifier.py
+ │   ├── Dockerfile
+ │   └── requirements.txt
+ ├── config.yml
+ ├── custom_components/
+ │   └── llm_intent_classifier_client.py
+ ├── data/
+ │   ├── nlu.yml
+ │   └── rules.yml
+ ├── Dockerfile
+ ├── domain.yml
+ ├── endpoints.yml
+ ├── envs.sh
+ ├── models/
+ ├── README.md
+ ├── requirements.txt
+ ├── run_classifier.sh
+ └── run_rasa.sh
+ ```
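+
+ The `custom_components/llm_intent_classifier_client.py` module is presumably what lets the Rasa NLU pipeline hand intent classification off to the classifier backend. As a rough illustration of how such a component is wired up in Rasa 2.x (the class name `LLMIntentClassifierClient` and the `classifier_url` option are assumptions for this sketch, not taken from the repository), `config.yml` would reference it by module path:
+
+ ```yaml
+ # config.yml (illustrative excerpt; the component class name and its options are assumptions)
+ language: en
+ pipeline:
+   - name: WhitespaceTokenizer
+   - name: custom_components.llm_intent_classifier_client.LLMIntentClassifierClient
+     classifier_url: "http://localhost:8000"
+ ```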
+
+ ---
+
+ ## Build Docker Images
+
+ Run the provided build script to build both Docker images:
+
+ ```bash
+ sh build.sh
+ ```
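+
+ The contents of `build.sh` are not reproduced in this guide; conceptually it has to build one image from the root `Dockerfile` (the Rasa main service) and one from `classifier/Dockerfile` (the classifier backend). A minimal sketch, with the image names `rasabot` and `rasabot-classifier` chosen purely for illustration, might look like:
+
+ ```bash
+ #!/bin/sh
+ set -e
+
+ # Rasa main service image, built from the root Dockerfile (Python 3.8 base)
+ docker build -t rasabot .
+
+ # Classifier backend image, built from classifier/Dockerfile (Python 3.10 base)
+ docker build -t rasabot-classifier ./classifier
+ ```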
+
+ ---
+
+ ## Run Services
+
+ ### Run the Classifier Service
+
+ Execute the following command:
+
+ ```bash
+ sh run_classifier.sh
+ ```
+
+ This will start the classifier backend service.
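+
+ The script itself is the source of truth, but based on the ports listed under Troubleshooting, the classifier backend is assumed here to listen on port `8000`. A minimal sketch of `run_classifier.sh` (image and container names carried over from the build sketch above) could be:
+
+ ```bash
+ #!/bin/sh
+ # Start the classifier backend and publish its assumed port 8000
+ docker run -it --rm -p 8000:8000 --name rasabot-classifier rasabot-classifier
+ ```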
+
+ ### Run the Rasa Main Service
+
+ Execute the following command in a separate terminal:
+
+ ```bash
+ sh run_rasa.sh
+ ```
+
+ This will start the Rasa server along with its Gradio frontend.
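+
+ As with the classifier, the published ports here are assumptions taken from the Troubleshooting section: `7860` for the Gradio UI and `5005` for the Rasa server. A minimal sketch of `run_rasa.sh` along those lines:
+
+ ```bash
+ #!/bin/sh
+ # Start the Rasa main service: Gradio UI on 7860, Rasa HTTP API on 5005
+ docker run -it --rm -p 7860:7860 -p 5005:5005 --name rasabot rasabot
+ ```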
+
+ ---
+
+ ## Stop the Services
+
+ To stop the running services, use:
+
+ ```bash
+ docker-compose down
+ ```
+
+ or press `Ctrl+C` in each terminal running a service.
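+
+ Note that `docker-compose down` only tears down containers that were started through Docker Compose, and no `docker-compose.yml` appears in the directory structure above; if the run scripts use plain `docker run`, stop the containers with `Ctrl+C` or `docker stop <container_name>` instead. For reference, a minimal compose file for these two services might look like the sketch below, where service names, build contexts, and ports are assumptions based on the directory structure and the Troubleshooting section:
+
+ ```yaml
+ # docker-compose.yml (illustrative sketch only)
+ version: "3.8"
+ services:
+   classifier:
+     build: ./classifier
+     ports:
+       - "8000:8000"
+   rasa:
+     build: .
+     ports:
+       - "7860:7860"
+       - "5005:5005"
+     depends_on:
+       - classifier
+ ```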
+
+ ---
+
+ ## Access the Chat UI
+
+ Once both services are running, open your browser and navigate to:
+
+ ```
+ http://localhost:7860
+ ```
+
+ You can now interact with your RasaBot via the Gradio UI.
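+
+ If the page loads but replies never arrive, it can help to query the Rasa server directly and bypass Gradio. Assuming the Rasa HTTP server is exposed on port `5005` and the REST channel is enabled, the standard REST webhook can be exercised like this:
+
+ ```bash
+ # Send a single message to the Rasa REST channel and print the bot's JSON reply
+ curl -s -X POST http://localhost:5005/webhooks/rest/webhook \
+   -H "Content-Type: application/json" \
+   -d '{"sender": "smoke-test", "message": "hello"}'
+ ```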
+
+ ---
+
+ ## Troubleshooting
+
+ * Ensure there are no port conflicts on `7860`, `8000`, or `5005` (a quick check is sketched below).
+ * Verify the environment variables in `envs.sh`.
+ * Check the Docker logs if you run into issues:
+
+ ```bash
+ docker logs <container_name>
+ ```
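+
+ For the first point, a quick way to see whether something else is already bound to those ports (the exact tooling varies by OS, so treat these commands as examples):
+
+ ```bash
+ # List any process already listening on the ports this setup expects
+ for p in 7860 8000 5005; do lsof -i ":$p"; done
+
+ # Show which ports the running containers are actually publishing
+ docker ps --format '{{.Names}}\t{{.Ports}}'
+ ```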
+
+ ---
+