CSAle committed
Commit 53c31f4 · 1 Parent(s): 87af07e

Replace conda references with uv

Files changed (4)
  1. Dockerfile +24 -6
  2. README.md +18 -8
  3. pyproject.toml +14 -0
  4. requirements.txt +0 -6
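For anyone who wants to review the change locally rather than in the web diff, checking out this commit is enough. A minimal sketch (the clone URL is not shown on this page, so it is left as a placeholder; the directory name follows the README):

``` bash
# Placeholder remote: substitute the actual repository URL
git clone <repo-url> Beyond-ChatGPT
cd Beyond-ChatGPT

# Check out this exact commit (hash from the header above)
git checkout 53c31f4
```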
Dockerfile CHANGED
@@ -1,11 +1,29 @@
- FROM python:3.9
+ # Get a distribution that has uv already installed
+ FROM ghcr.io/astral-sh/uv:python3.13-bookworm-slim
+
+ # Add user - this is the user that will run the app
+ # If you do not set user, the app will run as root (undesirable)
  RUN useradd -m -u 1000 user
  USER user
+
+ # Set the home directory and path
  ENV HOME=/home/user \
- PATH=/home/user/.local/bin:$PATH
+ PATH=/home/user/.local/bin:$PATH
+
+ ENV UVICORN_WS_PROTOCOL=websockets
+
+ # Set the working directory
  WORKDIR $HOME/app
+
+ # Copy the app to the container
  COPY --chown=user . $HOME/app
- COPY ./requirements.txt ~/app/requirements.txt
- RUN pip install -r requirements.txt
- COPY . .
- CMD ["chainlit", "run", "app.py", "--port", "7860"]
+
+ # Install the dependencies
+ # RUN uv sync --frozen
+ RUN uv sync
+
+ # Expose the port
+ EXPOSE 7860
+
+ # Run the app
+ CMD ["uv", "run", "chainlit", "run", "app.py", "--host", "0.0.0.0", "--port", "7860"]
README.md CHANGED
@@ -54,9 +54,19 @@ That's it! Head to the next step and start building your application!
  cd Beyond-ChatGPT
  ```

- 3. Install the packages required for this python envirnoment in `requirements.txt`.
+ 3. Create a virtual environment and install dependencies.
  ``` bash
- pip install -r requirements.txt
+ # Create a virtual environment
+ uv venv
+
+ # Activate the virtual environment
+ # On macOS/Linux:
+ source .venv/bin/activate
+ # On Windows:
+ # .venv\Scripts\activate
+
+ # Install dependencies from pyproject.toml
+ uv sync
  ```

  4. Open your `.env` file. Replace the `###` in your `.env` file with your OpenAI Key and save the file.
@@ -64,18 +74,18 @@
  OPENAI_API_KEY=sk-###
  ```

- 5. Let's try deploying it locally. Make sure you're in the python environment where you installed Chainlit and OpenAI. Run the app using Chainlit. This may take a minute to run.
+ 5. Let's try deploying it locally. Make sure you're in the activated virtual environment. Run the app using Chainlit. This may take a minute to run.
  ``` bash
- chainlit run app.py -w
+ uv run chainlit run app.py -w
  ```

- <p align = "center" draggable=”false”>
+ <p align = "center" draggable="false">
  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/54bcccf9-12e2-4cef-ab53-585c1e2b0fb5">
  </p>

  Great work! Let's see if we can interact with our chatbot.

- <p align = "center" draggable=”false”>
+ <p align = "center" draggable="false">
  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/854e4435-1dee-438a-9146-7174b39f7c61">
  </p>

@@ -87,7 +97,7 @@ Awesome! Time to throw it into a docker container and prepare it for shipping!
  <details>
  <summary>🐳 Containerizing our App</summary>

- 1. Let's build the Docker image. We'll tag our image as `llm-app` using the `-t` parameter. The `.` at the end means we want all of the files in our current directory to be added to our image.
+ 1. Let's build the Docker image. We'll tag our image as `llm-app` using the `-t` parameter. The `.` at the end means we want all of the files in our current directory to be added to our image. Note that our Dockerfile is set up to use uv for dependency management and will install all the packages defined in our pyproject.toml file.

  ``` bash
  docker build -t llm-app .
@@ -101,7 +111,7 @@

  3. Visit http://localhost:7860 in your browser to see if the app runs correctly.

- <p align = "center" draggable=”false”>
+ <p align = "center" draggable="false">
  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/2c764f25-09a0-431b-8d28-32246e0ca1b7">
  </p>

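One small aside on the README steps above: because `uv run` executes commands inside the project environment, steps 3 and 5 also work without activating `.venv` by hand. A sketch of that equivalent shortcut (not part of the commit):

``` bash
# uv sync creates .venv if it does not exist and installs the dependencies from pyproject.toml
uv sync

# uv run launches Chainlit inside that environment, so no `source .venv/bin/activate` is needed
uv run chainlit run app.py -w
```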
pyproject.toml ADDED
@@ -0,0 +1,14 @@
+ [project]
+ name = "beyond-chatgpt"
+ version = "0.1.0"
+ description = "Add your description here"
+ readme = "README.md"
+ requires-python = ">=3.13"
+ dependencies = [
+     "chainlit==0.7.700",
+     "cohere==4.37",
+     "openai==1.3.5",
+     "pydantic==2.10.1",
+     "python-dotenv==1.0.0",
+     "tiktoken==0.5.1",
+ ]
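The pins above are carried over one-for-one from the deleted `requirements.txt`. If reproducible Docker builds are wanted later, the usual uv follow-up would be to commit a lockfile; a hedged sketch of that step (standard uv commands, not included in this commit):

``` bash
# Resolve the pinned dependencies in pyproject.toml into uv.lock, then commit that file
uv lock

# With uv.lock present, the stricter install hinted at in the Dockerfile becomes usable
uv sync --frozen
```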
requirements.txt DELETED
@@ -1,6 +0,0 @@
- chainlit==0.7.700
- cohere==4.37
- openai==1.3.5
- tiktoken==0.5.1
- python-dotenv==1.0.0
- pydantic==2.10.1
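For reference, the migration this commit performs by hand, moving the pins from `requirements.txt` into `[project].dependencies`, could also be scripted with uv. A sketch, assuming a pyproject.toml already exists and a uv release whose `uv add` accepts a requirements file (check `uv add --help` before relying on it):

``` bash
# Import the existing pins into pyproject.toml (and the lockfile)
uv add -r requirements.txt

# The old file is then redundant
git rm requirements.txt
```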