Rishabh IO committed

Commit 5f1f14e · 1 Parent(s): 6595181

First deploy to HF Spaces

Files changed (8):
  1. Dockerfile +22 -0
  2. LICENSE +21 -0
  3. README.md +193 -11
  4. main.py +47 -0
  5. model.joblib +0 -0
  6. model.py +25 -0
  7. requirements.txt +17 -0
  8. static/index.html +115 -0
Dockerfile ADDED
@@ -0,0 +1,22 @@
+
+
+ # Use Python base image
+ FROM --platform=linux/amd64 python:latest
+
+ # Set working directory in the container
+ WORKDIR /app
+
+ # Copy requirements.txt to container
+ COPY requirements.txt .
+
+ # Install dependencies
+ RUN pip install --no-cache-dir -r requirements.txt
+
+ # Copy the FastAPI app files to the container
+ COPY . /app
+
+ # Expose port 7860 for the FastAPI app
+ EXPOSE 7860
+
+ # Command to start the FastAPI app
+ CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "7860"]
LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2024 rishabh.io
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
README.md CHANGED
@@ -1,11 +1,193 @@
- ---
- title: IrisModel
- emoji: 📊
- colorFrom: green
- colorTo: blue
- sdk: docker
- pinned: false
- license: mit
- ---
-
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # packaging-ml-model
+ Learn how to package a machine learning model into a container.
+
+ # Steps to use the repository
+
+ 1. Clone the repository
+ 2. Create a virtual environment (to isolate the dependencies)
+ 3. Install the requirements with the following command:
+
+ ```
+ pip install -r requirements.txt
+ ```
+
+ # Build the model file
+ 1. Execute the following command to build the model:
+
+ ```
+ python model.py
+ ```
+
+ - This will train the model and serialize it into a file called
+ model.joblib, which is what we'll load into memory when we
+ build our inference API with FastAPI.
+
+ # Build a FastAPI-based app
+
+ - The source code for this is available in the main.py file
+ - You can check whether it's working by executing the following
+ command:
+
+ ```
+ uvicorn main:app --reload
+ ```
+
+ # Generate a Dockerfile
+
+ Create a Dockerfile in the same directory as the app and add the
+ following contents to it:
+
+ ```
+ # Use Python base image
+ FROM python:3.9-slim
+
+ # Set working directory in the container
+ WORKDIR /app
+
+ # Copy requirements.txt to container
+ COPY requirements.txt .
+
+ # Install dependencies
+ RUN pip install --no-cache-dir -r requirements.txt
+
+ # Copy the FastAPI app files to the container
+ COPY . /app
+
+ # Expose port 80 for the FastAPI app
+ EXPOSE 80
+
+ # Command to start the FastAPI app
+ CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]
+ ```
+
+ # Build the Docker image from the instructions given in the Dockerfile
+
+ ```
+ docker build -t packaged-model-001 .
+ ```
+
+ # Run a container from the image
+ ```
+ docker run -p 8000:80 packaged-model-001
+ ```
+
+ # Verify whether the container is running
+
+ - Use Postman to call the POST endpoint at localhost:8000/predict with a JSON body like:
+
+ ```
+ {
+   "sepal_length": 2.0,
+   "sepal_width": 3.0,
+   "petal_length": 4.0,
+   "petal_width": 1.5
+ }
+ ```
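
If you'd rather script this check than use Postman, the request can be sent with Python's standard library. This is a minimal sketch, not part of the repo: the helper names (`build_payload`, `predict`) and the localhost URL are illustrative, and it assumes the container from the step above is running.

```python
import json
import urllib.request

# Field names must match the Pydantic model served by the app
FIELDS = ["sepal_length", "sepal_width", "petal_length", "petal_width"]

def build_payload(sepal_length, sepal_width, petal_length, petal_width):
    """Serialize the four measurements into the JSON body /predict expects."""
    values = [sepal_length, sepal_width, petal_length, petal_width]
    return json.dumps(dict(zip(FIELDS, values))).encode("utf-8")

def predict(url, **measurements):
    """POST the measurements to a running container and return the parsed response."""
    req = urllib.request.Request(
        url,
        data=build_payload(**measurements),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (needs the container running):
# predict("http://localhost:8000/predict",
#         sepal_length=2.0, sepal_width=3.0, petal_length=4.0, petal_width=1.5)
```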
+
+ # Push the image to a Docker registry
+
+ 1. Log in to Docker:
+
+ ```
+ docker login
+ ```
+
+ - The above command should show text like the following:
+
+ ```
+ Log in with your Docker ID or email address to push and pull images from Docker Hub. If you don't have a Docker ID, head over to https://hub.docker.com/ to create one.
+ You can log in with your password or a Personal Access Token (PAT). Using a limited-scope PAT grants better security and is required for organizations using SSO. Learn more at https://docs.docker.com/go/access-tokens/
+
+ Username:
+ ```
+
+ - Use the PAT as the password.
+ - Note: you can also use your password, but a PAT with minimal scope is recommended.
+
+ # Create a repository on DockerHub
+
+ 1. Go to DockerHub
+ 2. Create an account if you don't already have one
+ 3. Create a new repository
+
132
+
133
+ # Tag your local image
134
+
135
+ ```
136
+
137
+ docker tag packaged-model-001:latest riio/packaged-model-001:latest
138
+
139
+ ```
140
+
141
+
142
+ ## Push the image to DockerHub
143
+
144
+ ```
145
+ docker push riio/packaged-model-001:latest
146
+ ```
147
+
148
+
149
+ ## Sample Output:
150
+
151
+ ```
152
+
153
+ (venv) username@machine packaging-ml-model % docker push riio/packaged-model-001:latest
154
+ The push refers to repository [docker.io/riio/packaged-model-001]
155
+ fd749012a9d2: Pushed
156
+ 963141bae3f4: Pushing 253.5MB
157
+ 4f83a3ffc58c: Pushed
158
+ 7b34bc82ecfd: Pushed
159
+ b958f60e4e67: Pushed
160
+ f02ce41627b1: Pushed
161
+ eeac00a5e55e: Pushed
162
+ 34e7752745be: Pushed
163
+ 8560597d922c: Pushing 100.2MB
164
+ ```
165
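
The tag step above follows DockerHub's `<username>/<repository>:<tag>` naming convention. A tiny sketch (the `docker_ref` helper is hypothetical, for illustration only) that assembles such a reference:

```python
def docker_ref(username, repository, tag="latest"):
    """Build a DockerHub image reference in <username>/<repository>:<tag> form."""
    return f"{username}/{repository}:{tag}"

print(docker_ref("riio", "packaged-model-001"))  # riio/packaged-model-001:latest
```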
+
+ ## Congratulations!
+
+ - You just packaged your machine learning model and made it available to the world with the power of containers.
+
+ ## How can anybody use your packaged model?
+
+ It's simple:
+
+ ```
+ docker pull riio/packaged-model-001:latest
+ ```
+
+ ## How to run the container?
+
+ ```
+ docker run -p 8000:80 riio/packaged-model-001:latest
+ ```
+
+ # Link to access the container:
+
+ - https://model-deployment-005.purplecliff-0cc0d310.centralindia.azurecontainerapps.io/predict
main.py ADDED
@@ -0,0 +1,47 @@
+
+ # main.py
+
+ from fastapi import FastAPI
+ from pydantic import BaseModel
+ from joblib import load
+ import numpy as np
+ from fastapi.responses import HTMLResponse
+
+ # Define FastAPI app
+ app = FastAPI()
+
+ # Load the trained model
+ model = load("model.joblib")
+
+ # Define request body schema using Pydantic BaseModel
+ class Item(BaseModel):
+     sepal_length: float
+     sepal_width: float
+     petal_length: float
+     petal_width: float
+
+ # Define endpoint to make predictions
+ @app.post("/predict")
+ async def predict(item: Item):
+     # Convert input to array
+     input_data = [item.sepal_length, item.sepal_width, item.petal_length, item.petal_width]
+     input_array = np.array([input_data])
+
+     # Make prediction
+     prediction = model.predict(input_array)[0]
+
+     # Map prediction to class label
+     class_label = {0: "setosa", 1: "versicolor", 2: "virginica"}
+     predicted_class = class_label[prediction]
+
+     # Return prediction
+     return {"predicted_class": predicted_class}
+
+
+ @app.get('/app', response_class=HTMLResponse)
+ async def html():
+     # Use a context manager so the file handle is closed after reading
+     with open('static/index.html', 'r') as f:
+         return f.read()
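
The /predict endpoint depends on the order of `input_data` matching the feature order the model was trained on (scikit-learn's iris order: sepal length, sepal width, petal length, petal width). A small sketch (not part of the repo; `to_feature_row` and `label_for` are illustrative helpers) that makes that contract and the label mapping explicit:

```python
# Order must match sklearn's load_iris feature order, and main.py's input_data
FEATURE_ORDER = ("sepal_length", "sepal_width", "petal_length", "petal_width")
CLASS_LABELS = {0: "setosa", 1: "versicolor", 2: "virginica"}

def to_feature_row(payload):
    """Turn a /predict request body (a dict) into the row the classifier expects."""
    return [float(payload[name]) for name in FEATURE_ORDER]

def label_for(prediction):
    """Map a class index from model.predict() to its species name."""
    return CLASS_LABELS[int(prediction)]
```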
model.joblib ADDED
Binary file (187 kB)
 
model.py ADDED
@@ -0,0 +1,25 @@
+
+ # model.py
+
+ from sklearn.datasets import load_iris
+ from sklearn.model_selection import train_test_split
+ from sklearn.ensemble import RandomForestClassifier
+ from joblib import dump
+
+ # Load the Iris dataset
+ iris = load_iris()
+ X, y = iris.data, iris.target
+
+ # Split the dataset into training and testing sets
+ X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
+
+ # Train a Random Forest classifier
+ model = RandomForestClassifier(n_estimators=100, random_state=42)
+ model.fit(X_train, y_train)
+
+ # Evaluate the model
+ accuracy = model.score(X_test, y_test)
+ print("Model accuracy:", accuracy)
+
+ # Save the trained model as a joblib file
+ dump(model, "model.joblib")
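
With test_size=0.2 on the 150-sample Iris dataset, the split above keeps 120 samples for training and reserves 30 for evaluation. A quick sanity check of that sizing, with no scikit-learn required (`split_sizes` is an illustrative helper, not part of the repo):

```python
import math

def split_sizes(n_samples, test_size):
    """Mirror train_test_split's sizing: the test count is ceil(n_samples * test_size)."""
    n_test = math.ceil(n_samples * test_size)
    return n_samples - n_test, n_test

print(split_sizes(150, 0.2))  # (120, 30)
```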
requirements.txt ADDED
@@ -0,0 +1,17 @@
+ annotated-types==0.6.0
+ anyio==4.3.0
+ click==8.1.7
+ fastapi==0.110.3
+ h11==0.14.0
+ idna==3.7
+ joblib==1.4.0
+ numpy==1.26.4
+ pydantic==2.7.1
+ pydantic_core==2.18.2
+ scikit-learn==1.4.2
+ scipy==1.13.0
+ sniffio==1.3.1
+ starlette==0.37.2
+ threadpoolctl==3.5.0
+ typing_extensions==4.11.0
+ uvicorn==0.29.0
static/index.html ADDED
@@ -0,0 +1,115 @@
+ <!DOCTYPE html>
+ <html lang="en">
+ <head>
+     <meta charset="UTF-8">
+     <meta name="viewport" content="width=device-width, initial-scale=1.0">
+     <title>Flower Predictor</title>
+     <style>
+         body {
+             font-family: Arial, sans-serif;
+             margin: 0;
+             padding: 0;
+             display: flex;
+             justify-content: center;
+             align-items: center;
+             height: 100vh;
+             background-color: #f2f2f2;
+         }
+
+         form {
+             background-color: #fff;
+             padding: 20px;
+             border-radius: 10px;
+             box-shadow: 0 4px 8px rgba(0, 0, 0, 0.1);
+         }
+
+         label {
+             display: block;
+             margin-bottom: 10px;
+         }
+
+         input[type="text"] {
+             width: 100%;
+             padding: 10px;
+             margin-bottom: 20px;
+             border: 1px solid #ccc;
+             border-radius: 5px;
+             box-sizing: border-box;
+         }
+
+         input[type="submit"] {
+             background-color: #4CAF50;
+             color: white;
+             padding: 10px 20px;
+             border: none;
+             border-radius: 5px;
+             cursor: pointer;
+         }
+
+         input[type="submit"]:hover {
+             background-color: #45a049;
+         }
+     </style>
+ </head>
+ <body>
+     <div>
+         <h1>Flower Predictor</h1>
+         <p>Enter the flower measurements to predict the class of the flower.</p>
+         <p>The predicted class is: <span id="predicted_class"></span></p>
+     </div>
+     <form id="flowerForm">
+         <label for="petal_length">Petal Length:</label>
+         <input type="text" id="petal_length" name="petal_length" required>
+
+         <label for="sepal_length">Sepal Length:</label>
+         <input type="text" id="sepal_length" name="sepal_length" required>
+
+         <label for="petal_width">Petal Width:</label>
+         <input type="text" id="petal_width" name="petal_width" required>
+
+         <label for="sepal_width">Sepal Width:</label>
+         <input type="text" id="sepal_width" name="sepal_width" required>
+
+         <input type="submit" value="Predict">
+     </form>
+
+     <script>
+         document.getElementById("flowerForm").addEventListener("submit", function(event) {
+             event.preventDefault(); // Prevent default form submission
+
+             // Collect the form fields into a plain object for the API call
+             const formData = new FormData(event.target);
+             const requestData = {};
+             formData.forEach((value, key) => {
+                 requestData[key] = value;
+             });
+
+             // Call the prediction endpoint
+             fetch("/predict", {
+                 method: "POST",
+                 headers: {
+                     "Content-Type": "application/json"
+                 },
+                 body: JSON.stringify(requestData)
+             })
+             .then(response => {
+                 if (!response.ok) {
+                     throw new Error("Network response was not ok");
+                 }
+                 return response.json();
+             })
+             .then(data => {
+                 // Show the predicted class on the page
+                 console.log(data);
+                 document.getElementById("predicted_class").innerText = data.predicted_class;
+             })
+             .catch(error => {
+                 console.error("Error:", error);
+                 alert("An error occurred. Please try again.");
+             });
+         });
+     </script>
+ </body>
+ </html>