Sukanth07 committed on
Commit 8f6d05a · 1 Parent(s): c5c7dc8

"Updated README.md with project description, structure, setup instructions, usage, core functionality, and other details."

.gitignore ADDED
@@ -0,0 +1,6 @@
+ .env
+ /venv
+ .gradio
+ **/__pycache__/
+ app/__pycache__/llm.cpython-310.pyc
+ logs/application.log
README.md CHANGED
@@ -1,13 +1,166 @@
  ---
- title: Lumpy Skin Disease Pred
- emoji: 📈
- colorFrom: red
- colorTo: indigo
- sdk: gradio
- sdk_version: 5.5.0
- app_file: app.py
- pinned: false
- license: mit
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # Lumpy Skin Disease Prediction System
+
+ This project is an AI-based diagnostic system for detecting Lumpy Skin Disease in animals using Machine Learning (ML), Convolutional Neural Networks (CNN), and Large Language Model (LLM) integration. Users upload an image and provide climate and location features; the ML and CNN models analyze the data, and an LLM generates a detailed diagnostic report.
+
+ ---
+
+ ## Project Screenshot
+
+ ![Project Screenshot](screenshots/Screenshot(69).png)
+
+ ---
+
+ ## Project Structure
+
+ - **app.py** - Main Gradio app script for the user interface (repository root).
+ - **src/predict.py** - Contains the prediction logic and interacts with both the ML and CNN models.
+ - **src/llm.py** - Uses the LLM to generate detailed diagnostic reports.
+ - **src/config.py** - Configuration file with paths and environment variables.
+ - **src/utils.py** - Contains other utility functions and shared properties.
+ - **src/exception.py** - Custom exception handling for various error scenarios.
+ - **src/logger.py** - Configures logging across the application.
+
+ ---
+
+ ## Setup Instructions
+
+ ### Prerequisites
+ - Python 3.10
+ - Virtual environment (recommended)
+ - `pip` package manager
+
+ ### Install Dependencies
+
+ 1. **Clone the Repository**:
+    ```bash
+    git clone https://github.com/Sukanth07/lumpy-skin-disease-prediction.git
+    cd lumpy-skin-disease-prediction
+    ```
+
+ 2. **Set Up Virtual Environment**:
+    ```bash
+    python -m venv venv
+    source venv/bin/activate    # For Linux
+    .\venv\Scripts\activate     # For Windows
+    ```
+
+ 3. **Install Required Packages**:
+    ```bash
+    pip install -r requirements.txt
+    ```
+
+ 4. **Protobuf Builder File Config**:
+    Because of a protobuf dependency issue, manually copy the patched `builder.py` into the installed library path.
+    On a Linux-based OS, this is handled programmatically at the start of `app.py`.
+    ```bash
+    cp ./builder.py ./venv/lib/python3.10/site-packages/google/protobuf/internal/    # For Linux
+    copy .\builder.py .\venv\Lib\site-packages\google\protobuf\internal\             # For Windows
+    ```
+
+ 5. **Environment Variables**:
+    Create a `.env` file with your API keys and other sensitive data:
+    ```
+    GEMINI_API_KEY=your-gemini-api-key
+    ```
+    You can get a free Gemini API key here: https://aistudio.google.com/app/apikey
+
+ ---
+
+ ## Usage
+
+ ### Running the Gradio App
+ Start the Gradio app with:
+ ```bash
+ python app.py
+ ```
+
+ To run the app with automatic reloading on code changes, use:
+ ```bash
+ gradio app.py
+ ```
+
+ The app will launch in your default browser. You can upload images and provide additional input data to get a comprehensive diagnostic report.
+
+ ---
+
+ ## Core Functionality
+ - **Machine Learning Prediction** - Predicts the presence of Lumpy Skin Disease from structured climate and location inputs.
+ - **CNN Image Analysis** - Classifies uploaded images as 'Lumpy' or 'Not Lumpy'.
+ - **LLM Diagnostic Report** - Uses a language model to generate a detailed report from the model predictions and input data. A sketch of how these pieces fit together follows this list.
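+
+ For illustration, a minimal sketch of the pipeline (the `prediction` function and its parameters come from `src/predict.py`; the image path and feature values are hypothetical):
+
+ ```python
+ from PIL import Image
+ from src.predict import prediction
+
+ # Hypothetical sample inputs: an image plus the ten structured features.
+ image = Image.open("samples/cow.jpg")  # placeholder path
+ report = prediction(
+     image,
+     longitude=78.5, latitude=11.2, cloud_cover=55.0,
+     evapotranspiration=120.0, precipitation=80.0,
+     min_temp=21.0, mean_temp=27.0, max_temp=33.0,
+     vapour_pressure=24.0, wet_day_freq=9.0,
+ )
+ print(report)  # Markdown diagnostic report generated by the LLM
+ ```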
+
  ---
+
+ ## Project Components
+
+ 1. **app.py**
+    - Main entry point for the Gradio UI.
+    - Displays input fields, image upload options, and prediction results.
+
+ 2. **predict.py**
+    - Implements the `DataPreprocessor`, `ML_Model_Predictor`, and `CNN_Model_Predictor` classes.
+    - Orchestrates data preprocessing and model prediction, and combines the outputs for LLM report generation.
+
+ 3. **llm.py**
+    - Configures the LLM and generates predictions with it.
+    - Formats the results into a user-friendly diagnostic report.
+
+ 4. **logger.py**
+    - Sets up logging configuration for the project.
+    - Stores logs in `logs/application.log` with log levels and timestamps.
+
+ 5. **exception.py**
+    - Contains custom exceptions for model loading, preprocessing, and inference.
+    - Includes a utility function `log_exception` to standardize error logging.
+
  ---

+ ## Exception Handling and Logging
+
+ - **Logging**: All logs are recorded in `logs/application.log` with timestamps and log levels.
+
+ - **Custom Exceptions**: The project uses `exception.py` to handle specific errors (see the usage sketch after this list):
+   - `ModelLoadingError`: Raised when a model fails to load.
+   - `PreprocessingError`: Raised when data preprocessing fails.
+   - `PredictionError`: Raised for issues in the ML or CNN prediction processes.
+   - `APIError`: Raised when the LLM API encounters an error.
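+
+ For illustration, a minimal sketch of the error-handling pattern used throughout `src/` (the names come from `src/exception.py` and `src/predict.py`; `load_scaler` is a hypothetical helper):
+
+ ```python
+ import joblib
+ from src.exception import log_exception, ModelLoadingError
+
+ def load_scaler(path):
+     try:
+         return joblib.load(path)  # fails if the file is missing or corrupt
+     except Exception as e:
+         # Log the full traceback with context, then re-raise as a domain error.
+         log_exception(e, "Error loading scaler in DataPreprocessor.")
+         raise ModelLoadingError("Could not load the scaler for data preprocessing.") from e
+ ```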
+
+ ---
+
+ ## Commands
+
+ - **Run App**:
+   ```bash
+   python app.py
+   ```
+
+ - **Install Packages**:
+   ```bash
+   pip install -r requirements.txt
+   ```
+
+ - **Deactivate Virtual Environment**:
+   ```bash
+   deactivate
+   ```
+
+ ---
+
+ ## Future Improvements
+
+ - **Model Optimization**: Further enhance the ML and CNN models for faster predictions.
+
+ - **LLM Fine-tuning**: Use domain-specific data to improve the LLM's understanding of Lumpy Skin Disease.
+
+ - **Expanded Input**: Integrate more diverse data inputs for robust analysis.
+
+ ---
+
+ ## License
+
+ This project is licensed under the MIT License. See `LICENSE` for more details.
+
+ ---
+
+ ## Contributors
+
+ - **Sukanth K** - [github.com/Sukanth07](https://github.com/Sukanth07)
app.py ADDED
@@ -0,0 +1,76 @@
+ import gradio as gr
+ import warnings
+ import src.utils as utils
+ from src.predict import prediction
+ from src.logger import get_logger
+
+ logger = get_logger(__name__)
+ warnings.filterwarnings("ignore")
+
+ # Patch the protobuf builder file before any model code is loaded (see README step 4).
+ utils.copy_builder()
+
+ def show_processing_text():
+     # Reveal the "processing" placeholder and hide any previous report.
+     return gr.update(visible=True), gr.update(visible=False)
+
+ def prediction_with_loading(image, longitude, latitude, cloud_cover, evapotranspiration, precipitation, min_temp, mean_temp, max_temp, vapour_pressure, wet_day_freq):
+     try:
+         logger.info("Starting prediction process...")
+         response = prediction(
+             image, longitude, latitude, cloud_cover, evapotranspiration,
+             precipitation, min_temp, mean_temp, max_temp, vapour_pressure, wet_day_freq
+         )
+         logger.info("Prediction completed successfully.")
+         return gr.update(value=response, visible=True), gr.update(visible=False)
+     except Exception as e:
+         logger.error(f"Error in prediction: {str(e)}")
+         # Make the output component visible so the error message is actually shown.
+         return gr.update(value="An error occurred during prediction. Please try again.", visible=True), gr.update(visible=False)
+
+ with gr.Blocks(css=utils.css, theme=gr.themes.Ocean(primary_hue=gr.themes.colors.red, secondary_hue=gr.themes.colors.pink)) as demo:
+     gr.Markdown("<div class='title'>LUMPY SKIN DISEASE PREDICTION</div>")
+
+     with gr.Row():
+         with gr.Column(scale=5):
+             # Image upload
+             image_input = gr.Image(type="pil", label="Upload Image", height=177)
+
+         with gr.Column(scale=5):
+             longitude = gr.Number(label="Longitude")
+             latitude = gr.Number(label="Latitude")
+
+     with gr.Row():
+         with gr.Column(scale=5):
+             cloud_cover = gr.Number(label="Monthly Cloud Cover")
+             evapotranspiration = gr.Number(label="Potential EvapoTranspiration")
+
+         with gr.Column(scale=5):
+             precipitation = gr.Number(label="Precipitation")
+             min_temp = gr.Number(label="Minimum Temperature")
+
+         with gr.Column(scale=5):
+             mean_temp = gr.Number(label="Mean Temperature")
+             max_temp = gr.Number(label="Maximum Temperature")
+
+         with gr.Column(scale=5):
+             vapour_pressure = gr.Number(label="Vapour Pressure")
+             wet_day_freq = gr.Number(label="Wet Day Frequency")
+
+     with gr.Row():
+         predict_button = gr.Button("Predict", variant="primary")
+
+     processing_text = gr.Markdown("", visible=False, height=100)
+     output_text = gr.Markdown(label="LLM Generated Diagnostic Report", container=True, show_copy_button=True, visible=False)
+
+     # First click handler: immediately show the processing placeholder (unqueued).
+     predict_button.click(
+         fn=show_processing_text,
+         inputs=[],
+         outputs=[processing_text, output_text],
+         queue=False
+     )
+     # Second click handler: run the actual prediction and swap in the report.
+     predict_button.click(
+         fn=prediction_with_loading,
+         inputs=[image_input, longitude, latitude, cloud_cover, evapotranspiration, precipitation, min_temp, mean_temp, max_temp, vapour_pressure, wet_day_freq],
+         outputs=[output_text, processing_text],
+         queue=True
+     )
+
+ demo.launch(share=True)
builder.py ADDED
@@ -0,0 +1,268 @@
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+
+ # Protocol Buffers - Google's data interchange format
+ # Copyright 2008 Google Inc. All rights reserved.
+ # https://developers.google.com/protocol-buffers/
+ #
+ # Redistribution and use in source and binary forms, with or without
+ # modification, are permitted provided that the following conditions are
+ # met:
+ #
+ #     * Redistributions of source code must retain the above copyright
+ # notice, this list of conditions and the following disclaimer.
+ #     * Redistributions in binary form must reproduce the above
+ # copyright notice, this list of conditions and the following disclaimer
+ # in the documentation and/or other materials provided with the
+ # distribution.
+ #     * Neither the name of Google Inc. nor the names of its
+ # contributors may be used to endorse or promote products derived from
+ # this software without specific prior written permission.
+ #
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+ """Builds descriptors, message classes and services for generated _pb2.py.
+
+ This file is only called in python generated _pb2.py files. It builds
+ descriptors, message classes and services that users can directly use
+ in generated code.
+ """
+
+ """Dynamic Protobuf class creator."""
+
+ __author__ = '[email protected] (Jie Luo)'
+
+
+ from collections import OrderedDict
+ import hashlib
+ import os
+
+ from google.protobuf import descriptor_pb2
+ from google.protobuf import descriptor
+ from google.protobuf import message_factory
+
+
+ from google.protobuf.internal import enum_type_wrapper
+ from google.protobuf import message as _message
+ from google.protobuf import reflection as _reflection
+ from google.protobuf import symbol_database as _symbol_database
+
+ _sym_db = _symbol_database.Default()
+
+
+ def BuildMessageAndEnumDescriptors(file_des, module):
+   """Builds message and enum descriptors.
+
+   Args:
+     file_des: FileDescriptor of the .proto file
+     module: Generated _pb2 module
+   """
+
+   def BuildNestedDescriptors(msg_des, prefix):
+     for (name, nested_msg) in msg_des.nested_types_by_name.items():
+       module_name = prefix + name.upper()
+       module[module_name] = nested_msg
+       BuildNestedDescriptors(nested_msg, module_name + '_')
+     for enum_des in msg_des.enum_types:
+       module[prefix + enum_des.name.upper()] = enum_des
+
+   for (name, msg_des) in file_des.message_types_by_name.items():
+     module_name = '_' + name.upper()
+     module[module_name] = msg_des
+     BuildNestedDescriptors(msg_des, module_name + '_')
+
+
+ def BuildTopDescriptorsAndMessages(file_des, module_name, module):
+   """Builds top level descriptors and message classes.
+
+   Args:
+     file_des: FileDescriptor of the .proto file
+     module_name: str, the name of generated _pb2 module
+     module: Generated _pb2 module
+   """
+
+   def BuildMessage(msg_des):
+     create_dict = {}
+     for (name, nested_msg) in msg_des.nested_types_by_name.items():
+       create_dict[name] = BuildMessage(nested_msg)
+     create_dict['DESCRIPTOR'] = msg_des
+     create_dict['__module__'] = module_name
+     message_class = _reflection.GeneratedProtocolMessageType(
+         msg_des.name, (_message.Message,), create_dict)
+     _sym_db.RegisterMessage(message_class)
+     return message_class
+
+   # top level enums
+   for (name, enum_des) in file_des.enum_types_by_name.items():
+     module['_' + name.upper()] = enum_des
+     module[name] = enum_type_wrapper.EnumTypeWrapper(enum_des)
+     for enum_value in enum_des.values:
+       module[enum_value.name] = enum_value.number
+
+   # top level extensions
+   for (name, extension_des) in file_des.extensions_by_name.items():
+     module[name.upper() + '_FIELD_NUMBER'] = extension_des.number
+     module[name] = extension_des
+
+   # services
+   for (name, service) in file_des.services_by_name.items():
+     module['_' + name.upper()] = service
+
+   # Build messages.
+   for (name, msg_des) in file_des.message_types_by_name.items():
+     module[name] = BuildMessage(msg_des)
+
+
+ def BuildServices(file_des, module_name, module):
+   """Builds services classes and services stub class.
+
+   Args:
+     file_des: FileDescriptor of the .proto file
+     module_name: str, the name of generated _pb2 module
+     module: Generated _pb2 module
+   """
+   # pylint: disable=g-import-not-at-top
+   from google.protobuf import service as _service
+   from google.protobuf import service_reflection
+   # pylint: enable=g-import-not-at-top
+   for (name, service) in file_des.services_by_name.items():
+     module[name] = service_reflection.GeneratedServiceType(
+         name, (_service.Service,),
+         dict(DESCRIPTOR=service, __module__=module_name))
+     stub_name = name + '_Stub'
+     module[stub_name] = service_reflection.GeneratedServiceStubType(
+         stub_name, (module[name],),
+         dict(DESCRIPTOR=service, __module__=module_name))
+
+
+ def _GetMessageFromFactory(factory, full_name):
+   """Get a proto class from the MessageFactory by name.
+
+   Args:
+     factory: a MessageFactory instance.
+     full_name: str, the fully qualified name of the proto type.
+   Returns:
+     A class, for the type identified by full_name.
+   Raises:
+     KeyError, if the proto is not found in the factory's descriptor pool.
+   """
+   proto_descriptor = factory.pool.FindMessageTypeByName(full_name)
+   proto_cls = factory.GetPrototype(proto_descriptor)
+   return proto_cls
+
+
+ def MakeSimpleProtoClass(fields, full_name=None, pool=None):
+   """Create a Protobuf class whose fields are basic types.
+
+   Note: this doesn't validate field names!
+
+   Args:
+     fields: dict of {name: field_type} mappings for each field in the proto. If
+         this is an OrderedDict the order will be maintained, otherwise the
+         fields will be sorted by name.
+     full_name: optional str, the fully-qualified name of the proto type.
+     pool: optional DescriptorPool instance.
+   Returns:
+     a class, the new protobuf class with a FileDescriptor.
+   """
+   factory = message_factory.MessageFactory(pool=pool)
+
+   if full_name is not None:
+     try:
+       proto_cls = _GetMessageFromFactory(factory, full_name)
+       return proto_cls
+     except KeyError:
+       # The factory's DescriptorPool doesn't know about this class yet.
+       pass
+
+   # Get a list of (name, field_type) tuples from the fields dict. If fields was
+   # an OrderedDict we keep the order, but otherwise we sort the field to ensure
+   # consistent ordering.
+   field_items = fields.items()
+   if not isinstance(fields, OrderedDict):
+     field_items = sorted(field_items)
+
+   # Use a consistent file name that is unlikely to conflict with any imported
+   # proto files.
+   fields_hash = hashlib.sha1()
+   for f_name, f_type in field_items:
+     fields_hash.update(f_name.encode('utf-8'))
+     fields_hash.update(str(f_type).encode('utf-8'))
+   proto_file_name = fields_hash.hexdigest() + '.proto'
+
+   # If the proto is anonymous, use the same hash to name it.
+   if full_name is None:
+     full_name = ('net.proto2.python.public.proto_builder.AnonymousProto_' +
+                  fields_hash.hexdigest())
+     try:
+       proto_cls = _GetMessageFromFactory(factory, full_name)
+       return proto_cls
+     except KeyError:
+       # The factory's DescriptorPool doesn't know about this class yet.
+       pass
+
+   # This is the first time we see this proto: add a new descriptor to the pool.
+   factory.pool.Add(
+       _MakeFileDescriptorProto(proto_file_name, full_name, field_items))
+   return _GetMessageFromFactory(factory, full_name)
+
+
+ def _MakeFileDescriptorProto(proto_file_name, full_name, field_items):
+   """Populate FileDescriptorProto for MessageFactory's DescriptorPool."""
+   package, name = full_name.rsplit('.', 1)
+   file_proto = descriptor_pb2.FileDescriptorProto()
+   file_proto.name = os.path.join(package.replace('.', '/'), proto_file_name)
+   file_proto.package = package
+   desc_proto = file_proto.message_type.add()
+   desc_proto.name = name
+   for f_number, (f_name, f_type) in enumerate(field_items, 1):
+     field_proto = desc_proto.field.add()
+     field_proto.name = f_name
+     # If the number falls in the reserved range, reassign it to the correct
+     # number after the range.
+     if f_number >= descriptor.FieldDescriptor.FIRST_RESERVED_FIELD_NUMBER:
+       f_number += (
+           descriptor.FieldDescriptor.LAST_RESERVED_FIELD_NUMBER -
+           descriptor.FieldDescriptor.FIRST_RESERVED_FIELD_NUMBER + 1)
+     field_proto.number = f_number
+     field_proto.label = descriptor_pb2.FieldDescriptorProto.LABEL_OPTIONAL
+     field_proto.type = f_type
+   return file_proto
final_models/mobilenet_lumpy_skin_model.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:be6e05f78a51401277f181f787deb16c9eeff9834342f71b090fc368c6782385
+ size 14868376
final_models/randomforest_best_model.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cf85c7fc7d48a6a2362baaef7b118a3f56555dbaa7a2bae48f1afca942cef855
+ size 9915513
final_models/scaler_object.joblib ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2e7b94e6c71cb971a521dc7e44c4c9ba64a5307c35a4588da5a9515ad6c1c48e
+ size 1383
requirements.txt ADDED
@@ -0,0 +1,14 @@
+ fastapi
+ google-generativeai
+ gradio
+ huggingface_hub[cli]
+ matplotlib
+ numpy==1.26.4
+ pandas
+ python-dotenv
+ protobuf==3.19.6
+ scikit-learn==1.5.2
+ scipy
+ seaborn
+ tensorflow==2.10.0
+ uvicorn
src/config.py ADDED
@@ -0,0 +1,13 @@
+ import os
+ from dotenv import load_dotenv
+
+ # Load environment variables from the .env file (e.g., GEMINI_API_KEY).
+ load_dotenv()
+
+ BASE_DIR = os.path.abspath(os.getcwd())
+
+ MODELS_DIR = os.path.join(BASE_DIR, 'final_models')
+
+ GEMINI_API_KEY = os.getenv('GEMINI_API_KEY')
+
+ GEMINI_MODEL_NAME = 'gemini-1.5-flash-latest'
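For illustration, a minimal sketch of consuming these settings from another module (`src/llm.py` and `src/predict.py` do this with `from .config import *`):

```python
from src.config import GEMINI_API_KEY, MODELS_DIR

# Paths resolve relative to the current working directory at import time,
# so run the app from the repository root.
print(MODELS_DIR)              # <repo>/final_models
print(GEMINI_API_KEY is None)  # False once the .env file is in place
```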
src/exception.py ADDED
@@ -0,0 +1,30 @@
+ import traceback
+ from .logger import get_logger
+
+ logger = get_logger(__name__)
+
+ # Define custom exceptions for different parts of the application
+ class ModelLoadingError(Exception):
+     """Raised when a model fails to load properly."""
+
+ class PreprocessingError(Exception):
+     """Raised when an error occurs during data preprocessing."""
+
+ class PredictionError(Exception):
+     """Raised when there is an issue with generating predictions from the models."""
+
+ class APIError(Exception):
+     """Raised when there is an issue with the API call to the LLM or any external service."""
+
+
+ def log_exception(e: Exception, custom_message: str = ""):
+     """
+     Logs detailed information about an exception, including traceback.
+
+     Args:
+         e (Exception): The exception to log.
+         custom_message (str): Optional custom message to provide additional context in the log.
+     """
+     exc_type, exc_value, exc_traceback = e.__class__, e, e.__traceback__
+     trace_details = "".join(traceback.format_exception(exc_type, exc_value, exc_traceback))
+     logger.error(f"{custom_message}\nException type: {exc_type}\nTraceback:\n{trace_details}")
src/llm.py ADDED
@@ -0,0 +1,62 @@
+ import google.generativeai as genai
+ from .exception import log_exception, APIError
+ from .config import *
+ from .logger import get_logger
+
+ logger = get_logger(__name__)
+
+ class LLM:
+     def __init__(self):
+         try:
+             # Configure and initialize the Generative AI model
+             genai.configure(api_key=GEMINI_API_KEY)
+             self.model = genai.GenerativeModel(GEMINI_MODEL_NAME)
+             logger.info("LLM model initialized successfully.")
+         except Exception as e:
+             log_exception(e, "Failed to initialize the LLM model in LLM class.")
+             raise APIError("Error initializing LLM model. Check API key and model name.") from e
+
+     def prompt_template(self, result):
+         # Generates a structured prompt for the LLM based on the provided data
+         prompt = f"""You are a medical assistant with expertise in diagnosing and explaining Lumpy Skin Disease in animals. Based on the provided data, generate a detailed report covering the following sections:
+
+ - Prediction Output: State whether the case is classified as 'Lumpy' or 'Not Lumpy'.
+ - Key Observations: Summarize important symptoms and indicators.
+ - Cause Analysis: Explain the main reasons contributing to the prediction.
+ - Precautions & Solutions: Suggest preventive measures and potential treatments for the condition.
+
+ Instructions: Carefully analyze the provided image, input data, and ML model predictions to generate a clear and comprehensive report.
+
+ Input Data:
+ {result}
+
+ Output Report:
+ - Prediction Output: Provide the final classification as **Lumpy** or **Not Lumpy** based on your analysis.
+
+ - Key Observations: List the notable symptoms from the image and input data that influenced the classification.
+
+ - Cause Analysis: Explain the likely cause(s) contributing to this prediction, highlighting specific symptoms or environmental factors.
+
+ - Precautions & Solutions: Outline preventive measures to avoid the spread of Lumpy Skin Disease, and suggest possible treatments or care strategies to manage the condition.
+ """
+         return prompt
+
+     def inference(self, image, result):
+         try:
+             # Prepare and send the request to the LLM model
+             refined_prompt = self.prompt_template(result)
+             prompt = [{'role': 'user', 'parts': [image, refined_prompt]}]
+
+             response = self.model.generate_content(prompt)
+
+             if response.text:
+                 logger.info("LLM inference successful.")
+                 return response.text
+             else:
+                 logger.warning("LLM did not return any text.")
+                 raise APIError("LLM response is empty. Please check input format and prompt.")
+
+         except Exception as e:
+             log_exception(e, "Error during LLM inference in LLM class.")
+             raise APIError("Error during LLM inference. Check input data and model configuration.") from e
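For illustration, a minimal sketch of calling this class directly (the `LLM` class and `inference` signature come from `src/llm.py`; the image path and `result` string are hypothetical):

```python
from PIL import Image
from src.llm import LLM

llm = LLM()  # reads GEMINI_API_KEY from .env via src/config.py
image = Image.open("samples/cow.jpg")  # placeholder path
# `result` is normally built by src/predict.py from the model outputs.
report = llm.inference(image=image, result="**ML Model Prediction:** Lumpy")
print(report)
```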
src/logger.py ADDED
@@ -0,0 +1,20 @@
+ import os
+ import logging
+ from logging.handlers import RotatingFileHandler
+
+ # Create logs directory if it doesn't exist
+ os.makedirs("logs", exist_ok=True)
+
+ # Configure the logging
+ logging.basicConfig(
+     level=logging.INFO,
+     format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
+     handlers=[
+         RotatingFileHandler("logs/application.log", maxBytes=5*1024*1024, backupCount=3),  # Log rotation
+         logging.StreamHandler()  # Log to console as well
+     ]
+ )
+
+ # Function to get the logger
+ def get_logger(name):
+     return logging.getLogger(name)
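For illustration, a minimal sketch of how the other modules use this helper (the function comes from `src/logger.py`):

```python
from src.logger import get_logger

logger = get_logger(__name__)  # one module-level logger per file
logger.info("Message goes to logs/application.log and the console.")
```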
src/predict.py ADDED
@@ -0,0 +1,114 @@
+ import joblib
+ import numpy as np
+ import tensorflow as tf
+ from .llm import LLM
+ from .config import *
+ from .exception import log_exception, ModelLoadingError, PreprocessingError, PredictionError
+ from .logger import get_logger
+
+ logger = get_logger(__name__)
+
+ class DataPreprocessor:
+     def __init__(self):
+         try:
+             self.scaler = joblib.load(f"{MODELS_DIR}/scaler_object.joblib")
+             logger.info("DataPreprocessor initialized with scaler.")
+         except Exception as e:
+             log_exception(e, "Error loading scaler in DataPreprocessor.")
+             raise ModelLoadingError("Could not load the scaler for data preprocessing.") from e
+
+     def preprocess(self, input_data):
+         try:
+             scaled_data = self.scaler.transform(np.array(input_data).reshape(1, -1))
+             logger.info("Data preprocessing completed successfully.")
+             return scaled_data
+         except Exception as e:
+             log_exception(e, "Error in preprocessing data in DataPreprocessor.")
+             raise PreprocessingError("Preprocessing failed. Ensure input data format is correct.") from e
+
+
+ class ML_Model_Predictor:
+     def __init__(self):
+         try:
+             self.model = joblib.load(f"{MODELS_DIR}/randomforest_best_model.pkl")
+             logger.info("ML model loaded successfully.")
+         except Exception as e:
+             log_exception(e, "Failed to load ML model in ML_Model_Predictor.")
+             raise ModelLoadingError("Could not load ML model. Please check model path and format.") from e
+
+     def predict(self, preprocessed_data):
+         try:
+             prediction = self.model.predict(preprocessed_data)
+             logger.info("ML model prediction completed successfully.")
+             return prediction[0]
+         except Exception as e:
+             log_exception(e, "Error during ML model prediction in ML_Model_Predictor.")
+             raise PredictionError("Prediction failed. Ensure input data format is correct.") from e
+
+
+ class CNN_Model_Predictor:
+     def __init__(self):
+         try:
+             self.model = tf.keras.models.load_model(f"{MODELS_DIR}/mobilenet_lumpy_skin_model.h5")
+             logger.info("CNN model loaded successfully.")
+         except Exception as e:
+             log_exception(e, "Failed to load CNN model in CNN_Model_Predictor.")
+             raise ModelLoadingError("Could not load CNN model. Please check model path and format.") from e
+
+     def predict(self, image):
+         try:
+             image = image.resize((224, 224))
+             image_array = np.array(image) / 255.0
+             image_array = np.expand_dims(image_array, axis=0)
+             image_array = tf.keras.applications.mobilenet_v2.preprocess_input(image_array)
+             prediction = self.model.predict(image_array)
+             logger.info("CNN model prediction completed successfully.")
+             return np.argmax(prediction, axis=1)[0]
+         except Exception as e:
+             log_exception(e, "Error during CNN model prediction in CNN_Model_Predictor.")
+             raise PredictionError("CNN prediction failed. Check image input format.") from e
+
+
+ def prediction(image, longitude, latitude, cloud_cover, evapotranspiration, precipitation, min_temp, mean_temp, max_temp, vapour_pressure, wet_day_freq):
+     try:
+         # Initialize classes
+         preprocessor = DataPreprocessor()
+         ml_predictor = ML_Model_Predictor()
+         cnn_predictor = CNN_Model_Predictor()
+         llm = LLM()
+
+         # Prepare structured data input for ML model
+         structured_data = [longitude, latitude, cloud_cover, evapotranspiration, precipitation, min_temp, mean_temp, max_temp, vapour_pressure, wet_day_freq]
+         preprocessed_data = preprocessor.preprocess(structured_data)
+
+         # Get predictions from ML and CNN models
+         ml_prediction = ml_predictor.predict(preprocessed_data)
+         cnn_prediction = cnn_predictor.predict(image)
+
+         result = f"""
+ Lumpy Skin Disease Diagnostic Report:
+
+ **ML Model Prediction:** {'Lumpy' if ml_prediction == 1 else 'Not Lumpy'}
+ **CNN Model Prediction:** {'Lumpy' if cnn_prediction == 1 else 'Not Lumpy'}
+
+ **Input Data:**
+ - Longitude: {longitude}
+ - Latitude: {latitude}
+ - Monthly Cloud Cover: {cloud_cover}
+ - Potential EvapoTranspiration: {evapotranspiration}
+ - Precipitation: {precipitation}
+ - Minimum Temperature: {min_temp}
+ - Mean Temperature: {mean_temp}
+ - Maximum Temperature: {max_temp}
+ - Vapour Pressure: {vapour_pressure}
+ - Wet Day Frequency: {wet_day_freq}
+ """
+
+         # Generate LLM report
+         report = llm.inference(image=image, result=result)
+         logger.info("LLM report generated successfully.")
+         return report
+
+     except Exception as e:
+         log_exception(e, "Error in prediction function.")
+         raise PredictionError("Prediction function encountered an error. Check inputs and model paths.") from e
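For illustration, a minimal sketch exercising the structured-data path on its own (the class names come from `src/predict.py`; the feature values are hypothetical):

```python
from src.predict import DataPreprocessor, ML_Model_Predictor

features = [78.5, 11.2, 55.0, 120.0, 80.0, 21.0, 27.0, 33.0, 24.0, 9.0]  # ten inputs, in order
scaled = DataPreprocessor().preprocess(features)  # scales with the saved scaler_object.joblib
label = ML_Model_Predictor().predict(scaled)      # 1 -> 'Lumpy', 0 -> 'Not Lumpy'
print("Lumpy" if label == 1 else "Not Lumpy")
```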
src/utils.py ADDED
@@ -0,0 +1,39 @@
+ import site
+ import subprocess
+ from .config import *
+
+ def copy_builder():
+     try:
+         site_packages_path = site.getsitepackages()[0]
+         destination_path = os.path.join(site_packages_path, 'google/protobuf/internal/')
+         # Pass the argument list directly (no shell) so every argument reaches `cp`.
+         subprocess.run(["cp", "./builder.py", destination_path], check=True)
+     except Exception as e:
+         raise Exception(f"Failed to copy builder.py to site-packages: {e}")
+
+ css = """
+ .container {
+     max-width: 90%;
+ }
+
+ .title {
+     font-family: 'Trebuchet MS';
+     text-align: center;
+     font-size: 2em;
+     font-weight: bold;
+     padding: 10px 0;
+     background-color: #1f2121;
+ }
+
+ input, textarea {
+     background-color: #18191a;
+ }
+
+ #output-container {
+     max-height: 400px;
+     overflow-y: auto;
+     padding: 10px;
+     border: 1px solid #ccc;
+     border-radius: 5px;
+ }
+ """