Niharmahesh committed
Commit dda3ce3 · verified · 1 Parent(s): 80beea9

Update app.py

Files changed (1)
  1. app.py +151 -111
app.py CHANGED
@@ -77,17 +77,17 @@ def display_projects():
77
78    # Define tab titles
79    tab_titles = [
80  - "Squat Easy",
81  - "ASL Translator",
82  - "Face Recognition",
83  - "Stock Market Chatbot",
84  - "Twitter Trend Analysis",
85  - "Restaurant Recommendation System",
86  - "Bitcoin Lightning Path Optimization",
87  - "National Infrastructure Monitoring",
88    "Job Easz",
89  - "Prompt Easz",
90  - "Resume_Easz"
91    ]
92
93    # Create tabs
@@ -95,145 +95,185 @@ def display_projects():
95
96    # Add content to each tab
97    with tabs[0]:
98  - st.header("Squat Easy")
99    st.markdown("""
100 - - **Description**: Engineered a custom BiLSTM architecture in PyTorch with extensive hyperparameter tuning, achieving 81% training and 75% test accuracy in classifying six types of squatting errors from video data. Optimized through data augmentation and CUDA-based GPU acceleration.
101 - - **Technologies Used**: PyTorch, Object-Oriented Programming (OOP), GitHub
102 - - **Reference**: [Link to Project](https://github.com/niharpalem/squateasy_DL)
103   """)
104
105   with tabs[1]:
106 - st.header("ASL Translator")
107   st.markdown("""
108 - - **Description**: Crafted a ASL translation system using MediaPipe for point detection and trained a Random Forest Model , achieving 95% accuracy in real-time gesture interpretation. Implemented an adaptive hand skeleton GIF generator for intuitive visual representation.
109 - - **Technologies Used**: MediaPipe Hand Detection, Hugging Face Platform
110 - - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/slr-easz)
111   """)
112
113   with tabs[2]:
114 - st.header("Face Recognition")
115   st.markdown("""
116 - - **Description**: Integrated fiducial point features, CNN-extracted image features, and Siamese networks in TensorFlow, attaining 85% test accuracy for facial recognition. Optimized for real-world security applications by balancing computational efficiency with accuracy.
117 - - **Technologies Used**: TensorFlow, Siamese Networks
118 - - **Reference**: [Link to Project](#)
119   """)
120
121   with tabs[3]:
122 - st.header("Stock Market Chatbot")
123   st.markdown("""
124 - - **Description**: Architected a multilingual stock analysis system using Apache Spark and a custom LLM, boosting query performance by 25% over traditional SQL approaches. Interfaced with Snowflake for efficient financial data retrieval and real-time insights in English and Chinese.
125 - - **Technologies Used**: PySpark for Querying, Data Warehousing with Redshift and Snowflake
126 - - **Reference**: [Link to Project](#)
127   """)
128
129   with tabs[4]:
130 - st.header("Twitter Trend Analysis")
131   st.markdown("""
132 - - **Description**: Engineered an ELT pipeline using GCP's BigQuery for Twitter data processing and sentiment analysis. Integrated Tableau for live, interactive dashboards, showcasing advanced cloud data engineering skills and cost-effective data storage solutions.
133 - - **Technologies Used**: Apache Airflow for Automation, Docker, Tableau for Dashboards
134 - - **Reference**: [Link to Project](#)
135   """)
136
137   with tabs[5]:
138 - st.header("Restaurant Recommendation System")
139   st.markdown("""
140 - - **Description**: Engineered a recommendation engine using collaborative and content-based filtering, achieving a 15% accuracy increase. Constructed a Flask web app with Folium integration for an interactive, location-based restaurant suggestion interface.
141 - - **Technologies Used**: Collaborative Filtering, Content-Based Filtering
142 - - **Reference**: [Link to Project](#)
143   """)
144
145   with tabs[6]:
146 - st.header("Bitcoin Lightning Path Optimization")
147   st.markdown("""
148 - - **Description**: Implemented a graph-based algorithm to optimize payment routing in the Bitcoin Lightning Network. Created a simulation environment to validate improved efficiency in multi-channel transactions under various network conditions.
149 - - **Technologies Used**: Graph-Based Algorithms, Simulation Environments
150 - - **Reference**: [Link to Project](#)
151   """)
152
153   with tabs[7]:
154 - st.header("National Infrastructure Monitoring")
155   st.markdown("""
156 - - **Description**: Utilized satellite imagery to detect changes between different time frames by fine-tuning a Change ViT model. Developed a UI where users can draw bounding boxes on a Python map library; these coordinates are used in Google Earth Engine (GEE) to extract Sentinel-2 imagery. Users can select the resolution of images for caching. The processing function includes contrast adjustments and automatic image chipping as the model requires 256x256 inputs, generating change masks effectively.
157 - - **Technologies Used**: Satellite Imagery Analysis, Change ViT Model, Google Earth Engine (GEE), Python Map Libraries
158 - - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/Data298)
159   """)
160   with tabs[8]:
161 - st.header("JobEasz: Data Job Search Platform")
162   st.markdown("""
163 - - **Description**: Developed a comprehensive job aggregation platform for data-related roles, leveraging advanced data engineering techniques to streamline the job search process. The platform unifies listings from major job sites, provides daily updates, and offers real-time market trend visualizations.
164 - - **Technologies Used**: Python for Web Scraping, PyArrow & Pandas for Data Processing, Hugging Face Datasets for Storage, Streamlit for Frontend, Plotly for Visualization
165    - **Key Features**:
166 - Daily updates with ~3000 jobs across various data roles in the US
167 - Custom filtering by role and location
168 - Direct application links to original postings
169 - Automated ETL pipeline for data extraction and processing
170 - • Cloud-based storage for scalability and performance
171 - - **Data Engineering Highlights**:
172 - • Efficient data parsing and processing using PyArrow
173 - • Scalable data storage and retrieval with Hugging Face Datasets
174 - • Real-time data transformation and interactive visualizations
175 - - **Skills**: Web Scraping, Python, Streamlit, Hugging Face, ETL, Data Cleaning, Dashboard Building
176 - - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/job_easz)
177   """)
178   with tabs[9]:
179 - st.header("Prompt Easz")
180   st.markdown("""
181 - - **Description**: Developed a prompt converter using sentence transformers to tailor prompts based on predefined styles (e.g., ELI5, chain of thoughts, role play). The tool aims to make learning new topics less overwhelming by providing customized prompts.
182 - - **Technologies Used**: Sentence Transformers, Cosine Similarity, Vector Embeddings, Google's all-MiniLM-L6-v2 model
183 - - **Key Features**:
184 - - Converts user prompts to vector embeddings
185 - - Suggests updated prompts based on similarity to predefined styles
186 - - Offers 10 main prompt types with 3-4 variations each
187 - - Covers 10 predefined categories
188 - - Allows users to customize prompt strength and style
189 - - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/PromptEasz)
190 - """)
191   with tabs[10]:
192 - st.header("Resume Easz")
193   st.markdown("""
194 - **Resume Easz**: An LLM-powered application for resume analysis and enhancement
195 -
196 - Resume Easz is an innovative AI-driven tool that analyzes and enhances resumes based on job descriptions. Leveraging the power of prompt engineering and the GROQ API, it utilizes the versatile Llama 3.3 model to provide comprehensive resume analysis and improvement.
197 -
198 - **Key Features**:
199 - - Upload DOCX resumes for analysis
200 - - Input job descriptions for targeted optimization
201 - - Quick or in-depth analysis options
202 - - AI-enhanced resume generation
203 - - Multiple output formats (DOCX, HTML, TXT)
204 -
205 - **Analysis Types**:
206 - 1. **Quick Analysis**:
207 - - Skills Match: Top 5 required skills with proficiency ratings
208 - - Experience Alignment: Comparison of required vs. demonstrated experience
209 - - Pros and Cons: Top 3 strengths and areas for improvement
210 - - Match Percentage: Overall compatibility score with explanation
211 - 2. **In-Depth Analysis**:
212 - - Comprehensive Skill Gap Analysis
213 - - Detailed Experience and Impact Analysis
214 - - Content Enhancement Recommendations
215 - - Strategic Recommendations for Competitive Edge
216 - - Application Strategy Suggestions
217 - 3. **Resume Enhancement**:
218 - - Optimizes content based on analysis
219 - - Improves formatting and structure
220 - - Highlights key achievements and skills
221 - - Ensures ATS compatibility
222 -
223 - **User Experience**:
224 - - Intuitive Streamlit interface
225 - - Visual diff to highlight resume changes
226 - - Multiple download options (DOCX, HTML, TXT)
227 -
228 - **Limitations**:
229 - - GROQ API token limit (100,000 tokens per model)
230 - - Potential wait times for API rate limits
231 -
232 - This powerful tool streamlines the job application process by providing tailored resume optimization, increasing candidates' chances of success in their job search.
233 - - **Reference**: [Link to Project](https://resume-easz.streamlit.app/)
234 - """)
235 -
236 -
237   def display_skills():
238   st.markdown('## Skills')
239   st.write("""

77
78    # Define tab titles
79    tab_titles = [
80  + "Resume & CV Crafter",
81  + "Multi-Agent Job Search",
82  + "Resume Easz",
83    "Job Easz",
84  + "Bitcoin Lightning Optimization",
85  + "National Infrastructure Monitoring",
86  + "Stock Market Analysis",
87  + "Twitter Trend Analysis",
88  + "Restaurant Recommendation",
89  + "ASL Translator",
90  + "Squat Easy"
91    ]
92
93    # Create tabs

95
96    # Add content to each tab
97    with tabs[0]:
98  + st.header("LLM-powered Resume & CV Crafter")
99    st.markdown("""
100 + - **Description**: Developed an AI platform combining LLaMA-3 70B and Deepseek R1 with low-temperature settings for stable, tailored resume and CV generation
101 + - **Key Features**:
102 + • Smart Matching Algorithm analyzing profiles against job requirements
103 + • LaTeX-Powered Resumes with professional formatting
104 + • Automated 4-paragraph Cover Letter Generation
105 + • Performance Metrics evaluating match quality
106 + - **Technical Achievements**:
107 + • Implemented dual-agent architecture: LLaMA-3 8B for profile analysis and 70B for LaTeX generation
108 + • Engineered JSON schema validation system for error-free template integration
109 + • Achieved 5,000+ LinkedIn impressions with 80% reduction in creation time
110 + - **Technologies**: Streamlit, GROQ API (LLaMA-3 70B), LaTeX, JSON Schema
111 + - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/Resume_and_CV_crafter)
112   """)
113
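
The JSON schema validation step called out under Technical Achievements can be pictured with a short sketch. This is illustrative only, assuming a jsonschema-based check: the schema fields and the parse_and_validate helper are hypothetical, not the app's actual schema or code; only the validate-before-templating idea comes from the description above.

```python
# Hypothetical sketch: validate LLM output against a JSON schema before it
# reaches the LaTeX template. Field names are illustrative assumptions.
import json
from jsonschema import validate, ValidationError  # pip install jsonschema

RESUME_SCHEMA = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "skills": {"type": "array", "items": {"type": "string"}},
        "experience": {"type": "array", "items": {"type": "object"}},
    },
    "required": ["name", "skills", "experience"],
}

def parse_and_validate(llm_response: str) -> dict:
    """Parse the model's JSON reply and reject anything that would break templating."""
    data = json.loads(llm_response)
    try:
        validate(instance=data, schema=RESUME_SCHEMA)
    except ValidationError as err:
        raise ValueError(f"LLM output failed schema check: {err.message}") from err
    return data
```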
 
114   with tabs[1]:
115 + st.header("Multi-Agent Job Search System")
116   st.markdown("""
117 + - **Description**: Built an AI-powered job search assistant using a dual-LLaMA architecture for comprehensive job matching and analysis
118 + - **Key Features**:
119 + • Real-time scraping across LinkedIn, Glassdoor, Indeed, ZipRecruiter
120 + • Advanced resume parsing and job matching
121 + • Intelligent compatibility scoring system
122 + - **Technical Achievements**:
123 + • Developed batch processing pipeline handling 60+ positions/search
124 + • Reduced job search time by 80% through accurate matching
125 + • Implemented specialized agents for input processing, scraping, and analysis
126 + - **Technologies**: GROQ API, jobspy, Streamlit, Pandas, LLMOps
127 + - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/Multi_Agent_Job_search_and_match)
128   """)
129
130   with tabs[2]:
131 + st.header("Resume Easz")
132   st.markdown("""
133 + - **Description**: Created an AI-driven resume analysis and enhancement tool using the LLaMA 3.3 model
134 + - **Key Features**:
135 + • Quick and in-depth resume analysis options
136 + • Comprehensive skill gap analysis
137 + • ATS compatibility optimization
138 + • Multiple output formats (DOCX, HTML, TXT)
139 + - **Technical Implementation**:
140 + • Integrated GROQ API for advanced language processing
141 + • Built visual diff system for resume changes
142 + • Developed custom prompt engineering pipeline
143 + - **Technologies**: GROQ API, Streamlit, Python, LLM
144 + - **Reference**: [Link to Project](https://resume-easz.streamlit.app/)
145   """)
146
147   with tabs[3]:
148 + st.header("Job Easz")
149   st.markdown("""
150 + - **Description**: Engineered a comprehensive job aggregation platform for data roles with advanced analytics
151 + - **Technical Achievements**:
152 + • Designed Airflow pipeline with exponential backoff retry (120-480s intervals)
153 + • Optimized concurrent processing, reducing runtime from 2h to 40min
154 + • Processes ~3000 daily job listings across various data roles
155 + - **Key Features**:
156 + • Daily updates with comprehensive job role coverage
157 + • Custom filtering by role and location
158 + • Interactive dashboard for market trends
159 + • Automated ETL pipeline
160 + - **Technologies**: Python, Airflow, ThreadPoolExecutor, Hugging Face Datasets
161 + - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/job_easz)
162   """)
163
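
A rough sketch of the retry and concurrency pattern described for Job Easz follows, assuming Airflow's built-in exponential-backoff retry settings and a ThreadPoolExecutor fan-out; scrape_board, the worker count, and the board list are placeholders, not the project's real code.

```python
# Illustrative pattern only: bounded exponential backoff plus threaded fan-out.
from concurrent.futures import ThreadPoolExecutor
from datetime import timedelta

# Airflow-style retry knobs: start at 120 s, back off exponentially, cap at 480 s.
default_args = {
    "retries": 3,
    "retry_delay": timedelta(seconds=120),
    "retry_exponential_backoff": True,
    "max_retry_delay": timedelta(seconds=480),
}

def scrape_board(board: str) -> list[dict]:
    """Placeholder for a per-board scraping call."""
    return []

def collect_listings(boards: list[str]) -> list[dict]:
    # Fan the boards out across worker threads instead of scraping them serially.
    with ThreadPoolExecutor(max_workers=8) as pool:
        batches = list(pool.map(scrape_board, boards))
    return [job for batch in batches for job in batch]
```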
 
164   with tabs[4]:
165 + st.header("Bitcoin Lightning Path Optimization")
166   st.markdown("""
167 + - **Description**: Advanced payment routing optimization system for the Bitcoin Lightning Network
168 + - **Technical Achievements**:
169 + • Developed ML classifiers achieving 98.77-99.10% accuracy
170 + • Implemented tri-model consensus system for optimal routing
171 + • Engineered ensemble models with 0.98 F1-scores
172 + - **Implementation Details**:
173 + • Created simulation environment for multi-channel transactions
174 + • Optimized graph-based algorithms for payment routing
175 + • Integrated with Lightning payment interceptor
176 + - **Technologies**: XGBoost, Random Forest, AdaBoost, Graph Algorithms
177   """)
178
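
The tri-model consensus mentioned above could look roughly like the following, assuming a hard-voting ensemble over the three named model families; the estimators' hyperparameters and the training data are placeholders.

```python
# Sketch of a three-model consensus: a route is accepted only when the
# majority of XGBoost, Random Forest, and AdaBoost agree.
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier, VotingClassifier
from xgboost import XGBClassifier  # pip install xgboost

consensus = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200)),
        ("ada", AdaBoostClassifier()),
        ("xgb", XGBClassifier(eval_metric="logloss")),
    ],
    voting="hard",  # majority vote across the three classifiers
)
# consensus.fit(X_train, y_train); consensus.predict(candidate_route_features)
```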
 
179   with tabs[5]:
180 + st.header("National Infrastructure Monitoring")
181   st.markdown("""
182 + - **Description**: Developed a satellite imagery analysis system for infrastructure change detection
183 + - **Technical Achievements**:
184 + • Fine-tuned ViT+ResNet-101 ensemble on 40GB satellite dataset
185 + • Achieved 85% accuracy in change detection
186 + • Implemented 8 parallel GPU threads for enhanced performance
187 + - **Key Features**:
188 + • Temporal analysis with 1km resolution
189 + • Interactive map interface with bounding box selection
190 + • Automatic image chipping for 256x256 inputs
191 + • Contrast adjustment optimization
192 + - **Technologies**: Change ViT Model, Google Earth Engine, PyTorch, Computer Vision
193 + - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/Data298)
194   """)
195
196   with tabs[6]:
197 + st.header("Stock Market Analysis with OpenAI Integration")
198   st.markdown("""
199 + - **Description**: Created a comprehensive stock market analysis system with multilingual capabilities
200 + - **Technical Achievements**:
201 + • Built Spark streaming pipeline with 30% efficiency improvement
202 + • Orchestrated Airflow Docker pipeline for Snowflake integration
203 + • Developed bilingual GPT-3.5 chatbot for SQL query generation
204 + - **Key Features**:
205 + • Real-time financial metric calculations
206 + • Custom indicator generation
207 + • Multilingual query support
208 + • Automated data warehousing
209 + - **Technologies**: PySpark, Apache Airflow, Snowflake, OpenAI GPT-3.5
210   """)
211
212   with tabs[7]:
213 + st.header("Twitter Trend Analysis")
214   st.markdown("""
215 + - **Description**: Engineered a comprehensive Twitter analytics platform using GCP services
216 + - **Technical Achievements**:
217 + • Developed GCP pipeline processing 40k tweets
218 + • Achieved 40% efficiency improvement through custom Airflow operators
219 + • Implemented real-time trend analysis algorithms
220 + - **Key Features**:
221 + • Automated ETL workflows
222 + • Interactive Tableau dashboards
223 + • Viral metrics tracking
224 + • Engagement rate calculations
225 + - **Technologies**: Google Cloud Platform, BigQuery, Apache Airflow, Tableau
226   """)
227 +
228   with tabs[8]:
229 + st.header("Restaurant Recommendation System")
230   st.markdown("""
231 + - **Description**: Built a hybrid recommendation system combining multiple filtering approaches
232 + - **Technical Achievements**:
233 + • Created hybrid TF-IDF and SVD-based filtering system
234 + • Achieved 43% improvement in recommendation relevance
235 + • Reduced computation time by 65%
236    - **Key Features**:
237 + • Location-based suggestions
238 + • Personalized recommendations
239 + • Interactive web interface
240 + • Efficient matrix factorization
241 + - **Technologies**: Collaborative Filtering, Content-Based Filtering, Flask, Folium
242   """)
243 +
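
One plausible reading of the hybrid TF-IDF and SVD filtering is sketched below; the data layout, the hybrid_scores helper, and the 50/50 blend weight are assumptions for illustration, not the deployed system.

```python
# Hypothetical hybrid scorer: blend content-based (TF-IDF) similarity with a
# collaborative (truncated SVD) reconstruction of the ratings matrix.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def hybrid_scores(descriptions: list[str], ratings: np.ndarray,
                  user_idx: int, query: str) -> np.ndarray:
    # Content side: similarity between the query and each restaurant description.
    tfidf = TfidfVectorizer(stop_words="english")
    doc_vecs = tfidf.fit_transform(descriptions)
    content = cosine_similarity(tfidf.transform([query]), doc_vecs).ravel()

    # Collaborative side: low-rank reconstruction of the user x restaurant matrix.
    svd = TruncatedSVD(n_components=min(20, ratings.shape[1] - 1))
    user_factors = svd.fit_transform(ratings)
    collab = user_factors[user_idx] @ svd.components_

    # Equal blend of the two signals; in practice both would be normalised
    # and the weight tuned.
    return 0.5 * content + 0.5 * collab
```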
244   with tabs[9]:
245 + st.header("ASL Translator")
246   st.markdown("""
247 + - **Description**: Developed a real-time American Sign Language translation system
248 + - **Technical Achievements**:
249 + • Achieved 95% accuracy in real-time gesture interpretation
250 + • Implemented adaptive hand skeleton GIF generator
251 + • Optimized MediaPipe integration for point detection
252 + - **Key Features**:
253 + • Real-time hand tracking
254 + • Visual feedback system
255 + • Intuitive gesture recognition
256 + • Accessible interface
257 + - **Technologies**: MediaPipe Hand Detection, Random Forest, Hugging Face Platform
258 + - **Reference**: [Link to Project](https://huggingface.co/spaces/Niharmahesh/slr-easz)
259 + """)
260 +
261   with tabs[10]:
262 + st.header("Squat Easy")
263   st.markdown("""
264 + - **Description**: Developed a deep learning system for squat form analysis and error detection
265 + - **Technical Achievements**:
266 + • Engineered custom BiLSTM architecture in PyTorch
267 + • Achieved 81% training and 75% test accuracy
268 + • Implemented CUDA-based GPU acceleration
269 + - **Key Features**:
270 + • Real-time form analysis
271 + • Six-type error classification
272 + • Video processing pipeline
273 + • Performance optimization
274 + - **Technologies**: PyTorch, BiLSTM, CUDA, Object-Oriented Programming
275 + - **Reference**: [Link to Project](https://github.com/niharpalem/squateasy_DL)
276 + """)
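
For orientation, a minimal BiLSTM classifier of the kind described (six squat-error classes, CUDA when available) might be shaped like this; the input feature dimension and layer sizes are illustrative, not the trained model's.

```python
# Illustrative BiLSTM over per-frame pose features extracted from video.
import torch
import torch.nn as nn

class SquatBiLSTM(nn.Module):
    def __init__(self, n_features: int = 34, hidden: int = 128, n_classes: int = 6):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(hidden * 2, n_classes)  # *2 for both directions

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, frames, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # classify from the final time step

model = SquatBiLSTM().to("cuda" if torch.cuda.is_available() else "cpu")
```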
277   def display_skills():
278   st.markdown('## Skills')
279   st.write("""