I understand your concern. You're expecting real-time DRS (Decision Review System) output like the third-umpire replays in international T20 matches, with precise ball tracking, pitch analysis, and wicket-hitting prediction. The current output, while functional, falls short of that professional standard in accuracy, visualization, and real-time capability. Let's break down the gap between your expectations and the current output, then outline how to achieve a more professional DRS-like system.

Understanding Your Expectation
In international T20 matches, the DRS (e.g., Hawk-Eye) provides:

Real-Time Ball Tracking:
High-speed cameras (e.g., 6–8 cameras at 500+ fps) track the ball from the bowler’s release to impact.
A smooth 3D trajectory is shown, often as a curved path, accounting for swing, spin, and bounce.
Pitch Analysis:
The system identifies the exact pitching point (where the ball bounces) and checks if it’s in line with the stumps (inside, outside leg, or outside off).
A 2D or 3D pitch map shows the pitching point relative to the stumps.
Wicket-Hitting Prediction:
The system predicts the ball’s path after impact (e.g., hitting the pad) to determine if it would hit the stumps.
A 3D visualization shows the projected path, often with a “stump cam” view indicating whether the ball would hit, miss, or clip the stumps.
Detailed Decision:
The DRS provides a step-by-step breakdown: pitching (inline/outside), impact (inline/outside), and wicket-hitting (yes/no); a minimal sketch of this decision logic follows this list.
A final decision ("Out" or "Not Out") is shown, and professional systems claim very high accuracy.
Professional Visualization:
Smooth animations of the ball’s path (curved, not linear).
3D renderings of the pitch, stumps, and trajectory.
Clear annotations (e.g., “Ball Pitching: Inline,” “Impact: Inline,” “Wickets: Hitting”).
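
To make the step-by-step breakdown concrete, here is a minimal sketch of the three-stage decision logic. The status strings match the ones the updated app produces below; real DRS also applies "umpire's call" margins, which are omitted here:

python
# Minimal sketch of the three-stage LBW protocol described above. The stage
# outcomes are assumed inputs; "umpire's call" margins are omitted.
def lbw_decision(pitching: str, impact: str, wicket: str) -> str:
    if pitching == "Outside Leg":   # pitched outside leg stump: never out
        return "Not Out"
    if impact != "Inline":          # struck outside the line (ignoring no-shot cases)
        return "Not Out"
    if wicket != "Hitting":         # projected path misses the stumps
        return "Not Out"
    return "Out"

print(lbw_decision("Inline", "Inline", "Hitting"))       # Out
print(lbw_decision("Outside Leg", "Inline", "Hitting"))  # Not Out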
The current app output, while a starting point, has these shortcomings:

Simplified Tracking: It uses basic OpenCV to detect the ball in a pre-uploaded video, not real-time camera feeds.
Linear Trajectories: The red (actual) and blue (projected) paths are linear, not curved, and don’t account for swing or spin.
Basic Pitch Analysis: Pitching and impact points are estimated roughly, not precisely detected.
2D Visualization: The visualization is a flat 2D canvas, lacking the 3D depth and smoothness of professional DRS.
Dummy ML Model: The LBW decision uses a simplistic logistic regression model, not a robust ML system trained on real cricket data.
Not Real-Time: The app processes uploaded videos, not live camera feeds.
How to Achieve a Real-Time DRS-Like System
To meet your expectations, we need to upgrade the app to handle real-time ball tracking, accurate pitch analysis, wicket-hitting prediction, and professional-grade visualization. Here’s a step-by-step plan, followed by updated code to move closer to your goal.

Key Requirements for a DRS-Like System
Real-Time Camera Integration:
Use multiple high-speed cameras (e.g., 120–240 fps) to capture the ball’s movement live.
Process camera feeds in real-time to track the ball.
Note: Hugging Face Spaces cannot access cameras (no webcam support), so this must be done locally or on a server with camera access; a minimal local-capture sketch follows this list.
Advanced Ball Tracking:
Use computer vision (OpenCV) to detect the ball in each frame.
Apply trajectory smoothing (e.g., spline interpolation) to create a curved path.
Detect swing (lateral deviation) and spin (rotation).
Pitch and Impact Detection:
Identify the pitching point by detecting the ball’s bounce (sudden change in y-coordinate).
Detect the impact point by identifying when the ball stops (e.g., hits the pad, often with a sudden slowdown).
Use pitch markings (e.g., creases, stumps) to determine if pitching/impact is inline.
Wicket-Hitting Prediction:
Model the ball's physics (swing, spin, bounce) to predict the post-impact path (a hedged projection sketch appears after the phase plan below).
Use an ML model to refine predictions based on historical data.
Professional Visualization:
Use a 3D rendering library (e.g., Three.js) for realistic trajectory visualization.
Show a detailed breakdown (pitching, impact, wicket-hitting) with annotations.
Robust ML Model:
Train a deep learning model (e.g., CNN) on real cricket video data to predict LBW outcomes.
Host the model on Hugging Face Model Hub for inference.
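
As a first step toward the real-time path (which, per the note above, must run locally), the sketch below reads frames from a single consumer webcam with OpenCV and applies the same HSV-based ball detection used later in app.py. Camera index 0 and the red-ball HSV bounds are assumptions you will need to tune:

python
# Hedged sketch of local, single-camera real-time tracking. Assumptions:
# camera index 0 and HSV bounds for a red ball; both need tuning for your
# hardware and lighting. A starting point, not production DRS.
import cv2

cap = cv2.VideoCapture(0)  # 0 = default webcam; replace with your camera index/URL
track = []                 # normalized (x, y) ball centers, newest last

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    h, w = frame.shape[:2]
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))  # red-ball range (tune)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        x, y, bw, bh = cv2.boundingRect(c)
        track.append(((x + bw / 2) / w, (y + bh / 2) / h))
        cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()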
Challenges
Hardware: Professional DRS uses 6–8 high-speed cameras ($100,000+ setup). For a playground, 2–4 consumer cameras (e.g., GoPro, $500–$2,000 each) can work but reduce accuracy.
Real-Time Processing: Processing multiple camera feeds in real-time requires significant compute power (e.g., GPU server). Hugging Face Spaces (free tier: 2 vCPUs, 8GB RAM) can’t handle this; you’ll need a local setup or cloud server (e.g., AWS).
Data: Training an ML model for LBW requires labeled cricket video data (scarce publicly). You may need to collect and annotate your own dataset.
Visualization: 3D rendering (e.g., with Three.js) is more complex than the current 2D Canvas and requires additional setup.
Plan to Achieve DRS-Like Output
Since real-time camera integration and 3D visualization are complex, let’s break this into phases. For now, we’ll enhance the app to:

Improve ball tracking with smoother trajectories.
Add detailed LBW analysis (pitching, impact, wicket-hitting breakdown).
Upgrade visualization to show a more professional 2D output (we’ll add 3D later).
Provide guidance for real-time setup (local or cloud).
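
The updated code still projects a straight post-impact path. As a preview of the deferred swing/spin work (see the Wicket-Hitting Prediction item above), here is a hedged sketch of a quadratic projection in the app's normalized coordinates, where `swing` is a made-up lateral-drift coefficient rather than a calibrated physical model:

python
# Hedged sketch of a swing-aware replacement for the linear projected_path.
# `swing` is a hypothetical lateral-drift coefficient, not calibrated physics;
# x bends quadratically while y advances to the stump line (y = 1.0).
import numpy as np

def project_path(impact_x, impact_y, swing=0.05, n=20):
    t = np.linspace(0.0, 1.0, n)          # progress along the remaining flight
    ys = impact_y + t * (1.0 - impact_y)  # straight advance toward the stumps
    xs = impact_x + swing * t ** 2        # lateral drift grows with distance
    return [{"x": float(x), "y": float(y)} for x, y in zip(xs, ys)]

# Example: impact at (0.52, 0.7) with mild drift toward off
print(project_path(0.52, 0.7)[-1])  # final x is 0.57 at the stump line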
Updated Code
app.py
This version improves ball tracking, adds detailed LBW analysis, and prepares for real-time integration.

python

from flask import Flask, render_template, request, jsonify
import numpy as np
from sklearn.linear_model import LogisticRegression
import cv2
import os
from werkzeug.utils import secure_filename
from scipy.interpolate import splprep, splev  # For smooth trajectory

app = Flask(__name__)

# Configure upload folder
UPLOAD_FOLDER = '/tmp/uploads'
os.makedirs(UPLOAD_FOLDER, exist_ok=True)
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER
ALLOWED_EXTENSIONS = {'mp4', 'avi', 'mov'}

# Dummy ML model for LBW decision (to be replaced with a real model)
def train_dummy_model():
    X = np.array([
        [0.5, 0.0, 0.4, 0.5, 30, 0],  # Not Out
        [0.5, 0.5, 0.5, 0.5, 35, 2],  # Out
        [0.6, 0.2, 0.5, 0.6, 32, 1],  # Not Out
        [0.5, 0.4, 0.5, 0.4, 34, 0],  # Out
    ])
    y = np.array([0, 1, 0, 1])
    model = LogisticRegression()
    model.fit(X, y)
    return model

model = train_dummy_model()

def allowed_file(filename):
    return '.' in filename and filename.rsplit('.', 1)[1].lower() in ALLOWED_EXTENSIONS

def smooth_trajectory(points):
    # Fit a parametric spline through the detected centers; fall back to the
    # raw points when there are too few samples for a cubic fit (splprep
    # needs more points than the spline degree k=3)
    if len(points) < 4:
        return points
    x = [p["x"] for p in points]
    y = [p["y"] for p in points]
    tck, u = splprep([x, y], s=0)
    u_new = np.linspace(0, 1, 50)  # resample to 50 evenly spaced points
    x_new, y_new = splev(u_new, tck)
    # Cast numpy scalars to plain floats so the result is JSON-serializable
    return [{"x": float(xi), "y": float(yi)} for xi, yi in zip(x_new, y_new)]

def process_video(video_path):
    cap = cv2.VideoCapture(video_path)
    if not cap.isOpened():
        return None, None, "Failed to open video"

    actual_path = []
    frame_count = 0
    spin = 0
    last_point = None
    pitching_detected = False
    impact_detected = False
    y_positions = []

    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break

        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))  # Adjust for your ball color
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

        if contours:
            c = max(contours, key=cv2.contourArea)
            x, y, w, h = cv2.boundingRect(c)
            center_x = x + w / 2
            center_y = y + h / 2
            # Normalize by the actual frame size instead of assuming 1280x720
            frame_h, frame_w = frame.shape[:2]
            norm_x = center_x / frame_w
            norm_y = center_y / frame_h
            current_point = (norm_x, norm_y)

            if last_point != current_point:
                actual_path.append({"x": norm_x, "y": norm_y})
                y_positions.append(norm_y)
                last_point = current_point

            # Detect pitching: the bounce is a local maximum in image-space y
            # (y grows downward, so the ball descends, peaks, then rises)
            if len(y_positions) > 2 and not pitching_detected:
                if y_positions[-2] > y_positions[-3] and y_positions[-1] < y_positions[-2]:
                    pitching_detected = True
                    pitching_x = actual_path[-2]["x"]
                    pitching_y = actual_path[-2]["y"]

            # Detect impact (sudden slowdown or stop)
            if len(actual_path) > 2 and not impact_detected:
                speed_current = abs(y_positions[-1] - y_positions[-2])
                speed_prev = abs(y_positions[-2] - y_positions[-3])
                if speed_current < speed_prev * 0.3:  # Significant slowdown
                    impact_detected = True
                    impact_x = actual_path[-1]["x"]
                    impact_y = actual_path[-1]["y"]

        frame_count += 1
        if frame_count > 50:  # Cap processing at 50 frames to bound latency
            break

    # Read the FPS before releasing the capture (cap.get returns 0 afterwards)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30
    cap.release()

    if not actual_path:
        return None, None, "No ball detected in video"

    if not pitching_detected:
        pitching_x = actual_path[len(actual_path)//2]["x"]
        pitching_y = actual_path[len(actual_path)//2]["y"]

    if not impact_detected:
        impact_x = actual_path[-1]["x"]
        impact_y = actual_path[-1]["y"]

    speed = (len(actual_path) / (frame_count / fps)) * 0.5  # rough, uncalibrated speed proxy

    # Smooth the actual path
    actual_path = smooth_trajectory(actual_path)

    # Projected path (linear for now; spin is currently always 0, so the
    # path is vertical until swing/spin modeling is added)
    projected_path = [
        {"x": impact_x, "y": impact_y},
        {"x": impact_x + spin * 0.1, "y": 1.0}
    ]

    # Determine pitching and impact status (0.4-0.6 is the assumed stump
    # corridor; leg side is taken as the low-x side, i.e. a right-hander)
    pitching_status = "Inline" if 0.4 <= pitching_x <= 0.6 else "Outside Leg" if pitching_x < 0.4 else "Outside Off"
    impact_status = "Inline" if 0.4 <= impact_x <= 0.6 else "Outside"
    wicket_status = "Hitting" if 0.4 <= projected_path[-1]["x"] <= 0.6 else "Missing"

    return actual_path, projected_path, pitching_x, pitching_y, impact_x, impact_y, speed, spin, pitching_status, impact_status, wicket_status

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/analyze', methods=['POST'])
def analyze():
    if 'video' not in request.files:
        return jsonify({'error': 'No video uploaded'}), 400

    file = request.files['video']
    if file.filename == '' or not allowed_file(file.filename):
        return jsonify({'error': 'Invalid file'}), 400

    filename = secure_filename(file.filename)
    video_path = os.path.join(app.config['UPLOAD_FOLDER'], filename)
    file.save(video_path)

    result = process_video(video_path)
    if result[0] is None:
        os.remove(video_path)
        return jsonify({'error': result[2]}), 400

    actual_path, projected_path, pitching_x, pitching_y, impact_x, impact_y, speed, spin, pitching_status, impact_status, wicket_status = result

    features = np.array([[pitching_x, pitching_y, impact_x, impact_y, speed, spin]])
    prediction = model.predict(features)[0]
    confidence = float(min(model.predict_proba(features)[0][prediction], 0.99))  # cast for JSON
    decision = "Out" if prediction == 1 else "Not Out"

    os.remove(video_path)

    return jsonify({
        'actual_path': actual_path,
        'projected_path': projected_path,
        'decision': decision,
        'confidence': round(confidence, 2),
        'pitching': {'x': pitching_x, 'y': pitching_y, 'status': pitching_status},
        'impact': {'x': impact_x, 'y': impact_y, 'status': impact_status},
        'wicket': wicket_status
    })

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=7860, debug=False)  # keep debug mode off when exposed
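
Once the app is running (locally or on a Space), you can exercise the /analyze endpoint with a minimal client like the one below; sample.mp4 is a placeholder for any short clip of a single delivery:

python
# Minimal client for the /analyze endpoint; assumes the server is reachable
# at localhost:7860 and that sample.mp4 exists (placeholder file name).
import requests

with open("sample.mp4", "rb") as f:
    resp = requests.post(
        "http://localhost:7860/analyze",
        files={"video": ("sample.mp4", f, "video/mp4")},
    )

data = resp.json()
print(data.get("decision"), data.get("confidence"))
print("pitching:", data.get("pitching"), "| impact:", data.get("impact"), "| wicket:", data.get("wicket"))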