---
title: box-metrics
tags:
- evaluate
- metric
description: Built upon YOLOv5 IoU functions; outputs metrics regarding box fit
sdk: gradio
sdk_version: 5.6.0
app_file: app.py
pinned: false
emoji: 🕵️
---
# SEA-AI/box-metrics
This Hugging Face metric uses `seametrics.detection.PrecisionRecallF1Support` under the hood to compute COCO-like metrics for object detection tasks. It is a [modified cocoeval.py](https://github.com/SEA-AI/seametrics/blob/develop/seametrics/detection/cocoeval.py) wrapped inside [torchmetrics' mAP metric](https://lightning.ai/docs/torchmetrics/stable/detection/mean_average_precision.html), but using numpy arrays instead of torch tensors.
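For intuition on the kind of box-fit computation this builds on, here is a minimal numpy sketch of pairwise IoU between axis-aligned boxes in `(x1, y1, x2, y2)` format. It is only an illustration; the actual implementation is the modified `cocoeval.py` linked above, and the `box_iou` helper below is hypothetical.

```python
import numpy as np

def box_iou(boxes1: np.ndarray, boxes2: np.ndarray) -> np.ndarray:
    """Pairwise IoU between two sets of boxes in (x1, y1, x2, y2) format.

    Illustrative sketch only, not the metric's actual implementation.
    """
    # Areas of the individual boxes
    area1 = (boxes1[:, 2] - boxes1[:, 0]) * (boxes1[:, 3] - boxes1[:, 1])
    area2 = (boxes2[:, 2] - boxes2[:, 0]) * (boxes2[:, 3] - boxes2[:, 1])

    # Intersection corners, broadcast to an (N, M) grid of box pairs
    lt = np.maximum(boxes1[:, None, :2], boxes2[None, :, :2])  # top-left
    rb = np.minimum(boxes1[:, None, 2:], boxes2[None, :, 2:])  # bottom-right
    wh = np.clip(rb - lt, a_min=0, a_max=None)
    inter = wh[..., 0] * wh[..., 1]

    union = area1[:, None] + area2[None, :] - inter
    return inter / np.clip(union, a_min=1e-9, a_max=None)
```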
## Getting Started
To get started with box-metrics, make sure you have the necessary dependencies installed. This metric relies on the `evaluate` and `seametrics` libraries for metric calculation and integration with FiftyOne datasets.
### Installation
First, ensure you have Python 3.8 or later installed. Then, install the required dependencies using pip:
```sh
pip install evaluate git+https://github.com/SEA-AI/seametrics@develop
```
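To check that everything is set up, you can load the metric from the Hugging Face Hub (the metric script is downloaded on first use):

```python
import evaluate

# Downloads and loads the metric script from the Hub
module = evaluate.load("SEA-AI/box-metrics")
print(module.description)
```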
### Basic Usage
Here's how to quickly evaluate your object detection models using SEA-AI/box-metrics:
```python
import evaluate
import torch

# Prediction box format: x1, y1, x2, y2, conf, label (torchmetrics format).
# Each model maps to a list with one [n, 6] tensor per frame;
# dict values can also be numpy arrays. The random tensors below are
# placeholders standing in for real detections.
predictions = {
    "model1": [torch.rand(8, 6), torch.rand(8, 6)],
    "model2": [torch.rand(8, 6), torch.rand(8, 6)],
}

# Reference box format: label, x1, y1, x2, y2 (torchmetrics format),
# one [n, 5] tensor per frame.
references = [torch.rand(8, 5), torch.rand(8, 5)]

# Load SEA-AI/box-metrics and evaluate
module = evaluate.load("SEA-AI/box-metrics")
module.add_batch(prediction=predictions, reference=references, sequence_name="sequence")
results = module.compute()
print(results)
```
This outputs the box-fit metrics for each model, grouped by sequence:
```
{'sequence': {'model1':
{'iou': '0.6',
'bep': 0.5,
...
}}}
```
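`results` is a nested dictionary keyed first by sequence name and then by model name, so individual values can be read out directly (the key names below simply mirror the example output above):

```python
# Using `results` from the snippet above
model1_metrics = results["sequence"]["model1"]
print(model1_metrics["iou"])  # box-fit IoU for model1 on this sequence
print(model1_metrics["bep"])
```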
## FiftyOne Integration
Integrate SEA-AI/box-metrics with FiftyOne datasets for enhanced analysis and visualization:
```python
import evaluate
import logging
from seametrics.payload.processor import PayloadProcessor

logging.basicConfig(level=logging.WARNING)

# Configure your dataset and model details
processor = PayloadProcessor(
    dataset_name="SENTRY_VIDEOS_DATASET_QA",
    gt_field="ground_truth_det",
    models=["ahoy-IR-b2-whales__XAVIER-AGX-JP46_CNN"],
    sequence_list=["Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39"],
    data_type="thermal",
)

# Evaluate using SEA-AI/box-metrics
module = evaluate.load("SEA-AI/box-metrics")
module.add_payload(processor.payload)
results = module.compute()
print(results)
```
```console
{'Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39': {'ahoy-IR-b2-whales__XAVIER-AGX-JP46_CNN':
{'iou': '0.6',
'bep': 0.5,
...
}}}
```
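Because the output keeps the same sequence/model nesting, it can be flattened into a table for side-by-side comparison. A small sketch, assuming pandas is installed and the inner dictionaries hold scalar metric values:

```python
import pandas as pd

# Flatten {sequence: {model: {metric: value}}} into one row per (sequence, model)
rows = [
    {"sequence": sequence, "model": model, **metrics}
    for sequence, per_model in results.items()
    for model, metrics in per_model.items()
]
print(pd.DataFrame(rows))
```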
## Further References
- **seametrics Library**: Explore the [seametrics GitHub repository](https://github.com/SEA-AI/seametrics/tree/main) for more details on the underlying library.
- **Understanding Metrics**: For a deeper understanding of precision, recall, and other metrics, read [this comprehensive guide](https://www.analyticsvidhya.com/blog/2020/09/precision-recall-machine-learning/).
## Contribution
Your contributions are welcome! If you'd like to improve SEA-AI/box-metrics or add new features, please feel free to fork the repository, make your changes, and submit a pull request.