---
title: box-metrics
tags:
- evaluate
- metric
description: >-
  Built upon YOLOv5 IoU functions. Outputs metrics regarding box fit.
sdk: gradio
sdk_version: 3.19.1
app_file: app.py
pinned: false
emoji: 🕵️
---
# SEA-AI/box-metrics
This Hugging Face metric uses `seametrics.detection.PrecisionRecallF1Support` under the hood to compute COCO-like metrics for object detection tasks. It is a [modified cocoeval.py](https://github.com/SEA-AI/seametrics/blob/develop/seametrics/detection/cocoeval.py) wrapped inside [torchmetrics' mAP metric](https://lightning.ai/docs/torchmetrics/stable/detection/mean_average_precision.html), but operating on numpy arrays instead of torch tensors.
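For context, this is roughly what the plain torchmetrics metric linked above consumes natively (torch tensors in a list of dicts); the box values below are made up purely for illustration, and running it requires `torchmetrics` plus a COCO evaluation backend such as `pycocotools`:
```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# One image: a single predicted box with score/label, and one ground-truth box.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 60.0, 80.0]]),  # x1, y1, x2, y2
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
target = [{
    "boxes": torch.tensor([[12.0, 22.0, 58.0, 78.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision(iou_type="bbox")
metric.update(preds, target)
print(metric.compute())  # dict with map, map_50, map_75, ...
```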
## Getting Started
To get started with box-metrics, make sure you have the necessary dependencies installed. This metric relies on the `evaluate` and `seametrics` libraries for metric calculation and integration with FiftyOne datasets.
### Installation
First, ensure you have Python 3.8 or later installed. Then, install the required dependencies using pip:
```sh
pip install evaluate git+https://github.com/SEA-AI/seametrics@develop
```
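To verify that the dependencies are importable (assuming the packages expose the `evaluate` and `seametrics` module names used below), a quick sanity check could be:
```sh
python -c "import evaluate, seametrics; print(evaluate.__version__)"
```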
### Basic Usage
Here's how to quickly evaluate your object detection models using SEA-AI/box-metrics:
```python
import evaluate
import torch

# Define your predictions and references (dict values can also be numpy arrays).
# Each list entry is one frame; the box values below are illustrative placeholders.
predictions = {  # prediction tensors have shape [n, 6]
    "model1": [torch.tensor([[10.0, 20.0, 60.0, 80.0, 0.9, 0.0]]),
               torch.tensor([[15.0, 25.0, 55.0, 75.0, 0.8, 0.0]])],
    "model2": [torch.tensor([[11.0, 21.0, 59.0, 79.0, 0.7, 0.0]]),
               torch.tensor([[14.0, 24.0, 56.0, 76.0, 0.6, 0.0]])],
}
# prediction box format: x1, y1, x2, y2, conf, label (torchmetrics format)
references = [torch.tensor([[0.0, 12.0, 22.0, 58.0, 78.0]]),  # shape [n, 5] per frame
              torch.tensor([[0.0, 16.0, 26.0, 54.0, 74.0]])]
# reference box format: label, x1, y1, x2, y2 (torchmetrics format)

# Load SEA-AI/box-metrics and evaluate
module = evaluate.load("SEA-AI/box-metrics")
module.add_batch(prediction=predictions, reference=references, sequence_name="sequence")
results = module.compute()
print(results)
```
This will output the evaluation metrics for each of your detection models.
```console
{'sequence': {'model1':
{'iou': '0.6',
'bep': 0.5,
...
}}}
```
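If your detector emits COCO-style `[x, y, w, h]` boxes with separate confidences and labels, a small helper along these lines (hypothetical, not part of the library) can assemble the `[n, 6]` prediction and `[n, 5]` reference tensors expected above:
```python
import torch

def to_prediction_tensor(boxes_xywh, scores, labels):
    """Build an [n, 6] tensor (x1, y1, x2, y2, conf, label) from COCO-style boxes."""
    boxes = torch.as_tensor(boxes_xywh, dtype=torch.float32)
    xyxy = torch.cat([boxes[:, :2], boxes[:, :2] + boxes[:, 2:]], dim=1)
    conf = torch.as_tensor(scores, dtype=torch.float32).unsqueeze(1)
    lbl = torch.as_tensor(labels, dtype=torch.float32).unsqueeze(1)
    return torch.cat([xyxy, conf, lbl], dim=1)

def to_reference_tensor(boxes_xywh, labels):
    """Build an [n, 5] tensor (label, x1, y1, x2, y2) from COCO-style boxes."""
    boxes = torch.as_tensor(boxes_xywh, dtype=torch.float32)
    xyxy = torch.cat([boxes[:, :2], boxes[:, :2] + boxes[:, 2:]], dim=1)
    lbl = torch.as_tensor(labels, dtype=torch.float32).unsqueeze(1)
    return torch.cat([lbl, xyxy], dim=1)

# Example: one frame with one detection and one ground-truth box
pred = to_prediction_tensor([[10, 20, 50, 60]], [0.87], [0])
ref = to_reference_tensor([[12, 22, 46, 58]], [0])
print(pred.shape, ref.shape)  # torch.Size([1, 6]) torch.Size([1, 5])
```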
## FiftyOne Integration
Integrate SEA-AI/box-metrics with FiftyOne datasets for enhanced analysis and visualization:
```python
import evaluate
import logging
from seametrics.payload.processor import PayloadProcessor
logging.basicConfig(level=logging.WARNING)
# Configure your dataset and model details
processor = PayloadProcessor(
dataset_name="SENTRY_VIDEOS_DATASET_QA",
gt_field="ground_truth_det",
models=["ahoy-IR-b2-whales__XAVIER-AGX-JP46_CNN"],
sequence_list=["Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39"],
data_type="thermal",
)
# Evaluate using SEA-AI/box-metrics
module = evaluate.load("SEA-AI/box-metrics")
module.add_payload(processor.payload)
results = module.compute()
print(results)
```
```console
{'Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39': {'ahoy-IR-b2-whales__XAVIER-AGX-JP46_CNN':
{'iou': '0.6',
'bep': 0.5,
...
}}}
```
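Because `compute()` returns a nested dict keyed by sequence and then by model, it can be convenient to flatten it for inspection, for example with pandas (a sketch that assumes the structure shown above and reuses `results` from the previous snippet):
```python
import pandas as pd

# results = {sequence_name: {model_name: {metric_name: value, ...}}}
rows = [
    {"sequence": seq, "model": model, **metrics}
    for seq, per_model in results.items()
    for model, metrics in per_model.items()
]
df = pd.DataFrame(rows)
print(df)
```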
## Further References
- **seametrics Library**: Explore the [seametrics GitHub repository](https://github.com/SEA-AI/seametrics/tree/main) for more details on the underlying library.
- **Understanding Metrics**: For a deeper understanding of precision, recall, and other metrics, read [this comprehensive guide](https://www.analyticsvidhya.com/blog/2020/09/precision-recall-machine-learning/).
## Contribution
Your contributions are welcome! If you'd like to improve SEA-AI/box-metrics or add new features, please feel free to fork the repository, make your changes, and submit a pull request.