|
*DetectionMetrics* is a toolkit designed to unify and streamline the evaluation of perception models across different frameworks and datasets. Looking for our published ***DetectionMetrics v1***? Check out all the [relevant links](#v1) below.
|
Now, we're excited to introduce ***DetectionMetrics v2***! While retaining the flexibility of our previous release, *DetectionMetrics* has been redesigned with an expanded focus on image and LiDAR segmentation, and now includes **image object detection** capabilities. As we move forward, *v2* will be the actively maintained version, featuring continued updates and enhancements to keep pace with evolving AI and computer vision technologies.
|
<table style='font-size:100%; margin: auto;'>
  <tr>
    <th>💻 <a href="https://github.com/JdeRobot/DetectionMetrics">Code</a></th>
    <th>🔧 <a href="https://jderobot.github.io/DetectionMetrics/v2/installation">Installation</a></th>
    <th>🧩 <a href="https://jderobot.github.io/DetectionMetrics/v2/compatibility">Compatibility</a></th>
    <th>📖 <a href="https://jderobot.github.io/DetectionMetrics/py_docs/_build/html/index.html">Docs</a></th>
    <th>🖥️ <a href="https://jderobot.github.io/DetectionMetrics/v2/gui">GUI</a></th>
  </tr>
</table>
|
    <tr>
      <td>Object detection</td>
      <td>Image</td>
      <td>COCO, custom formats</td>
      <td>PyTorch</td>
    </tr>
  </tbody>
</table>

Install your deep learning framework of preference in your environment. If you are using LiDAR, Open3D currently requires `torch==2.2*`.
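If you want to double-check that your environment satisfies this constraint before running LiDAR evaluations, a small optional snippet along these lines (not part of the toolkit) can help:

```python
# Optional sanity check for LiDAR workflows: Open3D currently requires torch==2.2.*
import torch

assert torch.__version__.startswith("2.2"), (
    f"Expected torch==2.2.* for Open3D-based LiDAR support, found {torch.__version__}"
)
print(f"PyTorch {torch.__version__} - CUDA available: {torch.cuda.is_available()}")
```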
|
# Usage
*DetectionMetrics* can be used in three ways: through the **interactive GUI**, as a **Python library**, or via the **command-line interface**. The library and the CLI support both segmentation and object detection, while the GUI currently covers object detection only.
|
## Interactive GUI
The easiest way to get started with *DetectionMetrics* is through the GUI (object detection tasks only):

```bash
# From the project root directory
streamlit run app.py
```

The GUI provides:
- **Dataset Viewer**: Browse and visualize your datasets
- **Inference**: Run real-time inference on images
- **Evaluator**: Run a full model evaluation on a dataset

For detailed GUI documentation, see our [GUI guide](https://jderobot.github.io/DetectionMetrics/v2/gui).

## Library

🧑‍🏫 [Image Segmentation Tutorial](https://github.com/JdeRobot/DetectionMetrics/blob/master/examples/tutorial_image_segmentation.ipynb)

🧑‍🏫 [Image Detection Tutorial](https://github.com/JdeRobot/DetectionMetrics/blob/master/examples/tutorial_image_detection.ipynb)

You can check the `examples` directory for further inspiration. If you are using *poetry*, you can run the provided scripts either by activating the created environment with `poetry shell` or by running `poetry run python examples/<some_python_script.py>` directly.
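For a rough idea of what the library workflow looks like, the sketch below pairs a dataset reader with a model wrapper and writes the resulting metrics to disk. The class and argument names here are illustrative assumptions, not the exact API; the notebooks linked above show the real classes and configuration files.

```python
# Illustrative sketch only: class and method names are assumptions,
# not the exact DetectionMetrics API (see the tutorials above).
from detectionmetrics.datasets import CocoImageDetectionDataset  # assumed name
from detectionmetrics.models import TorchImageDetectionModel     # assumed name

# Dataset: images plus annotations in COCO format
dataset = CocoImageDetectionDataset("/path/to/coco/dataset")

# Model: a trained PyTorch model with its ontology and inference configuration
model = TorchImageDetectionModel(
    "/path/to/model.pt",
    ontology_fname="/path/to/ontology.json",
    model_cfg="/path/to/cfg.json",
)

# Evaluate the model on the dataset and store the per-class metrics
results = model.eval(dataset)  # assumed to return a pandas DataFrame
results.to_csv("/path/to/results.csv")
```

The segmentation workflow is analogous, swapping in the segmentation dataset and model classes covered in the segmentation tutorial.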
|
## Command-line interface
*DetectionMetrics* provides a CLI with two commands, `dm_evaluate` and `dm_batch`. Thanks to the configuration in the `pyproject.toml` file, you can simply run `poetry install` from the root directory and use them without explicitly invoking the Python files. More details are provided on the [DetectionMetrics website](https://jderobot.github.io/DetectionMetrics/v2/usage/#command-line-interface).

### Example usage
**Segmentation:**
```bash
dm_evaluate segmentation image \
    --model_format torch \
    --model /path/to/model.pt \
    --model_ontology /path/to/ontology.json \
    --model_cfg /path/to/cfg.json \
    --dataset_format rellis3d \
    --dataset_dir /path/to/dataset \
    --dataset_ontology /path/to/ontology.json \
    --out_fname /path/to/results.csv
```

**Detection:**
```bash
dm_evaluate detection image \
    --model_format torch \
    --model /path/to/model.pt \
    --model_ontology /path/to/ontology.json \
    --model_cfg /path/to/cfg.json \
    --dataset_format coco \
    --dataset_dir /path/to/coco/dataset \
    --out_fname /path/to/results.csv
```
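`dm_evaluate` writes its metrics to the file given in `--out_fname`. Assuming a plain CSV output as in the examples above, it can be loaded directly for further analysis, for instance:

```python
import pandas as pd

# Load the metrics written by dm_evaluate (--out_fname)
results = pd.read_csv("/path/to/results.csv")
print(results.head())
```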
|
<h1 id="v1">DetectionMetrics v1</h1>
|
|