Commit 09813ce

Merge pull request #171 from Unity-Technologies/rename_to_computer_vision
Docs update and polish, a couple of references to Unity Computer Vision
2 parents fa3f4ed + 6de3a02

14 files changed: +129 −108 lines

README.md

Lines changed: 11 additions & 5 deletions
````diff
@@ -9,7 +9,7 @@
 > com.unity.perception is in active development. Its features and API are subject to significant change as development progresses.
 
 
-# Perception
+# Perception Package (Unity Computer Vision)
 
 The Perception package provides a toolkit for generating large-scale datasets for perception-based machine learning training and validation. It is focused on a handful of camera-based use cases for now and will ultimately expand to other forms of sensors and machine learning tasks.
 
````
````diff
@@ -27,9 +27,9 @@ In-depth documentation on individual components of the package.
 |Feature|Description|
 |---|---|
 |[Labeling](com.unity.perception/Documentation~/GroundTruthLabeling.md)|A component that marks a GameObject and its descendants with a set of labels|
-|[LabelConfig](com.unity.perception/Documentation~/GroundTruthLabeling.md#label-config)|An asset that defines a taxonomy of labels for ground truth generation|
+|[Label Config](com.unity.perception/Documentation~/GroundTruthLabeling.md#label-config)|An asset that defines a taxonomy of labels for ground truth generation|
 |[Perception Camera](com.unity.perception/Documentation~/PerceptionCamera.md)|Captures RGB images and ground truth from a [Camera](https://docs.unity3d.com/Manual/class-Camera.html).|
-|[DatasetCapture](com.unity.perception/Documentation~/DatasetCapture.md)|Ensures sensors are triggered at proper rates and accepts data for the JSON dataset.|
+|[Dataset Capture](com.unity.perception/Documentation~/DatasetCapture.md)|Ensures sensors are triggered at proper rates and accepts data for the JSON dataset.|
 |[Randomization (Experimental)](com.unity.perception/Documentation~/Randomization/Index.md)|The Randomization tool set lets you integrate domain randomization principles into your simulation.|
 
 ## Example Projects
````
````diff
@@ -43,7 +43,7 @@ In-depth documentation on individual components of the package.
 ### Unity Simulation Smart Camera example
 <img src="com.unity.perception/Documentation~/images/smartcamera.png"/>
 
-The [Unity Simulation Smart Camera Example](https://github.com/Unity-Technologies/Unity-Simulation-Smart-Camera-Outdoor) illustrates how Perception could be used in a smart city or autonomous vehicle simulation. You can generate datasets locally or at scale in [Unity Simulation](https://unity.com/products/unity-simulation).
+The [Unity Simulation Smart Camera Example](https://github.com/Unity-Technologies/Unity-Simulation-Smart-Camera-Outdoor) illustrates how the Perception toolset could be used in a smart city or autonomous vehicle simulation. You can generate datasets locally or at scale in [Unity Simulation](https://unity.com/products/unity-simulation).
 
 ## Local development
 The repository includes two projects for local development in the `TestProjects` folder, one set up for HDRP and the other for URP.
````
````diff
@@ -59,10 +59,16 @@ For closest standards conformity and best experience overall, JetBrains Rider or
 ## License
 * [License](com.unity.perception/LICENSE.md)
 
+## Support
+
+For general questions or concerns please contact the Computer Vision team at [email protected].
+
+For feedback, bugs, or other issues please file a GitHub issue and the Computer Vision team will investigate the issue as soon as possible.
+
 ## Citation
 If you find this package useful, consider citing it using:
 ```
-@misc{com.unity.perception2020,
+@misc{com.unity.perception2021,
 title={Unity {P}erception Package},
 author={{Unity Technologies}},
 howpublished={\url{https://github.com/Unity-Technologies/com.unity.perception}},
````

com.unity.perception/Documentation~/DatasetCapture.md

Lines changed: 5 additions & 3 deletions
````diff
@@ -4,12 +4,14 @@
 
 
 ## Sensor scheduling
-While sensors are registered, `DatasetCapture` ensures that frame timing is deterministic and run at the appropriate simulation times to let each sensor run at its own rate.
+While sensors are registered, `DatasetCapture` ensures that frame timing is deterministic and that frames run at the appropriate simulation times to let each sensor render and capture at its own rate.
 
-Using [Time.CaptureDeltaTime](https://docs.unity3d.com/ScriptReference/Time-captureDeltaTime.html), it also decouples wall clock time from simulation time, allowing the simulation to run as fast as possible.
+Using [Time.captureDeltaTime](https://docs.unity3d.com/ScriptReference/Time-captureDeltaTime.html), it also decouples wall clock time from simulation time, allowing the simulation to run as fast as possible.
 
 ## Custom sensors
-You can register custom sensors using `DatasetCapture.RegisterSensor()`. The `period` you pass in at registration time determines how often (in simulation time) frames should be scheduled for the sensor to run. The sensor implementation then checks `ShouldCaptureThisFrame` on the returned `SensorHandle` each frame to determine whether it is time for the sensor to perform a capture. `SensorHandle.ReportCapture` should then be called in each of these frames to report the state of the sensor to populate the dataset.
+You can register custom sensors using `DatasetCapture.RegisterSensor()`. The `simulationDeltaTime` you pass in at registration time is used as `Time.captureDeltaTime` and determines how often (in simulation time) frames should be simulated for the sensor to run. Together with `framesBetweenCaptures`, it determines the exact times at which the sensor captures the simulated frames. Decoupling the simulation delta time from the capture frequency lets you render frames in between captures; if no in-between frames are desired, set `framesBetweenCaptures` to 0. When it is time to capture, the `ShouldCaptureThisFrame` check of the `SensorHandle` returns true. `SensorHandle.ReportCapture` should then be called in each of these frames to report the state of the sensor to populate the dataset.
+
+`Time.captureDeltaTime` is set at every frame so that simulation lands precisely on the next time at which any sensor requires a frame, including in multi-sensor simulations. For instance, if one sensor has a `simulationDeltaTime` of 2 and another of 3, the first five values of `Time.captureDeltaTime` will be 2, 1, 1, 2, and 2, meaning simulation happens at the timestamps 0, 2, 3, 4, 6, and 8.
 
 ## Custom annotations and metrics
 In addition to the common annotations and metrics produced by [PerceptionCamera](PerceptionCamera.md), scripts can produce their own via `DatasetCapture`. You must first register annotation and metric definitions using `DatasetCapture.RegisterAnnotationDefinition()` or `DatasetCapture.RegisterMetricDefinition()`. These return `AnnotationDefinition` and `MetricDefinition` instances which you can then use to report values during runtime.
````
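To make the custom sensor and custom metric flow above concrete, here is a minimal C# sketch. `DatasetCapture.RegisterSensor()`, `simulationDeltaTime`, `framesBetweenCaptures`, `ShouldCaptureThisFrame`, `SensorHandle.ReportCapture`, and `DatasetCapture.RegisterMetricDefinition()` are named in the documentation above; the ego registration, the exact `RegisterSensor` overload, and the `ReportCapture`/`ReportMetric` signatures are assumptions that may differ between package versions, so check them against the installed API.

```csharp
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

// Minimal sketch of a custom sensor plus a custom metric, per the docs above.
// The exact RegisterSensor overload and the ReportCapture/ReportMetric
// signatures are assumptions; verify against your installed package version.
public class CustomSensorSketch : MonoBehaviour
{
    SensorHandle m_Sensor;
    MetricDefinition m_CaptureCountMetric;
    int m_CaptureCount;

    void Start()
    {
        var ego = DatasetCapture.RegisterEgo("Main ego"); // assumed helper
        m_Sensor = DatasetCapture.RegisterSensor(
            ego,
            "camera",                      // modality
            "My custom sensor",            // description
            0,                             // firstCaptureFrame
            CaptureTriggerMode.Scheduled,  // scheduled captures
            2f,                            // simulationDeltaTime: simulate every 2 s
            0);                            // framesBetweenCaptures: capture every frame

        // Register a metric definition once; report values during the run.
        m_CaptureCountMetric = DatasetCapture.RegisterMetricDefinition(
            "capture_count", "Running number of captures reported by this sensor");
    }

    void Update()
    {
        // Only report on frames DatasetCapture has scheduled for this sensor.
        if (!m_Sensor.ShouldCaptureThisFrame)
            return;

        m_CaptureCount++;
        // Spatial data here is a placeholder (identity poses).
        m_Sensor.ReportCapture(
            $"captures/frame_{Time.frameCount}.png",
            new SensorSpatialData(Pose.identity, Pose.identity, null, null));
        DatasetCapture.ReportMetric(m_CaptureCountMetric, new[] { m_CaptureCount });
    }
}
```

With a `simulationDeltaTime` of 2 and `framesBetweenCaptures` of 0, this sensor would capture at timestamps 0, 2, 4, and so on, matching the scheduling example above.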

com.unity.perception/Documentation~/PerceptionCamera.md

Lines changed: 27 additions & 12 deletions
````diff
@@ -1,43 +1,58 @@
-# The Perception Camera component
+# The Perception Camera Component
 The Perception Camera component ensures that the [Camera](https://docs.unity3d.com/Manual/class-Camera.html) runs at deterministic rates. It also ensures that the Camera uses [DatasetCapture](DatasetCapture.md) to capture RGB and other Camera-related ground truth in the [JSON dataset](Schema/Synthetic_Dataset_Schema.md). You can use the Perception Camera component on the High Definition Render Pipeline (HDRP) or the Universal Render Pipeline (URP).
 
-![Perception Camera component](images/PerceptionCameraFinished.png)
-<br/>_Perception Camera component_
+<p align="center">
+<img src="images/PerceptionCameraFinished.png" width="600"/>
+<br><i>The Inspector view of the Perception Camera component</i>
+</p>
 
 ## Properties
 | Property: | Function: |
 |--|--|
 | Description | A description of the Camera to be registered in the JSON dataset. |
-| Period | The amount of simulation time in seconds between frames for this Camera. For more information on sensor scheduling, see [DatasetCapture](DatasetCapture.md). |
-| Start Time | The simulation time at which to run the first frame. This time offsets the period, which allows multiple Cameras to run at the correct times relative to each other. |
-| Capture Rgb Images | When you enable this property, Unity captures RGB images as PNG files in the dataset each frame. |
+| Show Visualizations | Display real-time visualizations for labelers that are currently active on this camera. |
+| Capture RGB Images | When you enable this property, Unity captures RGB images as PNG files in the dataset each frame. |
+| Capture Trigger Mode | The method of triggering captures for this camera. In `Scheduled` mode, captures happen automatically based on a start frame and frame delta time. In `Manual` mode, captures should be triggered manually through calling the `RequestCapture` method of `PerceptionCamera`. |
 | Camera Labelers | A list of labelers that generate data derived from this Camera. |
 
+### Properties for Scheduled Capture Mode
+| Property: | Function: |
+|--|--|
+| Simulation Delta Time | The simulation frame time (seconds) for this camera. E.g. 0.0166 translates to 60 frames per second. This will be used as Unity's `Time.captureDeltaTime`, causing a fixed number of frames to be generated for each second of elapsed simulation time regardless of the capabilities of the underlying hardware. For more information on sensor scheduling, see [DatasetCapture](DatasetCapture.md). |
+| First Capture Frame | Frame number at which this camera starts capturing. |
+| Frames Between Captures | The number of frames to simulate and render between the camera's scheduled captures. Setting this to 0 makes the camera capture every frame. |
+
+### Properties for Manual Capture Mode
+| Property: | Function: |
+|--|--|
+| Affect Simulation Timing | Have this camera affect simulation timings (similar to a scheduled camera) by requesting a specific frame delta time. Enabling this option will let you set the `Simulation Delta Time` property described above. |
+
+
 ## Camera labelers
 Camera labelers capture data related to the Camera in the JSON dataset. You can use this data to train models and for dataset statistics. The Perception package provides several Camera labelers, and you can derive from the CameraLabeler class to define more labelers.
 
-### SemanticSegmentationLabeler
+### Semantic Segmentation Labeler
 ![Example semantic segmentation image from a modified SynthDet project](images/semantic_segmentation.png)
 <br/>_Example semantic segmentation image from a modified SynthDet project_
 
 The SemanticSegmentationLabeler generates a 2D RGB image with the attached Camera. Unity draws objects in the color you associate with the label in the SemanticSegmentationLabelingConfiguration. If Unity can't find a label for an object, it draws it in black.
 
-### InstanceSegmentationLabeler
+### Instance Segmentation Labeler
 
 The instance segmentation labeler generates a 2D RGB image with the attached camera. Unity draws each instance of a labeled
 object with a unique color.
 
-### BoundingBox2DLabeler
+### Bounding Box 2D Labeler
 ![Example bounding box visualization from SynthDet generated by the `SynthDet_Statistics` Jupyter notebook](images/bounding_boxes.png)
 <br/>_Example bounding box visualization from SynthDet generated by the `SynthDet_Statistics` Jupyter notebook_
 
 The BoundingBox2DLabeler produces 2D bounding boxes for each visible object with a label you define in the IdLabelConfig. Unity calculates bounding boxes using the rendered image, so it only excludes occluded or out-of-frame portions of the objects.
 
 ### Bounding Box 3D Ground Truth Labeler
 
-The Bounding Box 3D Ground Truth Labeler prouces 3D ground truth bounding boxes for each labeled game object in the scene. Unlike the 2D bounding boxes, 3D bounding boxes are calculated from the labeled meshes in the scene and all objects (independent of their occlusion state) are recorded.
+The Bounding Box 3D Ground Truth Labeler produces 3D ground truth bounding boxes for each labeled game object in the scene. Unlike the 2D bounding boxes, 3D bounding boxes are calculated from the labeled meshes in the scene and all objects (independent of their occlusion state) are recorded.
 
-### ObjectCountLabeler
+### Object Count Labeler
 
 ```
 {
````
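The "Camera labelers" paragraph in the hunk above notes that you can derive from the CameraLabeler class to define new labelers. As a rough sketch only: the base class is real, but the member names below (`description`, `Setup`, `OnBeginRendering`) are assumptions that have changed across package releases, so verify them against the installed version.

```csharp
using UnityEngine.Perception.GroundTruth;

// Rough skeleton of a user-defined labeler derived from CameraLabeler.
// Member names are assumptions drawn from typical versions of the base class;
// depending on the version, you may also need to override members such as
// supportsVisualization.
public class MyCustomLabeler : CameraLabeler
{
    public override string description => "Reports custom per-frame ground truth";

    protected override void Setup()
    {
        // Register annotation/metric definitions here (see DatasetCapture docs).
    }

    protected override void OnBeginRendering()
    {
        // Gather and report ground truth for this frame.
    }
}
```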
````diff
@@ -50,7 +65,7 @@ _Example object count for a single label_
 
 The ObjectCountLabeler records object counts for each label you define in the IdLabelConfig. Unity only records objects that have at least one visible pixel in the Camera frame.
 
-### RenderedObjectInfoLabeler
+### Rendered Object Info Labeler
 ```
 {
 "label_id": 24,
````
