
Commit da0eb52

Merge pull request #298795 from dominicbetts/release-2504-aio-onvif-events
AIO 2504: Add details of ONVIF events configuration
2 parents 0e7353c + 1b36c97 commit da0eb52

File tree

1 file changed: +136 −6 lines


articles/iot-operations/discover-manage-assets/howto-use-onvif-connector.md

Lines changed: 136 additions & 6 deletions
@@ -5,7 +5,7 @@ author: dominicbetts
ms.author: dobett
ms.service: azure-iot-operations
ms.topic: how-to
-ms.date: 11/06/2024
+ms.date: 04/24/2025

#CustomerIntent: As an industrial edge IT or operations user, I want to configure the connector for ONVIF so that I can read and write camera settings to control an ONVIF compliant camera.
---
@@ -139,15 +139,34 @@ authentication: {

After you create the asset endpoint, the connector for ONVIF runs a discovery process to detect the capabilities of the connected camera. The results of the discovery process are **DiscoveredAsset** and **DiscoveredAssetEndpointProfile** custom resources:

-- A **DiscoveredAsset** custom resource represents one of the [ONVIF services](https://www.onvif.org/profiles/specifications/) such as pan-tilt-zoom (PTZ) that the camera supports.
+- A **DiscoveredAsset** custom resource represents one of the [ONVIF services](https://www.onvif.org/profiles/specifications/) such as pan-tilt-zoom (PTZ) that the camera supports. The output from `kubectl get discoveredassets -n azure-iot-operations` might look like the following example:

  ```output
  NAME                       AGE
  contoso-onvif-aep-device   3m
  contoso-onvif-aep-media    3m
  contoso-onvif-aep-ptz      3m
  ```

-- A **DiscoveredAssetEndpointProfile** custom resource represents a video stream format that the camera exposes.
+- A **DiscoveredAssetEndpointProfile** custom resource represents a video stream format that the camera exposes. The output from `kubectl get discoveredassetendpointprofiles -n azure-iot-operations` might look like the following example:

  ```output
  NAME                                 AGE
  contoso-onvif-aep-mainstream-http    3m
  contoso-onvif-aep-mainstream-rtsp    3m
  contoso-onvif-aep-mainstream-tcp     3m
  contoso-onvif-aep-mainstream-udp     3m
  contoso-onvif-aep-minorstream-http   3m
  contoso-onvif-aep-minorstream-rtsp   3m
  contoso-onvif-aep-minorstream-tcp    3m
  contoso-onvif-aep-minorstream-udp    3m
  ```

Currently, during public preview, you must manually create the **Asset** and **AssetEndpointProfile** custom resources that represent the capabilities of the camera and its video streams.

### Access the PTZ capabilities of the camera

-Use the PTZ capabilities of an ONVIF compliant camera to control its position and orientation.To manually create an asset that represents the PTZ capabilities of the camera discovered previously:
+Use the PTZ capabilities of an ONVIF compliant camera to control its position and orientation. To manually create an asset that represents the PTZ capabilities of the camera discovered previously:

# [Bash](#tab/bash)

@@ -203,7 +222,7 @@ The following snippet shows the bicep file that you used to create the asset. Th

### Access the media capabilities of the camera

-To use the PTZ capabilities of an ONVIF-complian camera, you need a profile token from the camera's media service. To manually create an asset that represents the media capabilities of the camera discovered previously:
+To manually create an asset that represents the media capabilities of the camera discovered previously:

# [Bash](#tab/bash)

@@ -257,6 +276,117 @@ The following snippet shows the bicep file that you used to create the asset. Th

:::code language="bicep" source="~/azure-iot-operations-samples/samples/onvif-connector-bicep/asset-onvif-media.bicep":::

### Receive events from the camera

The camera can send notifications such as motion detected events to the Azure IoT Operations cluster. The connector for ONVIF subscribes to the camera's event service and publishes the events to the Azure IoT Operations MQTT broker.

To find the events that the camera can send, use the following command to view the description of the **DiscoveredAsset** custom resource that represents the camera. The discovered asset that lists the supported events has a `-device` suffix to the asset name:

```bash
kubectl describe discoveredasset your-discovered-asset-device -n azure-iot-operations
```

The output from the previous command includes a `Spec:` section that looks like the following example:

```output
Spec:
  Asset Endpoint Profile Ref:  your-asset-endpoint-profile-aep
  Datasets:
  Default Datasets Configuration:
  Default Events Configuration:
  Default Topic:
    Path:
    Retain:  Never
  Discovery Id:      a00b978b9d971450fa6378900b164736170bd2d790a2061da94a2238adee0d4f
  Documentation Uri:
  Events:
    Event Configuration:
      Event Notifier:   tns1:RuleEngine/CellMotionDetector/Motion
      Last Updated On:  2025-04-23T15:48:21.585502872+00:00
      Name:             tns1:RuleEngine/CellMotionDetector/Motion
      Topic:
        Path:
        Retain:  Never
    Event Configuration:
      Event Notifier:   tns1:RuleEngine/TamperDetector/Tamper
      Last Updated On:  2025-04-23T15:48:21.585506712+00:00
      Name:             tns1:RuleEngine/TamperDetector/Tamper
      Topic:
        Path:
        Retain:  Never
```

During public preview, you must manually add an asset definition based on the information in the discovered asset. To manually create an asset that represents the device capabilities of the camera discovered previously:

# [Bash](#tab/bash)

1. Set the following environment variables:

    ```bash
    SUBSCRIPTION_ID="<YOUR SUBSCRIPTION ID>"
    RESOURCE_GROUP="<YOUR AZURE IOT OPERATIONS RESOURCE GROUP>"
    AEP_NAME="contoso-onvif-aep"
    ```

1. Run the following script:

    ```bash
    # Download the Bicep file
    wget https://raw.githubusercontent.com/Azure-Samples/explore-iot-operations/main/samples/onvif-connector-bicep/asset-onvif-device.bicep -O asset-onvif-device.bicep

    # Find the name of your custom location
    CUSTOM_LOCATION_NAME=$(az iot ops list -g $RESOURCE_GROUP --query "[0].extendedLocation.name" -o tsv)

    # Use the Bicep file to deploy the asset
    az deployment group create --subscription $SUBSCRIPTION_ID --resource-group $RESOURCE_GROUP --template-file asset-onvif-device.bicep --parameters customLocationName=$CUSTOM_LOCATION_NAME aepName=$AEP_NAME
    ```

# [PowerShell](#tab/powershell)

1. Set the following environment variables:

    ```powershell
    $SUBSCRIPTION_ID="<YOUR SUBSCRIPTION ID>"
    $RESOURCE_GROUP="<YOUR AZURE IOT OPERATIONS RESOURCE GROUP>"
    $AEP_NAME="contoso-onvif-aep"
    ```

1. Run the following script:

    ```powershell
    # Download the Bicep file
    Invoke-WebRequest -Uri https://raw.githubusercontent.com/Azure-Samples/explore-iot-operations/main/samples/onvif-connector-bicep/asset-onvif-device.bicep -OutFile asset-onvif-device.bicep

    # Find the name of your custom location
    $CUSTOM_LOCATION_NAME = (az iot ops list -g $RESOURCE_GROUP --query "[0].extendedLocation.name" -o tsv)

    # Use the Bicep file to deploy the asset
    az deployment group create --subscription $SUBSCRIPTION_ID --resource-group $RESOURCE_GROUP --template-file asset-onvif-device.bicep --parameters customLocationName=$CUSTOM_LOCATION_NAME aepName=$AEP_NAME
    ```

---

The following snippet shows the Bicep file that you used to create the asset. The `-device` suffix to the asset name is a required convention to indicate that the asset represents the device capabilities of the camera:

:::code language="bicep" source="~/azure-iot-operations-samples/samples/onvif-connector-bicep/asset-onvif-device.bicep":::

The connector for ONVIF now receives notifications of motion detected events from the camera and publishes them to the `data/camera-device` topic in the MQTT broker:

```output
{
  "name": "motionDetected",
  "eventNotifier": "tns1:RuleEngine/CellMotionDetector/Motion",
  "source": {
    "VideoSourceConfigurationToken": "vsconf",
    "VideoAnalyticsConfigurationToken": "VideoAnalyticsToken",
    "Rule": "MyMotionDetectorRule"
  },
  "data": {
    "IsMotion": "true"
  }
}
```
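A subscriber that consumes these events needs only standard JSON tooling. The following minimal Python sketch (an editorial illustration, not part of the article's samples) parses a motion event payload like the one shown above; note that the connector serializes `IsMotion` as the string `"true"`, not a JSON boolean, so the flag is compared as text:

```python
import json

# Example payload as published by the connector for ONVIF (copied from above).
payload = '''
{
  "name": "motionDetected",
  "eventNotifier": "tns1:RuleEngine/CellMotionDetector/Motion",
  "source": {
    "VideoSourceConfigurationToken": "vsconf",
    "VideoAnalyticsConfigurationToken": "VideoAnalyticsToken",
    "Rule": "MyMotionDetectorRule"
  },
  "data": {
    "IsMotion": "true"
  }
}
'''

def parse_motion_event(raw: str) -> tuple[str, bool]:
    """Return (rule name, motion flag) from a motion event payload."""
    event = json.loads(raw)
    rule = event["source"]["Rule"]
    # "IsMotion" arrives as a string, so compare text values rather than truthiness.
    is_motion = event["data"]["IsMotion"].lower() == "true"
    return rule, is_motion

rule, is_motion = parse_motion_event(payload)
print(rule, is_motion)  # MyMotionDetectorRule True
```

In practice, `raw` would be the bytes of a message received from the MQTT broker on the `data/camera-device` topic.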
## Manage and control the camera

To interact with the ONVIF camera, you can publish MQTT messages that the connector for ONVIF subscribes to. The message format is based on the [ONVIF network interface specifications](https://www.onvif.org/profiles/specifications/).
@@ -274,4 +404,4 @@ To manually create an asset endpoint and asset that enable access to the video s

1. During public preview, first use a tool to discover the RTSP stream URLs of the camera.

-1. Use the RTSP stream URL to create the asset endpoint and asset. To lean more, see [Configure the media connector (preview)](howto-use-media-connector.md).
+1. Use the RTSP stream URL to create the asset endpoint and asset. To learn more, see [Configure the media connector (preview)](howto-use-media-connector.md).
