
Commit ab698b3

Merge pull request #164519 from avkewalr-msft/patch-50
Updated as per feedback fixes
2 parents cc29b37 + fc473e5 commit ab698b3

1 file changed: +5 -24 lines changed


articles/azure-video-analyzer/video-analyzer-docs/analyze-live-video-custom-vision.md

Lines changed: 5 additions & 24 deletions
@@ -108,24 +108,9 @@ After you're finished, you can export the model to a Docker container by using t
  2. `docker image ls`
 
     This command checks if the new image is in your local registry.
- 3. `docker run -p 127.0.0.1:80:80 -d cvtruck`
-
-    This command publishes the Docker container's exposed port (80) on port 80 of your local machine.
- 4. `docker container ls`
-
-    This command checks the port mappings and confirms that the Docker container is running on your machine. The output should look something like this:
-
-    ```
-    CONTAINER ID   IMAGE     COMMAND                  CREATED        STATUS          PORTS                  NAMES
-    8b7505398367   cvtruck   "/bin/sh -c 'python …"   13 hours ago   Up 25 seconds   127.0.0.1:80->80/tcp   practical_cohen
-    ```
- 5. `curl -X POST http://127.0.0.1:80/score -F imageData=@<path to any image file that has the toy delivery truck in it>`
-
-    This command tests the container on the local machine. If the image contains the same delivery truck that the model was trained on, the output should look like the following example, which indicates that the delivery truck was detected with 90.12% probability.
-
-    ```
-    {"created":"2020-03-20T07:10:47.827673","id":"","iteration":"","predictions":[{"boundingBox":{"height":0.66167289,"left":-0.03923762,"top":0.12781593,"width":0.70003178},"probability":0.90128148,"tagId":0,"tagName":"delivery truck"},{"boundingBox":{"height":0.63733053,"left":0.25220079,"top":0.0876643,"width":0.53331227},"probability":0.59745145,"tagId":0,"tagName":"delivery truck"}],"project":""}
-    ```
+
+ ## Set up your development environment
+ [!INCLUDE [setup development environment](./includes/set-up-dev-environment/csharp/csharp-set-up-dev-env.md)]
 
  ## Examine the sample files
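The local smoke test removed in this hunk (steps 3 through 5) posted an image to the exported container's `/score` endpoint with curl. For readers who prefer scripting that check, here is a minimal Python sketch of the same test. It is not part of the article: it assumes the container is running and mapped to `127.0.0.1:80` as in the removed `docker run` step, and it uses the third-party `requests` package.

```python
# Minimal local test of the exported Custom Vision container, mirroring the
# removed curl step: POST an image to /score and print the JSON predictions.
# Assumes the container was started with: docker run -p 127.0.0.1:80:80 -d cvtruck
import json
import sys

import requests  # third-party: pip install requests


def score_image(image_path: str, endpoint: str = "http://127.0.0.1:80/score") -> dict:
    """Send one image to the container's /score endpoint and return the parsed response."""
    with open(image_path, "rb") as image_file:
        # The removed curl example used a multipart form field named imageData.
        response = requests.post(endpoint, files={"imageData": image_file})
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Usage: python score_local.py <path to an image containing the toy delivery truck>
    print(json.dumps(score_image(sys.argv[1]), indent=2))
```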

@@ -184,15 +169,11 @@ After you're finished, you can export the model to a Docker container by using t
  - A module named `rtspsim`, which simulates an RTSP server that acts as the source of a live video feed.
  - A module named `cv`, which as the name suggests is the Custom Vision toy truck detection model that applies Custom Vision to the images and returns multiple tag types. (Our model was trained on only one tag, delivery truck.)
 
- ## Prepare for monitoring events
-
- Right-click the ava-sample-device, and select **Start Monitoring Built-in Event Endpoint**. You need this step to monitor the IoT Hub events in the **OUTPUT** window of Visual Studio Code.
 
- ![Screenshot that shows Start Monitoring Built-in Event Endpoint.](./media/custom-vision/start-monitoring.png)
 
  ## Run the sample program
 
- If you open the topology for this tutorial in a browser, you'll see that the value of `inferencingUrl` has been set to `http://cv/image`. This setting means the inference server will return results after detecting toy trucks, if any, in the live video.
+ If you open the topology for this tutorial in a browser, you'll see that the value of `inferencingUrl` has been set to `http://cv/score`. This setting means the inference server will return results after detecting toy trucks, if any, in the live video.
 
  1. In Visual Studio Code, open the **Extensions** tab (or select **Ctrl+Shift+X**) and search for Azure IoT Hub.
  2. Right-click and select **Extension Settings**.
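The corrected `inferencingUrl` now points at the `/score` route the container actually serves, the same route exercised by the removed curl test. As a rough illustration of how the prediction payload from that endpoint could be consumed, the sketch below filters the sample response shown earlier for high-confidence delivery-truck detections. The field names come from that sample; the 0.8 threshold and the helper name are illustrative and not part of the article.

```python
# Sketch: filter a /score prediction payload for confident delivery-truck detections.
# Field names (predictions, probability, tagName, boundingBox) are taken from the
# sample response in the removed local-test step; the threshold is illustrative.
from typing import Any, Dict, List


def confident_trucks(payload: Dict[str, Any], threshold: float = 0.8) -> List[Dict[str, Any]]:
    """Return predictions tagged 'delivery truck' at or above the given probability."""
    return [
        prediction
        for prediction in payload.get("predictions", [])
        if prediction["tagName"] == "delivery truck"
        and prediction["probability"] >= threshold
    ]


# The two predictions from the sample response above: 0.901 passes, 0.597 does not.
sample = {
    "predictions": [
        {"probability": 0.90128148, "tagName": "delivery truck",
         "boundingBox": {"height": 0.66167289, "left": -0.03923762,
                         "top": 0.12781593, "width": 0.70003178}},
        {"probability": 0.59745145, "tagName": "delivery truck",
         "boundingBox": {"height": 0.63733053, "left": 0.25220079,
                         "top": 0.0876643, "width": 0.53331227}},
    ]
}
print(confident_trucks(sample))  # prints only the 0.901 detection
```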
@@ -219,7 +200,7 @@ If you open the topology for this tutorial in a browser, you'll see that the val
        "parameters": [
          {
            "name": "inferencingUrl",
-           "value": "http://cv/image"
+           "value": "http://cv/score"
          },
          {
            "name": "rtspUrl",
