|
# Sample of Using Batch
|
## Overview

This sample shows how to bind and evaluate batches of inputs in Windows ML (WinML).

## Requirements

- [Visual Studio 2017 - 15.4 or higher](https://developer.microsoft.com/en-us/windows/downloads)
- [Windows 10 - Build 18362 or higher](https://www.microsoft.com/en-us/software-download/windowsinsiderpreviewiso)
- [Windows SDK - Build 18362 or higher](https://www.microsoft.com/en-us/software-download/windowsinsiderpreviewSDK)
- Visual Studio Extension for C++/WinRT

  Do the following to add the C++/WinRT extension in Visual Studio:
  1. Go to **Tools > Extensions and Updates**.
  2. Select **Online** in the left pane and search for "WinRT" using the search box.
  3. Select the **C++/WinRT** extension, click **Download**, and close Visual Studio. The extension should install automatically.
  4. When the extension has finished installing, re-open Visual Studio.


## Build the sample

1. If you download the samples ZIP, be sure to unzip the entire archive, not just the folder with the sample you want to build.
2. Start Microsoft Visual Studio 2017 and select **File > Open > Project/Solution**.
3. Starting in the folder where you unzipped the samples, go to the **Samples** subfolder, then the subfolder for this specific sample (**Samples\BatchSupport**). Double-click the Visual Studio solution file (BatchSupport.sln).
4. Confirm that the project targets the Windows SDK version that you installed (for example, 18362). You can do this by right-clicking the project in the **Solution Explorer**, selecting **Properties**, and modifying the **Windows SDK Version**.
5. Confirm that you are set for the right configuration and platform (for example: Debug, x64).
6. Build the solution (**Ctrl+Shift+B**).

## Run the sample

1. Open a Command Prompt (in the Windows 10 search bar, type **cmd** and press **Enter**).
2. Change the current folder to the folder containing the built EXE (`cd <path-to-exe>`).
3. Run the executable as shown below, choosing a batch-dimension mode (`fixedBatchSize` or `freeBatchSize`) and an input type (`TensorFloat` or `VideoFrame`):
   ```
   BatchSupport.exe [fixedBatchSize|freeBatchSize] [TensorFloat|VideoFrame]
   ```
4. You should see output similar to the following:
   ```
   Loading modelfile 'E:\xianz\Windows-Machine-Learning\Samples\BatchSupport\x64\Debug\SqueezeNet.onnx' on the CPU
   model file loaded in 906 ticks
   Binding...
   Running the model...
   model run took 31 ticks
   Result for No.0 input
   tench, Tinca tinca with confidence of 0.738503
   Result for No.1 input
   tabby, tabby cat with confidence of 0.931461
   Result for No.2 input
   tench, Tinca tinca with confidence of 0.738503
   ```
|
## Prepare the Model

1. Download the [WinML Dashboard](https://github.com/microsoft/Windows-Machine-Learning/releases/tag/v0.6.1).
2. Change the batch dimension of the model's input and output to either a fixed number or -1 (a free dimension). You can verify the change with the sketch after this list.

   <img src='./forReadMe/fixBatchSize.png' width=400 /> <img src='./forReadMe/freeBatchSize.png' width=400 />
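
If you want to confirm the edit, you can load the saved model and read the batch dimension back from its input shape. This is a minimal standalone sketch (not part of the sample); the model path is a placeholder, and a free dimension is reported as -1:

```C++
#include <cstdio>
#include <winrt/Windows.AI.MachineLearning.h>
#include <winrt/Windows.Foundation.Collections.h>

using namespace winrt;
using namespace winrt::Windows::AI::MachineLearning;

int main()
{
    init_apartment();

    // Placeholder path: point this at the model you edited in WinML Dashboard
    LearningModel model = LearningModel::LoadFromFilePath(L"SqueezeNet.onnx");

    // The first input feature is a tensor descriptor; its first shape element
    // is the batch dimension (-1 means a free dimension)
    auto inputDescriptor = model.InputFeatures().First().Current().as<TensorFeatureDescriptor>();
    int64_t batchDimension = inputDescriptor.Shape().GetAt(0);
    printf("Batch dimension: %lld\n", static_cast<long long>(batchDimension));
    return 0;
}
```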

## Create Session and Bind Inputs

Take binding a batch of VideoFrame inputs as an example:

### 1. Create a Session

1.1 For a fixed batch dimension:
   ```C++
   // Create a session and binding
   LearningModelDeviceKind deviceKind = LearningModelDeviceKind::Cpu;
   LearningModelSession session(model, LearningModelDevice(deviceKind));
   LearningModelBinding binding(session);
   ```

1.2 For a free batch dimension:
   ```C++
   // Create a session and binding
   LearningModelDeviceKind deviceKind = LearningModelDeviceKind::Cpu;
   LearningModelSessionOptions options;
   if ("freeBatchSize" == modelType) {
       // If the model has a free batch dimension, override the free dimension with BATCH_SIZE
       options.BatchSizeOverride(static_cast<uint32_t>(BATCH_SIZE));
   }
   LearningModelSession session(model, LearningModelDevice(deviceKind), options);
   LearningModelBinding binding(session);
   ```

### 2. Bind Inputs
```C++
    // Create a batch of VideoFrame inputs
    std::vector<VideoFrame> inputVideoFrames = {};
    for (hstring imageName : imageNames) {
        auto imagePath = static_cast<hstring>(GetModulePath().c_str()) + imageName;
        auto imageFrame = LoadImageFile(imagePath);
        inputVideoFrames.emplace_back(imageFrame);
    }

    // Wrap the batch in a WinRT vector so it can be bound as a single input
    auto videoFrames = winrt::single_threaded_vector(std::move(inputVideoFrames));

    // Bind the whole batch to the model's input feature
    binding.Bind(inputFeatureDescriptor.Current().Name(), videoFrames);
```
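
The sample can also take `TensorFloat` input (the second command-line argument). With tensors, one common approach is to carry the whole batch in a single tensor whose first dimension is the batch size. Below is a minimal sketch of that approach, not the sample's exact code; it assumes SqueezeNet's 3 x 224 x 224 input and a hypothetical `LoadImageAsFloatArray` helper that returns one image as a flat NCHW float array:

```C++
    // Concatenate the per-image float data into one buffer of
    // BATCH_SIZE x 3 x 224 x 224 values (NCHW layout)
    std::vector<float> batchData;
    for (hstring imageName : imageNames) {
        std::vector<float> imageData = LoadImageAsFloatArray(imageName); // hypothetical helper
        batchData.insert(batchData.end(), imageData.begin(), imageData.end());
    }

    // Create a single TensorFloat whose first dimension is the batch size and bind it
    std::vector<int64_t> inputShape = {BATCH_SIZE, 3, 224, 224};
    TensorFloat inputTensor = TensorFloat::CreateFromArray(inputShape, batchData);
    binding.Bind(inputFeatureDescriptor.Current().Name(), inputTensor);
```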

### 3. Bind Outputs (optional)

The sample does not bind the output, but you can bind it yourself as shown below:
```C++
    auto outputShape = std::vector<int64_t>{BATCH_SIZE, 1000, 1, 1};
    auto outputValue = TensorFloat::Create(outputShape);
    std::wstring outputDataBindingName =
        std::wstring(model.OutputFeatures().First().Current().Name());
    binding.Bind(outputDataBindingName, outputValue);
    // After evaluation, outputValue holds the scores for every input in the batch
    SampleHelper::PrintResults(outputValue.GetAsVectorView()); // Print results
```
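
### 4. Evaluate the Batch

Whether or not an output was pre-bound, the whole batch is evaluated with a single call, and the results for every input come back together. A minimal sketch using the `session`, `binding`, and `model` from above:

```C++
    // Run the model once for the whole batch
    LearningModelEvaluationResult result = session.Evaluate(binding, L"batchRun");

    // Read the output back as a flat vector of floats; for SqueezeNet this holds
    // BATCH_SIZE x 1000 scores, with each consecutive block of 1000 belonging to one input
    hstring outputName = model.OutputFeatures().First().Current().Name();
    TensorFloat outputTensor = result.Outputs().Lookup(outputName).as<TensorFloat>();
    auto scores = outputTensor.GetAsVectorView();
```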

## Deprecation Notice

This sample is currently deprecated. To learn how to perform batched inference with Windows ML, please refer to the [Batching Sample](https://github.com/microsoft/Windows-Machine-Learning/tree/master/Samples/WinMLSamplesGallery/WinMLSamplesGallery/Samples/Batching) in the Windows ML Samples Gallery.