while (!isProcessorRunning.get()); //Wait for Change Feed processor start
```
---
```"SampleHost_1"``` is the name of the Change Feed processor worker. ```changeFeedProcessorInstance.start()``` is what actually starts the Change Feed processor.
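A rough sketch of this build-and-start pattern, assuming the Java SDK 4.0 ```ChangeFeedProcessorBuilder```; the helper name and handler body below are illustrative rather than the sample's exact code:

```java
import java.util.concurrent.atomic.AtomicBoolean;

import com.azure.cosmos.ChangeFeedProcessor;
import com.azure.cosmos.ChangeFeedProcessorBuilder;
import com.azure.cosmos.CosmosAsyncContainer;
import com.fasterxml.jackson.databind.JsonNode;

// Illustrative helper (not the sample's exact code): build, start, and wait for the processor.
public static ChangeFeedProcessor startChangeFeedProcessor(
        CosmosAsyncContainer feedContainer, CosmosAsyncContainer leaseContainer) {

    AtomicBoolean isProcessorRunning = new AtomicBoolean(false);

    ChangeFeedProcessor changeFeedProcessorInstance = new ChangeFeedProcessorBuilder()
            .hostName("SampleHost_1")            // worker name recorded in the lease documents
            .feedContainer(feedContainer)        // monitored container (InventoryContainer)
            .leaseContainer(leaseContainer)      // lease container (InventoryContainer-leases)
            .handleChanges(docs -> {
                for (JsonNode doc : docs) {
                    // each changed document arrives here as JSON
                }
            })
            .buildChangeFeedProcessor();

    changeFeedProcessorInstance.start()          // asynchronously acquires leases and begins polling
            .doOnSuccess(ignored -> isProcessorRunning.set(true))
            .subscribe();

    while (!isProcessorRunning.get());           // wait for the Change Feed processor to start

    return changeFeedProcessorInstance;
}
```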
Return to the Azure Portal Data Explorer in your browser. Under the **InventoryContainer-leases** container, click **items** to see its contents. You will see that Change Feed Processor has populated the lease container, i.e. the processor has assigned the ```SampleHost_1``` worker a lease on some partitions of the **InventoryContainer**.
1. Press enter again in the terminal. This will trigger 10 documents to be inserted into **InventoryContainer**. Each document insertion appears in the Change Feed as JSON; the following callback code handles these events by mirroring the JSON documents into a materialized view:
# [Java SDK 4.0](#tab/v4sdk)
**Java SDK 4.0**
```java
public static ChangeFeedProcessor getChangeFeedProcessor(String hostName, CosmosAsyncContainer feedContainer, CosmosAsyncContainer leaseContainer) {
    // ...
}
```
# [Java SDK 3.7.0](#tab/v3sdk)
**Java SDK 3.7.0**
```java
public static ChangeFeedProcessor getChangeFeedProcessor(String hostName, CosmosContainer feedContainer, CosmosContainer leaseContainer) {
    // ...
typeContainer.upsertItem(document).subscribe();
}
```
---
1. Allow the code to run for 5-10 seconds. Then return to the Azure Portal Data Explorer and navigate to **InventoryContainer > items**. You should see that items are being inserted into the inventory container; note the partition key (```id```).
Hit enter again to call the function ```deleteDocument()``` in the example code. This function, shown below, upserts a new version of the document with ```/ttl == 5```, which sets the document's Time-To-Live (TTL) to 5 seconds.
The Change Feed ```feedPollDelay``` is set to 100 ms; therefore, the Change Feed responds to this update almost instantly and calls ```updateInventoryTypeMaterializedView()``` shown above. That last function call upserts the new document, with a TTL of 5 seconds, into **InventoryContainer-pktype**.
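The TTL-based "soft delete" that ```deleteDocument()``` performs can be sketched as follows; the helper name and document handling are illustrative assumptions rather than the sample's implementation, and per-item ```/ttl``` only takes effect when TTL is enabled on the container:

```java
import com.azure.cosmos.CosmosAsyncContainer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

// Hypothetical helper: "soft delete" an item by re-upserting it with a short TTL.
public static void softDeleteItem(CosmosAsyncContainer container, String itemJson) throws Exception {
    ObjectNode document = (ObjectNode) new ObjectMapper().readTree(itemJson);
    document.put("ttl", 5);                  // the item expires ~5 seconds after this write
    container.upsertItem(document).block();  // the upsert itself flows through the Change Feed
}
```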
This step is optional. If you're interested in learning how the database resources are created in the code, you can review the following snippets. Otherwise, you can skip ahead to [Run the app](#run-the-app).
# [Sync API](#tab/sync)
### Managing database resources using the synchronous (sync) API
* `CosmosClient` initialization. The `CosmosClient` provides a client-side logical representation of the Azure Cosmos DB database service. This client is used to configure and execute requests against the service.
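A minimal sketch of synchronous client initialization with the v4 `CosmosClientBuilder`; the endpoint and key below are placeholders rather than the sample's configuration:

```java
import com.azure.cosmos.ConsistencyLevel;
import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;

// Placeholder endpoint and key; the sample typically reads these from configuration.
public static CosmosClient createCosmosClient() {
    return new CosmosClientBuilder()
            .endpoint("https://<your-account>.documents.azure.com:443/")
            .key("<your-account-key>")
            .consistencyLevel(ConsistencyLevel.EVENTUAL)
            .contentResponseOnWriteEnabled(true)
            .buildClient();
}
```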
### Managing database resources using the asynchronous (async) API
* Async API calls return immediately, without waiting for a response from the server. In light of this, the following code snippets show proper design patterns for accomplishing all of the preceding management tasks using async API.
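As a rough illustration of that pattern (placeholder endpoint, key, and resource names; not the article's own snippets):

```java
import com.azure.cosmos.CosmosAsyncClient;
import com.azure.cosmos.CosmosClientBuilder;
import com.azure.cosmos.models.CosmosContainerProperties;

// Placeholder endpoint, key, and resource names.
public static void createResourcesAsync() {
    CosmosAsyncClient asyncClient = new CosmosClientBuilder()
            .endpoint("https://<your-account>.documents.azure.com:443/")
            .key("<your-account-key>")
            .buildAsyncClient();

    // Each call returns a reactive type immediately; nothing executes until it is subscribed to or blocked on.
    asyncClient.createDatabaseIfNotExists("SampleDB")
            .map(databaseResponse -> asyncClient.getDatabase(databaseResponse.getProperties().getId()))
            .flatMap(database -> database.createContainerIfNotExists(
                    new CosmosContainerProperties("SampleContainer", "/partitionKey")))
            .block(); // block only for demonstration; production code would chain or subscribe instead
}
```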
Now go back to the Azure portal to get your connection string information and launch the app with your endpoint information. This enables your app to communicate with your hosted database.
Data is central to machine learning pipelines. This article provides code for importing, transforming, and moving data between steps in an Azure Machine Learning pipeline. For an overview of how data works in Azure Machine Learning, see [Access data in Azure storage services](how-to-access-data.md). For the benefits and structure of Azure Machine Learning pipelines, see [What are Azure Machine Learning pipelines?](concept-ml-pipelines.md).
articles/machine-learning/tutorial-train-deploy-image-classification-model-vscode.md
author: luisquintanilla
ms.author: luquinta
ms.date: 04/13/2020
ms.custom: contperfq4
#Customer intent: As a professional data scientist, I want to learn how to train and deploy an image classification model using TensorFlow and the Azure Machine Learning Visual Studio Code Extension.