articles/data-factory/tutorial-incremental-copy-change-tracking-feature-portal.md
1. On the **New data factory** page, enter **ADFTutorialDataFactory** for the name.

   The name of the data factory must be globally unique. If you get an error that says the name that you chose is not available, change the name (for example, to **yournameADFTutorialDataFactory**) and try creating the data factory again. For more information, see [Azure Data Factory naming rules](naming-rules.md).

1. Select the Azure subscription in which you want to create the data factory.
1. For **Resource Group**, take one of the following steps:
1. Select **Review + create**.
1. Select **Create**.

   On the dashboard, the **Deploying Data Factory** tile shows the status.

   :::image type="content" source="media/tutorial-incremental-copy-change-tracking-feature-portal/deploying-data-factory.png" alt-text="Screenshot of the tile that shows the status of deploying a data factory.":::

1. After the creation is complete, the **Data Factory** page appears. Select the **Launch studio** tile to open the Azure Data Factory UI on a separate tab.
### Create an Azure Storage linked service

To link your storage account to the data factory:

1. In the Data Factory UI, on the **Manage** tab, under **Connections**, select **Linked services**. Then select **+ New** or the **Create linked service** button.
To link your database to the data factory:

1. In the Data Factory UI, on the **Manage** tab, under **Connections**, select **Linked services**. Then select **+ New**.
1. In the **New Linked Service** window, select **Azure SQL Database**, and then select **Continue**.
1. Enter the following information:
   1. For **Name**, enter **AzureSqlDatabaseLinkedService**.
1. Select **Azure SQL Database**, and then select **Continue**.
1. In the **Set Properties** window, take the following steps:
   1. For **Name**, enter **SourceDataset**.
   1. For **Linked service**, select **AzureSqlDatabaseLinkedService**.
   1. For **Table name**, select **dbo.data_source_table**.
   1. For **Import schema**, select the **From connection/store** option.
1. Select **Azure Blob Storage**, and then select **Continue**.
1. For the data format, select **DelimitedText**, and then select **Continue**.
1. In the **Set properties** window, take the following steps:
   1. For **Name**, enter **SinkDataset**.
   1. For **Linked service**, select **AzureBlobStorageLinkedService**.
   1. For **File path**, enter **adftutorial/incchgtracking**.
1. Select **OK**.
1. After the dataset appears in the tree view, go to the **Connection** tab and select the **File name** text box. When the **Add dynamic content** option appears, select it.
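In the **Add dynamic content** editor, you can build the file name from pipeline metadata so that each run writes a distinct file. A hypothetical expression for illustration (`concat` and `pipeline().RunId` are standard Data Factory expression-language functions; the `Incremental-` prefix is an assumed naming scheme, not prescribed here):

```
@concat('Incremental-', pipeline().RunId, '.txt')
```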
1. In the Data Factory UI, on the **Author** tab, select **+**, and then select **Dataset**.
1. Select **Azure SQL Database**, and then select **Continue**.
1. In the **Set Properties** window, take the following steps:
   1. For **Name**, enter **ChangeTrackingDataset**.
   1. For **Linked service**, select **AzureSqlDatabaseLinkedService**.
   1. For **Table name**, select **dbo.table_store_ChangeTracking_version**.
   1. For **Import schema**, select the **From connection/store** option.
1. A new tab appears for configuring the pipeline. The pipeline also appears in the tree view. In the **Properties** window, change the name of the pipeline to **FullCopyPipeline**.
1. In the **Activities** toolbox, expand **Move & transform**. Take one of the following steps:
   - Drag the copy activity to the pipeline designer surface.
   - On the search bar under **Activities**, search for the copy data activity, and then set the name to **FullCopyActivity**.
1. Switch to the **Source** tab. For **Source Dataset**, select **SourceDataset**.
1. Switch to the **Sink** tab. For **Sink Dataset**, select **SinkDataset**.
In the following procedure, you create a pipeline with activities and run it periodically. When you run the pipeline:

- The *lookup activities* get the old and new `SYS_CHANGE_VERSION` values from Azure SQL Database and pass them to the copy activity.
- The *copy activity* copies the inserted, updated, or deleted data between the two `SYS_CHANGE_VERSION` values from Azure SQL Database to Azure Blob Storage.
- The *stored procedure activity* updates the value of `SYS_CHANGE_VERSION` for the next pipeline run.
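The source query that the copy activity runs joins the table to SQL Server's `CHANGETABLE(CHANGES ...)` function and keeps only rows whose `SYS_CHANGE_VERSION` falls between the two values returned by the lookup activities. As a sketch, a helper that composes such a query (the function and its parameters are illustrative, not part of the tutorial):

```python
def build_incremental_query(table: str, key: str,
                            last_version: int, current_version: int) -> str:
    """Compose a change-tracking delta query (illustrative helper).

    Selects rows whose SYS_CHANGE_VERSION lies in (last_version, current_version],
    which is the window the copy activity reads on each run.
    """
    return (
        f"SELECT t.*, CT.SYS_CHANGE_VERSION, CT.SYS_CHANGE_OPERATION "
        f"FROM {table} AS t "
        f"RIGHT OUTER JOIN CHANGETABLE(CHANGES {table}, {last_version}) AS CT "
        f"ON t.{key} = CT.{key} "
        f"WHERE CT.SYS_CHANGE_VERSION <= {current_version}"
    )
```

In the pipeline itself, the two version numbers come from the lookup activities' `output.firstRow.SYS_CHANGE_VERSION` values rather than literals.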
3. Expand **General** in the **Activities** toolbox. Drag the lookup activity to the pipeline designer surface, or search in the **Search activities** box. Set the name of the activity to **LookupLastChangeTrackingVersionActivity**. This activity gets the change tracking version used in the last copy operation that's stored in the `table_store_ChangeTracking_version` table.
4. Switch to the **Settings** tab in the **Properties** window. For **Source Dataset**, select **ChangeTrackingDataset**.
5. Drag the lookup activity from the **Activities** toolbox to the pipeline designer surface. Set the name of the activity to **LookupCurrentChangeTrackingVersionActivity**. This activity gets the current change tracking version.
6. Switch to the **Settings** tab in the **Properties** window, and then take the following steps:

   1. For **Source dataset**, select **SourceDataset**.
   2. For **Use query**, select **Query**.
9. Switch to the **Sink** tab. For **Sink Dataset**, select **SinkDataset**.
10. Connect both lookup activities to the copy activity one by one. Drag the green button attached to each lookup activity to the copy activity.
11. Drag the stored procedure activity from the **Activities** toolbox to the pipeline designer surface. Set the name of the activity to **StoredProceduretoUpdateChangeTrackingActivity**. This activity updates the change tracking version in the `table_store_ChangeTracking_version` table.
12. Switch to the **Settings** tab, and then take the following steps:

    1. For **Linked service**, select **AzureSqlDatabaseLinkedService**.
    2. For **Stored procedure name**, select **Update_ChangeTracking_Version**.
    3. Select **Import**.
    4. In the **Stored procedure parameters** section, specify the following values for the parameters:

13. Connect the copy activity to the stored procedure activity. Drag the green button attached to the copy activity to the stored procedure activity.
14. Select **Validate** on the toolbar. Confirm that there are no validation errors. Close the **Pipeline Validation Report** window.
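Conceptually, the finished pipeline is a version handoff: read the last-synced version, read the current version, copy the delta between them, then persist the current version for the next run. A minimal sketch in plain Python, assuming in-memory stand-ins for the version table and the change-tracked source (none of these names are Data Factory APIs):

```python
def run_incremental_pipeline(version_store: dict, source: dict) -> list:
    """Simulate one run of the incremental pipeline (illustrative only).

    version_store mimics table_store_ChangeTracking_version;
    source mimics the change-tracked table, with `changes` as
    (SYS_CHANGE_VERSION, row) pairs and a current version number.
    """
    last = version_store["SYS_CHANGE_VERSION"]      # LookupLastChangeTrackingVersionActivity
    current = source["current_version"]             # LookupCurrentChangeTrackingVersionActivity
    # Copy activity: take only changes in the window (last, current].
    delta = [row for ver, row in source["changes"] if last < ver <= current]
    version_store["SYS_CHANGE_VERSION"] = current   # StoredProceduretoUpdateChangeTrackingActivity
    return delta
```

Running it twice with the same source copies the delta once and then nothing, which is the idempotent behavior the stored procedure activity provides.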