
Commit d401aca

Revert "Edits resolved"

This reverts commit f5be85c.
1 parent: f5be85c

File tree: 1 file changed (+16, −42 lines)

articles/data-factory/tutorial-copy-data-portal.md

@@ -1,16 +1,14 @@
 ---
-title: 'Use the Azure portal to create a data factory pipeline'
-description: This tutorial provides instructions to create a data factory with a pipeline with a copy activity to copy data from Azure Blob storage to Azure SQL Database.
+title: Use the Azure portal to create a data factory pipeline
+description: This tutorial provides step-by-step instructions for using the Azure portal to create a data factory with a pipeline. The pipeline uses the copy activity to copy data from Azure Blob storage to Azure SQL Database.
 author: jianleishen
 ms.topic: tutorial
 ms.date: 04/25/2025
 ms.subservice: data-movement
 ms.author: jianleishen
-
-#customer intent: As a new Azure Data Factory user I want to create a data factory and quickly create my first pipeline to move data between resources, so I can apply it to my own needs.
 ---

-# Tutorial: Copy data from Azure Blob storage to a database in Azure SQL Database by using Azure Data Factory
+# Copy data from Azure Blob storage to a database in Azure SQL Database by using Azure Data Factory

 [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]

@@ -22,13 +20,12 @@ In this tutorial, you create a data factory by using the Azure Data Factory user
 In this tutorial, you perform the following steps:

 > [!div class="checklist"]
-> * [Create a data factory.](#create-a-data-factory)
-> * [Create a pipeline with a copy activity.](#create-a-pipeline)
+> * Create a data factory.
+> * Create a pipeline with a copy activity.
 > * Test run the pipeline.
-> * [Trigger the pipeline manually.](#trigger-the-pipeline-manually)
-> * [Trigger the pipeline on a schedule.](#trigger-the-pipeline-on-a-schedule)
+> * Trigger the pipeline manually.
+> * Trigger the pipeline on a schedule.
 > * Monitor the pipeline and activity runs.
-> * [Disable or delete your scheduled trigger.](#disable-trigger)

 ## Prerequisites

@@ -69,7 +66,7 @@ Now, prepare your Blob storage and SQL database for the tutorial by performing t
     CREATE CLUSTERED INDEX IX_emp_ID ON dbo.emp (ID);
     ```

-1. Allow Azure services to access SQL Server. Ensure that **Allow access to Azure services** is turned **ON** for your SQL Server so that Data Factory can write data to your SQL Server. To verify and turn on this setting, go to your SQL Server in the Azure portal, select **Security** > **Networking** > enable **Selected networks**> check **Allow Azure services and resources to access this server** under the **Exceptions**.
+1. Allow Azure services to access SQL Server. Ensure that **Allow access to Azure services** is turned **ON** for your SQL Server so that Data Factory can write data to your SQL Server. To verify and turn on this setting, go to your SQL Server in the Azure portal, select **Security** > **Networking** > enable **Selected networks**> chech **Allow Azure services and resources to access this server** under the **Exceptions**.

 ## Create a data factory
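For orientation outside the diff: the clustered-index statement shown as context above targets the tutorial's `dbo.emp` table, whose full `CREATE TABLE` statement sits outside this hunk. A minimal sketch of what such a table could look like; the column names and types are illustrative assumptions, not taken from this commit:

```sql
-- Sketch only: the tutorial's actual CREATE TABLE statement appears
-- earlier in the article and may differ from these assumed columns.
CREATE TABLE dbo.emp
(
    ID int IDENTITY(1,1) NOT NULL, -- auto-incrementing key for the index below
    FirstName varchar(50),
    LastName varchar(50)
);

-- The context line from the hunk above: cluster the table on ID.
CREATE CLUSTERED INDEX IX_emp_ID ON dbo.emp (ID);
```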
@@ -113,7 +110,7 @@ In this step, you create a pipeline with a copy activity in the data factory. Th
 ### Configure source

 >[!TIP]
->In this tutorial, you use *Account key* as the authentication type for your source data store, but you can choose other supported authentication methods: *SAS URI*, *Service Principal*, and *Managed Identity* if needed. Refer to corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
+>In this tutorial, you use *Account key* as the authentication type for your source data store, but you can choose other supported authentication methods: *SAS URI*,*Service Principal* and *Managed Identity* if needed. Refer to corresponding sections in [this article](./connector-azure-blob-storage.md#linked-service-properties) for details.
 >To store secrets for data stores securely, it's also recommended to use an Azure Key Vault. Refer to [this article](./store-credentials-in-key-vault.md) for detailed illustrations.

 1. Go to the **Source** tab. Select **+ New** to create a source dataset.
@@ -164,7 +161,7 @@ In this step, you create a pipeline with a copy activity in the data factory. Th

    :::image type="content" source="./media/tutorial-copy-data-portal/new-azure-sql-linked-service-window.png" alt-text="Save new linked service":::

-1. It automatically navigates to the **Set Properties** dialog box. In **Table**, select **Enter manually**, and enter **[dbo].[emp]**. Then select **OK**.
+1. It automatically navigates to the **Set Properties** dialog box. In **Table**, select **[dbo].[emp]**. Then select **OK**.

 1. Go to the tab with the pipeline, and in **Sink Dataset**, confirm that **OutputSqlDataset** is selected.
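Before pointing the sink dataset at **[dbo].[emp]**, it can help to confirm the table actually exists in the target database. A small sketch of one way to check, assuming the `dbo.emp` table from the preparation steps:

```sql
-- Returns one row if the emp table exists in the dbo schema;
-- an empty result means the preparation step was skipped.
SELECT SCHEMA_NAME(t.schema_id) AS schema_name,
       t.name AS table_name
FROM sys.tables AS t
WHERE t.name = 'emp'
  AND SCHEMA_NAME(t.schema_id) = 'dbo';
```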
@@ -173,49 +170,42 @@ In this step, you create a pipeline with a copy activity in the data factory. Th
 You can optionally map the schema of the source to corresponding schema of destination by following [Schema mapping in copy activity](copy-activity-schema-and-type-mapping.md).

 ## Validate the pipeline
-
 To validate the pipeline, select **Validate** from the tool bar.

 You can see the JSON code associated with the pipeline by clicking **Code** on the upper right.

 ## Debug and publish the pipeline
-
 You can debug a pipeline before you publish artifacts (linked services, datasets, and pipeline) to Data Factory or your own Azure Repos Git repository.

 1. To debug the pipeline, select **Debug** on the toolbar. You see the status of the pipeline run in the **Output** tab at the bottom of the window.

 1. Once the pipeline can run successfully, in the top toolbar, select **Publish all**. This action publishes entities (datasets, and pipelines) you created to Data Factory.

-1. Wait until you see the **Successfully published** notification message. To see notification messages, select the **Show Notifications** on the top-right (bell button).
+1. Wait until you see the **Successfully published** message. To see notification messages, click the **Show Notifications** on the top-right (bell button).

 ## Trigger the pipeline manually
-
 In this step, you manually trigger the pipeline you published in the previous step.

-1. Select **Add trigger** on the toolbar, and then select **Trigger Now**.
-
-1. On the **Pipeline Run** page, select **OK**.
+1. Select **Trigger** on the toolbar, and then select **Trigger Now**. On the **Pipeline Run** page, select **OK**.

 1. Go to the **Monitor** tab on the left. You see a pipeline run that is triggered by a manual trigger. You can use links under the **PIPELINE NAME** column to view activity details and to rerun the pipeline.

    :::image type="content" source="./media/tutorial-copy-data-portal/monitor-pipeline-inline-and-expended.png" alt-text="Monitor pipeline runs" lightbox="./media/tutorial-copy-data-portal/monitor-pipeline-inline-and-expended.png":::

-1. To see activity runs associated with the pipeline run, select the **CopyPipeline** link under the **PIPELINE NAME** column. In this example, there's only one activity, so you see only one entry in the list. For details about the copy operation, hover over the activity and
-1. select the **Details** link (eyeglasses icon) under the **ACTIVITY NAME** column. Select **All pipeline runs** at the top to go back to the Pipeline Runs view. To refresh the view, select **Refresh**.
+1. To see activity runs associated with the pipeline run, select the **CopyPipeline** link under the **PIPELINE NAME** column. In this example, there's only one activity, so you see only one entry in the list. For details about the copy operation, select the **Details** link (eyeglasses icon) under the **ACTIVITY NAME** column. Select **All pipeline runs** at the top to go back to the Pipeline Runs view. To refresh the view, select **Refresh**.

    :::image type="content" source="./media/tutorial-copy-data-portal/view-activity-runs-inline-and-expended.png#lightbox" alt-text="Monitor activity runs" lightbox="./media/tutorial-copy-data-portal/view-activity-runs-inline-and-expended.png":::

 1. Verify that two more rows are added to the **emp** table in the database.

 ## Trigger the pipeline on a schedule
-
 In this schedule, you create a schedule trigger for the pipeline. The trigger runs the pipeline on the specified schedule, such as hourly or daily. Here you set the trigger to run every minute until the specified end datetime.

 1. Go to the **Author** tab on the left above the monitor tab.

-1. Go to your pipeline, select **Trigger** on the tool bar, and select **New/Edit**.
+1. Go to your pipeline, click **Trigger** on the tool bar, and select **New/Edit**.

-1. In the **Add triggers** dialog box, select **Choose trigger** and select **+ New**.
+1. In the **Add triggers** dialog box, select **+ New** for **Choose trigger** area.

 1. In the **New Trigger** window, take the following steps:
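The hunk above includes the step "Verify that two more rows are added to the **emp** table in the database." The tutorial leaves the check itself to the reader; a minimal sketch of one way to do it, assuming the `dbo.emp` table described earlier:

```sql
-- Run once before triggering the pipeline and once after: the count
-- should grow by two, matching the two rows the copy activity writes.
SELECT COUNT(*) AS row_count
FROM dbo.emp;

-- Or inspect the latest rows directly (ID is assumed to be an
-- identity column, so the newest rows sort last).
SELECT TOP (10) ID, FirstName, LastName
FROM dbo.emp
ORDER BY ID DESC;
```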
@@ -238,7 +228,7 @@ In this schedule, you create a schedule trigger for the pipeline. The trigger ru

 1. On the **Edit trigger** page, review the warning, and then select **Save**. The pipeline in this example doesn't take any parameters.

-1. Select **Publish all** to publish the change.
+1. Click **Publish all** to publish the change.

 1. Go to the **Monitor** tab on the left to see the triggered pipeline runs.
@@ -250,22 +240,7 @@ In this schedule, you create a schedule trigger for the pipeline. The trigger ru

 1. Verify that two rows per minute (for each pipeline run) are inserted into the **emp** table until the specified end time.

-## Disable trigger
-
-To disable your every minute trigger that you created, follow these steps:
-
-1. Select the **Manage** pane on the left side.
-
-1. Under **Author** select **Triggers**.
-
-1. Hover over the **RunEveryMinute** trigger you created.
-1. Select the **Stop** button to disable the trigger from running.
-1. Select the **Delete** button to disable and delete the trigger.
-
-1. Select **Publish all** to save your changes.
-
 ## Related content
-
 The pipeline in this sample copies data from one location to another location in Blob storage. You learned how to:

 > [!div class="checklist"]
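Similarly, the hunk above keeps the step "Verify that two rows per minute (for each pipeline run) are inserted into the **emp** table until the specified end time." A sketch of a polling query for that check, again assuming the `dbo.emp` table sketched earlier:

```sql
-- Run this repeatedly while the every-minute trigger is active; the
-- total should climb by two with each pipeline run. The derived runs
-- estimate assumes the table started empty and only this tutorial's
-- pipeline writes to it.
SELECT COUNT(*) AS total_rows,
       COUNT(*) / 2 AS estimated_pipeline_runs
FROM dbo.emp;
```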
@@ -275,7 +250,6 @@ The pipeline in this sample copies data from one location to another location in
 > * Trigger the pipeline manually.
 > * Trigger the pipeline on a schedule.
 > * Monitor the pipeline and activity runs.
-> * Disable or delete your scheduled trigger.


 Advance to the following tutorial to learn how to copy data from on-premises to the cloud:
