articles/data-factory/join-azure-ssis-integration-runtime-virtual-network-ui.md (2 additions & 2 deletions)
@@ -70,7 +70,7 @@ Use Azure portal to configure a classic virtual network before you try to join y
1. On the **Add role assignment** page, enter **Microsoft Azure Batch** in the search box, select the role, and select **Next**.

- :::image type="content" source="media/join-azure-ssis-integration-runtime-virtual-network/add-vm-contributor-role.png" alt-text="Sreenshot showing search results for the "Virtual Machine Contributor" role.":::
+ :::image type="content" source="media/join-azure-ssis-integration-runtime-virtual-network/add-virtual-machine-contributor-role.png" alt-text="Sreenshot showing search results for the "Virtual Machine Contributor" role.":::

1. On the **Members** page, under **Members** select **+ Select members**. Then on the **Select Members** pane, search for **Microsoft Azure Batch**, and select it from the list to add it, and click **Select**.
@@ -90,7 +90,7 @@ After you've configured an Azure Resource Manager/classic virtual network, you c
1. In the [Azure portal](https://portal.azure.com), under the **Azure Services** section, select **More Services** to see a list of all Azure services. In the **Filter services** search box, type **Data Factories**, and then choose **Data Factories** in the list of services that appear.

- :::image type="content" source="media/join-azure-ssis-integration-runtime-virtual-network/portal-find-data-factories.png" alt-text="Screenshot of the All Services page on the Azure Portal filtered for "Data Factories".":::
+ :::image type="content" source="media/join-azure-ssis-integration-runtime-virtual-network/portal-find-data-factories.png" alt-text="Screenshot of the All Services page on the Azure portal filtered for Data Factories.":::

:::image type="content" source="media/join-azure-ssis-integration-runtime-virtual-network/data-factories-list.png" alt-text="List of data factories":::
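The first hunk in this file documents granting the **Microsoft Azure Batch** service principal the **Virtual Machine Contributor** role on the classic virtual network. As a rough sketch of what those portal steps amount to, the following Python snippet assembles the role-assignment payload that would be submitted to Azure Resource Manager; the subscription, resource group, network name, role-definition GUID, and principal object ID are all placeholders, not values taken from the article.

```python
# Minimal sketch of the role assignment the portal steps above create.
# Everything in angle brackets is a placeholder; look up the real values in your tenant.
import json
import uuid

subscription_id = "<subscription-id>"

# Scope: the classic virtual network that the Azure-SSIS IR will join.
scope = (
    f"/subscriptions/{subscription_id}"
    "/resourceGroups/<resource-group>"
    "/providers/Microsoft.ClassicNetwork/virtualNetworks/<classic-vnet-name>"
)

role_assignment = {
    "name": str(uuid.uuid4()),  # role assignments are keyed by a new GUID
    "properties": {
        # Built-in "Virtual Machine Contributor" role definition (GUID is a placeholder).
        "roleDefinitionId": (
            f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
            "/roleDefinitions/<virtual-machine-contributor-role-id>"
        ),
        # Object ID of the "Microsoft Azure Batch" service principal in your tenant.
        "principalId": "<microsoft-azure-batch-object-id>",
    },
}

# The payload would be PUT to {scope}/providers/Microsoft.Authorization/roleAssignments/{name}.
print(json.dumps({"scope": scope, **role_assignment}, indent=2))
```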
articles/data-factory/join-azure-ssis-integration-runtime-virtual-network.md (2 additions & 2 deletions)
@@ -52,9 +52,9 @@ When joining your Azure-SSIS IR to a virtual network, remember these important p
- If a classic virtual network is already connected to your on-premises network in a different location from your Azure-SSIS IR, you can create an [Azure Resource Manager virtual network](../virtual-network/quick-create-portal.md#create-a-virtual-network) for your Azure-SSIS IR to join. Then configure a [classic-to-Azure Resource Manager virtual network](../vpn-gateway/vpn-gateway-connect-different-deployment-models-portal.md) connection.

- - If an Azure Resource Manager virtual network is already connected to your on-premises network in a different location from your Azure-SSIS IR, you can first create an [Azure Resource Manager virtual network](../virtual-network/quick-create-portal.md#create-a-virtual-network) for your Azure-SSIS IR to join. Then configure an [Azure Resource Manager-to-Azure Resource Manager virtual network](../vpn-gateway/vpn-gateway-howto-vnet-vnet-resource-manager-portal.md) connection.
+ - If an Azure Resource Manager network is already connected to your on-premises network in a different location from your Azure-SSIS IR, you can first create an [Azure Resource Manager virtual network](../virtual-network/quick-create-portal.md#create-a-virtual-network) for your Azure-SSIS IR to join. Then configure an [Azure Resource Manager-to-Azure Resource Manager virtual network](../vpn-gateway/vpn-gateway-howto-vnet-vnet-resource-manager-portal.md) connection.

- ## Hosting SSISDB in Azure SQL Database server or Managed Instance
+ ## Hosting SSISDB in Azure SQL Database server or Managed instance

If you host SSISDB in Azure SQL Database server configured with a virtual network service endpoint, make sure that you join your Azure-SSIS IR to the same virtual network and subnet.
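The hunks above describe joining the Azure-SSIS IR to a virtual network, and specifically to the same virtual network and subnet as the SSISDB service endpoint. For orientation, here is a minimal sketch of where the virtual network settings sit in an Azure-SSIS IR definition, built as a plain Python dictionary; the property names (`computeProperties`, `vNetProperties`, `ssisProperties`) follow my recollection of the Data Factory resource shape and should be verified, and every angle-bracketed value is a placeholder.

```python
# Hypothetical sketch of an Azure-SSIS IR definition joined to a virtual network.
# Property names are assumed from the Data Factory resource shape; verify before use.
import json

subscription_id = "<subscription-id>"
vnet_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Network/virtualNetworks/<vnet-name>"
)

ssis_ir = {
    "name": "MyAzureSsisIr",  # invented name
    "properties": {
        "type": "Managed",
        "typeProperties": {
            "computeProperties": {
                "location": "<region>",  # same region as the virtual network
                "nodeSize": "Standard_D4_v3",
                "numberOfNodes": 2,
                # Join the IR to the same virtual network and subnet as the
                # SSISDB virtual network service endpoint.
                "vNetProperties": {
                    "vNetId": vnet_id,
                    "subnet": "<subnet-name>",
                },
            },
            "ssisProperties": {"edition": "Standard"},
        },
    },
}

print(json.dumps(ssis_ir, indent=2))
```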
articles/data-factory/load-azure-data-lake-storage-gen2-from-gen1.md (2 additions & 2 deletions)
@@ -32,7 +32,7 @@ This article shows you how to use the Data Factory copy data tool to copy data f
## Create a data factory

- 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure Portal.
+ 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure portal.

:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
@@ -144,7 +144,7 @@ As a best practice, conduct a performance POC with a representative sample datas
3. If you have maximized the performance of a single copy activity, but have not yet achieved the throughput upper limits of your environment, you can run multiple copy activities in parallel.

- When you see significant number of throttling errors from [copy activity monitoring](copy-activity-monitoring.md#monitor-visually), it indicates you have reached the capacity limit of your storage account. ADF will retry automatically to overcome each throttling error to make sure there will not be any data lost, but too many retries impact your copy throughput as well. In such case, you are encouraged to reduce the number of copy activities running cocurrently to avoid significant amounts of throttling errors. If you have been using single copy activity to copy data, then you are encouraged to reduce the DIU.
+ When you see significant number of throttling errors from [copy activity monitoring](copy-activity-monitoring.md#monitor-visually), it indicates you have reached the capacity limit of your storage account. ADF will retry automatically to overcome each throttling error to make sure there will not be any data lost, but too many retries can degrade your copy throughput as well. In such case, you are encouraged to reduce the number of copy activities running cocurrently to avoid significant amounts of throttling errors. If you have been using single copy activity to copy data, then you are encouraged to reduce the DIU.
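The hunk above recommends reducing the number of concurrently running copy activities, or the DIU of a single copy activity, when throttling errors dominate monitoring. As a rough illustration of where those knobs live, here is a minimal copy-activity fragment expressed as a Python dictionary; the activity and dataset names are invented, and the property names (`dataIntegrationUnits`, `parallelCopies`, the source and sink types) are assumptions to check against the copy activity reference.

```python
# Hypothetical sketch of a copy activity with explicit DIU and parallelism settings.
# Lower dataIntegrationUnits / parallelCopies if throttling errors dominate monitoring.
import json

copy_activity = {
    "name": "CopyGen1ToGen2",  # made-up activity name
    "type": "Copy",
    "inputs": [{"referenceName": "AdlsGen1SourceDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "AdlsGen2SinkDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "AzureDataLakeStoreSource", "recursive": True},
        "sink": {"type": "AzureBlobFSSink"},
        "dataIntegrationUnits": 8,  # reduce this if a single activity hits throttling
        "parallelCopies": 4,        # reduce this before adding more concurrent activities
    },
}

print(json.dumps(copy_activity, indent=2))
```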
articles/data-factory/load-azure-data-lake-storage-gen2.md (1 addition & 1 deletion)
@@ -33,7 +33,7 @@ This article shows you how to use the Data Factory Copy Data tool to load data f
## Create a data factory

- 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure Portal.
+ 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure portal.

:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
articles/data-factory/load-azure-data-lake-store.md (3 additions & 3 deletions)
@@ -38,7 +38,7 @@ This article shows you how to use the Data Factory Copy Data tool to _load data
## Create a data factory

- 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure Portal.
+ 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure portal.

:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
@@ -52,7 +52,7 @@ This article shows you how to use the Data Factory Copy Data tool to _load data
2. In the **Properties** page, specify **CopyFromAmazonS3ToADLS** for the **Task name** field, and select **Next**:

- 3. In the **Source data store** page, click**+ Create new connection**:
+ 3. In the **Source data store** page, select**+ Create new connection**:

:::image type="content" source="./media/load-data-into-azure-data-lake-store/source-data-store-page.png" alt-text="Source data store page":::
@@ -79,7 +79,7 @@ This article shows you how to use the Data Factory Copy Data tool to _load data
:::image type="content" source="./media/load-data-into-azure-data-lake-store/specify-binary-copy.png" alt-text="Screenshot shows the Choose the input file or folder where you can select Copy file recursively and Binary Copy.":::

- 7. In the **Destination data store** page, click**+ Create new connection**, and then select **Azure Data Lake Storage Gen1**, and select **Continue**:
+ 7. In the **Destination data store** page, select**+ Create new connection**, and then select **Azure Data Lake Storage Gen1**, and select **Continue**:

:::image type="content" source="./media/load-data-into-azure-data-lake-store/destination-data-storage-page.png" alt-text="Destination data store page":::
articles/data-factory/load-azure-sql-data-warehouse.md (1 addition & 1 deletion)
@@ -40,7 +40,7 @@ This article shows you how to use the Copy Data tool to _load data from Azure SQ
## Create a data factory

- 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure Portal.
+ 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure portal.

:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
articles/data-factory/load-office-365-data.md (7 additions & 7 deletions)
@@ -17,7 +17,7 @@ This article shows you how to use the Data Factory _load data from Microsoft 365
## Create a data factory

- 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure Portal.
+ 1. If you have not created your data factory yet, follow the steps in [Quickstart: Create a data factory by using the Azure portal and Azure Data Factory Studio](quickstart-create-data-factory-portal.md) to create one. After creating it, browse to the data factory in the Azure portal.

:::image type="content" source="./media/doc-common-process/data-factory-home-page.png" alt-text="Home page for the Azure Data Factory, with the Open Azure Data Factory Studio tile.":::
@@ -38,17 +38,17 @@ This article shows you how to use the Data Factory _load data from Microsoft 365
### Configure source

- 1. Go to the pipeline > **Source tab**, click**+ New** to create a source dataset.
+ 1. Go to the pipeline > **Source tab**, select**+ New** to create a source dataset.

2. In the New Dataset window, select **Microsoft 365 (Office 365)**, and then select **Continue**.

- 3. You are now in the copy activity configuration tab. Click on the **Edit** button next to the Microsoft 365 (Office 365) dataset to continue the data configuration.
+ 3. You are now in the copy activity configuration tab. Select on the **Edit** button next to the Microsoft 365 (Office 365) dataset to continue the data configuration.

:::image type="content" source="./media/load-office-365-data/transition-to-edit-dataset.png" alt-text="Config Microsoft 365 (Office 365) dataset general.":::

4. You see a new tab opened for Microsoft 365 (Office 365) dataset. In the **General tab** at the bottom of the Properties window, enter "SourceOffice365Dataset" for Name.

- 5. Go to the **Connection tab** of the Properties window. Next to the Linked service text box, click**+ New**.
+ 5. Go to the **Connection tab** of the Properties window. Next to the Linked service text box, select**+ New**.

6. In the New Linked Service window, enter "Office365LinkedService" as name, enter the service principal ID and service principal key, then test connection and select **Create** to deploy the linked service.
@@ -62,7 +62,7 @@ This article shows you how to use the Data Factory _load data from Microsoft 365
9. You are required to choose one of the date filters and provide the start time and end time values.

- 10.Click on the **Import Schema** tab to import the schema for Message dataset.
+ 10.Select on the **Import Schema** tab to import the schema for Message dataset.

:::image type="content" source="./media/load-office-365-data/edit-source-properties.png" alt-text="Config Microsoft 365 (Office 365) dataset schema.":::
@@ -73,7 +73,7 @@ This article shows you how to use the Data Factory _load data from Microsoft 365
2. In the New Dataset window, notice that only the supported destinations are selected when copying from Microsoft 365 (Office 365). Select **Azure Blob Storage**, select Binary format, and then select **Continue**. In this tutorial, you copy Microsoft 365 (Office 365) data into an Azure Blob Storage.

- 3.Click on **Edit** button next to the Azure Blob Storage dataset to continue the data configuration.
+ 3.Select on **Edit** button next to the Azure Blob Storage dataset to continue the data configuration.

4. On the **General tab** of the Properties window, in Name, enter "OutputBlobDataset".
@@ -110,7 +110,7 @@ To see activity runs associated with the pipeline run, select the **View Activit
- If this is the first time you are requesting data for this context (a combination of which data table is being access, which destination account is the data being loaded into, and which user identity is making the data access request), you will see the copy activity status as **In Progress**, and only when you click into "Details" link under Actions will you see the status as **RequesetingConsent**. A member of the data access approver group needs to approve the request in the Privileged Access Management before the data extraction can proceed.
+ If this is the first time you are requesting data for this context (a combination of which data table is being access, which destination account is the data being loaded into, and which user identity is making the data access request), you will see the copy activity status as **In Progress**, and only when you select into "Details" link under Actions will you see the status as **RequesetingConsent**. A member of the data access approver group needs to approve the request in the Privileged Access Management before the data extraction can proceed.
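The hunks in this file step through configuring a Microsoft 365 (Office 365) source dataset, its linked service (service principal ID and key), and a date filter with start and end times. For orientation, here is a minimal sketch of how those pieces could look in JSON form, built as Python dictionaries; the names and values are invented examples, and the property names (`Office365`, `servicePrincipalId`, `Office365Source`, `dateFilterColumn`) follow my recollection of the Microsoft 365 connector and should be verified against the connector reference.

```python
# Hypothetical sketch of a Microsoft 365 (Office 365) linked service and copy source
# with a date filter; property names are assumed from the connector, values are examples.
import json

office365_linked_service = {
    "name": "Office365LinkedService",
    "properties": {
        "type": "Office365",
        "typeProperties": {
            "office365TenantId": "<tenant-id>",
            "servicePrincipalTenantId": "<tenant-id>",
            "servicePrincipalId": "<service-principal-id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service-principal-key>",
            },
        },
    },
}

copy_source = {
    "type": "Office365Source",
    # Date filter chosen in step 9: a column name plus a start/end time window.
    "dateFilterColumn": "CreatedDateTime",  # example column on the Message dataset
    "startTime": "2023-01-01T00:00:00Z",
    "endTime": "2023-01-31T00:00:00Z",
}

print(json.dumps({"linkedService": office365_linked_service, "source": copy_source}, indent=2))
```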