`articles/data-factory/quickstart-hello-world-copy-data-tool.md` (9 additions, 9 deletions)
@@ -31,7 +31,7 @@ You will be redirected to the configuration page shown in the image below to dep

 A new blob storage account will be created in the new resource group, and the moviesDB2.csv file will be stored in a folder called **input** in the blob storage.

-:::image type="content" source="media/quickstart-copy-data-tool/deploy-template.png" alt-text="A screenshot of the deployment template creation dialog.":::
+:::image type="content" source="media/quickstart-hello-world-copy-data-tool/deploy-template.png" alt-text="A screenshot of the deployment template creation dialog.":::

 ### Create a data factory
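The deployment template itself isn't shown in this diff, but a minimal ARM template that creates the blob storage account described above might look roughly like the following. This is a sketch only; the parameter name, API version, SKU, and kind are assumptions, not values taken from the quickstart's actual template.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2022-09-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```

A template like this is typically deployed into the new resource group with `az deployment group create --resource-group <group> --template-file template.json`.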
@@ -49,51 +49,51 @@ The steps below will walk you through how to easily copy data with the copy data

 1. On the **Properties** page of the Copy Data tool, choose **Built-in copy task** under **Task type**, then select **Next**.
 1. Select **+ Create new connection** to add a connection.
 1. Select the linked service type that you want to create for the source connection. In this tutorial, we use **Azure Blob Storage**. Select it from the gallery, and then select **Continue**.
 1. On the **New connection (Azure Blob Storage)** page, specify a name for your connection. Select your Azure subscription from the **Azure subscription** list and your storage account from the **Storage account name** list, test the connection, and then select **Create**.

-   :::image type="content" source="./media/quickstart-copy-data-tool/configure-blob-storage.png" alt-text="Configure the Azure Blob storage account":::
+   :::image type="content" source="./media/quickstart-hello-world-copy-data-tool/configure-blob-storage.png" alt-text="Configure the Azure Blob storage account":::

 1. Select the newly created connection in the **Connection** block.
 1. In the **File or folder** section, select **Browse** to navigate to the **adftutorial/input** folder, select the **emp.txt** file, and then select **OK**.
 1. Select the **Binary copy** checkbox to copy the file as-is, and then select **Next**.

-   :::image type="content" source="./media/quickstart-copy-data-tool/source-data-store.png" alt-text="Screenshot that shows the Source data store page.":::
+   :::image type="content" source="./media/quickstart-hello-world-copy-data-tool/source-data-store.png" alt-text="Screenshot that shows the Source data store page.":::

 ### Step 3: Complete destination configuration

 1. Select the **AzureBlobStorage** connection that you created in the **Connection** block.
 1. In the **Folder path** section, enter **adftutorial/output** for the folder path.

-   :::image type="content" source="./media/quickstart-copy-data-tool/destination-data-store.png" alt-text="Screenshot that shows the Destination data store page.":::
+   :::image type="content" source="./media/quickstart-hello-world-copy-data-tool/destination-data-store.png" alt-text="Screenshot that shows the Destination data store page.":::

 1. Leave the other settings at their defaults, and then select **Next**.

 ### Step 4: Review all settings and deployment

 1. On the **Settings** page, specify a name for the pipeline and its description, then select **Next** to use the other default configurations.

-   :::image type="content" source="./media/quickstart-copy-data-tool/settings.png" alt-text="Screenshot that shows the settings page.":::
+   :::image type="content" source="./media/quickstart-hello-world-copy-data-tool/settings.png" alt-text="Screenshot that shows the settings page.":::

 1. On the **Summary** page, review all settings, and select **Next**.
 1. On the **Deployment complete** page, select **Monitor** to monitor the pipeline that you created.
 1. The application switches to the **Monitor** tab, where you can see the status of the pipeline. Select **Refresh** to refresh the list, or select the link under **Pipeline name** to view activity run details or rerun the pipeline.
 1. On the **Activity runs** page, select the **Details** link (eyeglasses icon) under the **Activity name** column for more details about the copy operation. For details about the properties, see [Copy Activity overview](copy-activity-overview.md).
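Under the hood, the Copy Data tool deploys an ordinary Data Factory pipeline with a single Copy activity. A minimal sketch of what the equivalent pipeline JSON might look like for a binary blob-to-blob copy is shown below; the pipeline, activity, and dataset names are illustrative assumptions, not values the tool actually generates.

```json
{
  "name": "CopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToBlob",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "DestinationDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": {
            "type": "BinarySource",
            "storeSettings": { "type": "AzureBlobStorageReadSettings", "recursive": true }
          },
          "sink": {
            "type": "BinarySink",
            "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
          }
        }
      }
    ]
  }
}
```

Selecting **Binary copy** in the tool corresponds to using binary datasets with a `BinarySource`/`BinarySink` pair, which moves files as-is without parsing their contents.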
`articles/machine-learning/v1/how-to-access-data.md` (1 addition, 1 deletion)
@@ -296,7 +296,7 @@ For situations where the SDK doesn't provide access to datastores, you might be

 ## Move data to supported Azure storage solutions

-Azure Machine Learning supports accessing data from Azure Blob storage, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, and Azure Database for PostgreSQL. If you're using unsupported storage, we recommend that you move your data to supported Azure storage solutions by using [Azure Data Factory and these steps](../../data-factory/quickstart-create-data-factory-copy-data-tool.md). Moving data to supported storage can help you save data egress costs during machine learning experiments.
+Azure Machine Learning supports accessing data from Azure Blob storage, Azure Files, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Azure SQL Database, and Azure Database for PostgreSQL. If you're using unsupported storage, we recommend that you move your data to supported Azure storage solutions by using [Azure Data Factory and these steps](../../data-factory/quickstart-hello-world-copy-data-tool.md). Moving data to supported storage can help you save data egress costs during machine learning experiments.

 Azure Data Factory provides efficient and resilient data transfer with more than 80 prebuilt connectors at no extra cost. These connectors include Azure data services, on-premises data sources, Amazon S3 and Redshift, and Google BigQuery.
`articles/sentinel/migration-ingestion-tool.md` (1 addition, 1 deletion)
@@ -78,7 +78,7 @@ Review the Azure Data Factory (ADF) and Azure Synapse methods, which are better

 To use the Copy activity in Azure Data Factory (ADF) or Synapse pipelines:

 1. Create and configure a self-hosted integration runtime. This component is responsible for copying the data from your on-premises host.
 1. Create linked services for the source data store ([file system](../data-factory/connector-file-system.md?tabs=data-factory#create-a-file-system-linked-service-using-ui)) and the sink data store ([blob storage](../data-factory/connector-azure-blob-storage.md?tabs=data-factory#create-an-azure-blob-storage-linked-service-using-ui)).
-3. To copy the data, use the [Copy data tool](../data-factory/quickstart-create-data-factory-copy-data-tool.md). Alternatively, you can use method such as PowerShell, Azure portal, a .NET SDK, and so on.
+3. To copy the data, use the [Copy data tool](../data-factory/quickstart-hello-world-copy-data-tool.md). Alternatively, you can use other methods such as PowerShell, the Azure portal, or a .NET SDK.
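A linked service of the kind created in step 2 is just a JSON definition in the factory. A minimal sketch of a blob storage sink linked service might look like the following; the service name and placeholder connection string are illustrative, not values from the articles linked above.

```json
{
  "name": "AzureBlobStorageLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
    }
  }
}
```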