articles/operator-insights/ingestion-with-data-factory.md
For more information on Azure Data Factory, see [What is Azure Data Factory](/azure/data-factory/introduction).

## Create a key vault linked service

To connect Azure Data Factory to another Azure service, you must create a [linked service](/azure/data-factory/concepts-linked-services?tabs=data-factory). First, create a linked service to connect Azure Data Factory to the Data Product's key vault.

1. In the [Azure portal](https://ms.portal.azure.com/#home), find the Azure Data Factory resource.
1. From the **Overview** pane, launch the Azure Data Factory studio.
1. Go to the **Manage** view, then find **Connections** and select **Linked Services**.
1. Create a new linked service using the **New** button.
    1. Select the **Azure Key Vault** type.
    1. Set the target to the Data Product's key vault. The key vault is in the resource group whose name starts with `<data-product-name>-HostedResources-` and is named `aoi-<uid>-kv`.
    1. Set the authentication method to **System Assigned Managed Identity**.
1. Grant Azure Data Factory permissions on the key vault resource.
    1. Go to the Data Product's key vault in the Azure portal.
    1. In the **Access Control (IAM)** pane, add a new role assignment.
    1. Give the Data Factory managed identity (which has the same name as the Data Factory resource) the **Key Vault Secrets User** role.
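If you prefer to define Data Factory resources as JSON (for example, through the Data Factory REST API or ARM templates) rather than the studio UI, the steps above correspond roughly to a linked service definition like the following sketch. The name `DataProductKeyVault` and the `<uid>` placeholder are illustrative, not values prescribed by this article:

```json
{
    "name": "DataProductKeyVault",
    "properties": {
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://aoi-<uid>-kv.vault.azure.net/"
        }
    }
}
```

With system-assigned managed identity authentication, no credential appears in the definition; access is controlled entirely by the **Key Vault Secrets User** role assignment described above.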
## Create a Blob Storage linked service

Data Products expose a Blob Storage endpoint for ingesting data. Use the newly created key vault linked service to connect Azure Data Factory to this ingestion endpoint.

1. In the [Azure portal](https://ms.portal.azure.com/#home), find the Azure Data Factory resource.
2. From the **Overview** pane, launch the Azure Data Factory studio.
3. Go to the **Manage** view, then find **Connections** and select **Linked Services**.
4. Create a new linked service using the **New** button.
    1. Select the **Azure Blob Storage** type.
    1. Set the authentication type to **SAS URI**.
    1. Choose **Azure Key Vault** as the source.
    1. Select the Key Vault linked service that you created in [Create a key vault linked service](#create-a-key-vault-linked-service).
    1. Set the secret name to `input-storage-sas`.
    1. Leave the secret version as the default value (**Latest version**).

Now the Data Factory is connected to the Data Product ingestion endpoint.
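As a sketch of the equivalent JSON definition, the Blob Storage linked service retrieves the SAS URI from the key vault at run time. This assumes the key vault linked service is named `DataProductKeyVault`; that name and `DataProductIngestion` are illustrative:

```json
{
    "name": "DataProductIngestion",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "sasUri": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "DataProductKeyVault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "input-storage-sas"
            }
        }
    }
}
```

Because the secret version is left at **Latest version**, the linked service automatically picks up a rotated SAS URI without further configuration changes.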
## Create a dataset

To use the Data Product as the sink for a [Data Factory pipeline](/azure/data-factory/concepts-pipelines-activities?tabs=data-factory), create a dataset that points at the ingestion endpoint.

1. In the [Azure portal](https://ms.portal.azure.com/#home), find the Azure Data Factory resource.
2. From the **Overview** pane, launch the Azure Data Factory studio.
3. Go to the **Author** view, select **Add resource**, then select **Dataset**.
4. Create a new Azure Blob Storage dataset.
    1. Select your output type.
    1. Set the linked service to the Data Product ingestion linked service that you created in [Create a Blob Storage linked service](#create-a-blob-storage-linked-service).
    1. Set the container name to the name of the data type that the dataset is associated with.
        - This information can be found in the **Required ingestion configuration** section of the documentation for your Data Product.
        - For example, see [Required ingestion configuration](concept-monitoring-mcc-data-product.md#required-ingestion-configuration) for the Monitoring - MCC Data Product.
    1. Ensure the folder path includes at least one directory; files copied into the root of the container won't be correctly ingested.
    1. Set the other fields as appropriate for your data.
5. Follow the Azure Data Factory documentation (for example, [Creating a pipeline with the UI](/azure/data-factory/concepts-pipelines-activities?tabs=data-factory#creating-a-pipeline-with-ui)) to create a pipeline with this new dataset as the sink.

Repeat this step for all required datasets.
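A dataset created through the steps above could be sketched in JSON as follows. This assumes a delimited-text output type and a Blob Storage linked service named `DataProductIngestion`; the dataset name, container placeholder, and `ingestion` folder are all illustrative:

```json
{
    "name": "ExampleDataTypeDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "DataProductIngestion",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "<data-type-name>",
                "folderPath": "ingestion"
            }
        }
    }
}
```

Note that the non-empty `folderPath` satisfies the requirement that files land in at least one directory below the container root.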
Your Azure Data Factory is now configured to connect to your Data Product.