articles/machine-learning/how-to-use-batch-azure-data-factory.md (+12 −7)
@@ -38,7 +38,7 @@ Azure Data Factory can invoke the REST APIs of batch endpoints by using the [Web
 You can use a service principal or a [managed identity](../active-directory/managed-identities-azure-resources/overview.md) to authenticate against Batch Endpoints. We recommend using a managed identity as it simplifies the use of secrets.
 
 > [!IMPORTANT]
-> When your data is stored in cloud locations instead of Azure Machine Learning Data Stores, the identity of the compute is used to read the data instead of the identity used to invoke the endpoint.
+> Batch Endpoints can consume data stored in storage accounts instead of Azure Machine Learning Data Stores or Data Assets. However, you may need to configure additional permissions for the identity of the compute where the batch endpoint runs. See [Security considerations when reading data](how-to-access-data-batch-endpoints-jobs.md#security-considerations-when-reading-data).
 
 
 # [Using a Managed Identity](#tab/mi)
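Under the hood, the Web Activity obtains a Microsoft Entra access token (for the `https://ml.azure.com` audience when using a managed identity) and sends an authenticated POST to the batch endpoint's invocation URI. The following is a minimal sketch of that request, not the article's template; the input name `input_data` and the URIs are placeholders:

```python
import json

def build_invoke_request(scoring_uri: str, token: str, input_uri: str) -> dict:
    """Sketch of the REST call sent to a batch endpoint's invocation URI.

    `token` is a Microsoft Entra access token for the
    `https://ml.azure.com` audience, obtained by the managed identity
    or service principal. `input_data` is a hypothetical input name.
    """
    body = {
        "properties": {
            "InputData": {
                "input_data": {
                    "JobInputType": "UriFolder",
                    "Uri": input_uri,
                }
            }
        }
    }
    return {
        "method": "POST",
        "url": scoring_uri,
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(body),
    }
```

This is what the imported pipeline template assembles for you; you normally only supply the parameter values, not the raw request.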
@@ -132,11 +132,16 @@ The pipeline requires the following parameters to be configured:
 To create this pipeline in your existing Azure Data Factory, follow these steps:
 
 1. Open Azure Data Factory Studio and under __Factory Resources__ click the plus sign.
-2. Select __Pipeline__ > __Import from pipeline template__
-3. You will be prompted to select a `zip` file. Uses [the following template if using managed identities](https://azuremlexampledata.blob.core.windows.net/data/templates/batch-inference/Run-BatchEndpoint-MI.zip) or [the following one if using a service principal](https://azuremlexampledata.blob.core.windows.net/data/templates/batch-inference/Run-BatchEndpoint-SP.zip).
-4. A preview of the pipeline will show up in the portal. Click __Use this template__.
-5. The pipeline will be created for you with the name __Run-BatchEndpoint__.
-6. Configure the parameters of the batch deployment you are using:
+
+1. Select __Pipeline__ > __Import from pipeline template__.
+
+1. You will be prompted to select a `zip` file. Use [the following template if using managed identities](https://azuremlexampledata.blob.core.windows.net/data/templates/batch-inference/Run-BatchEndpoint-MI.zip) or [the following one if using a service principal](https://azuremlexampledata.blob.core.windows.net/data/templates/batch-inference/Run-BatchEndpoint-SP.zip).
+
+1. A preview of the pipeline will show up in the portal. Click __Use this template__.
+
+1. The pipeline will be created for you with the name __Run-BatchEndpoint__.
+
+1. Configure the parameters of the batch deployment you are using:
 
 
 # [Using a Managed Identity](#tab/mi)
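The steps above configure and run the imported pipeline from the Studio UI. The same pipeline can also be started programmatically through Data Factory's *Pipelines - Create Run* REST operation, passing the parameter values in the request body. A minimal sketch; the subscription, resource group, factory, and parameter names are placeholders, not values from this article:

```python
def build_create_run_request(subscription_id: str, resource_group: str,
                             factory_name: str, parameters: dict) -> dict:
    """Build the ADF 'Pipelines - Create Run' REST call that starts the
    imported Run-BatchEndpoint pipeline with explicit parameter values.

    The body of a createRun call is simply the name/value map of the
    pipeline's parameters.
    """
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}"
        "/pipelines/Run-BatchEndpoint/createRun"
        "?api-version=2018-06-01"
    )
    return {"method": "POST", "url": url, "body": parameters}

# Hypothetical usage; parameter names must match the imported template.
request = build_create_run_request(
    "00000000-0000-0000-0000-000000000000", "my-rg", "my-factory",
    {"endpoint_uri": "https://example.inference.ml.azure.com/jobs"},
)
```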
@@ -152,7 +157,7 @@ To create this pipeline in your existing Azure Data Factory, follow these steps:
 > Ensure that your batch endpoint has a default deployment configured before submitting a job to it. The created pipeline will invoke the endpoint and hence a default deployment needs to be created and configured.
 
 > [!TIP]
-> For best reusability, use the created pipeline as a template and call it from within other Azure Data Factory pipelines by leveraging the [Execute pipeline activity](../data-factory/control-flow-execute-pipeline-activity.md). In that case, do not configure the parameters in the created pipeline but pass them when you are executing the pipeline.
+> For best reusability, use the created pipeline as a template and call it from within other Azure Data Factory pipelines by leveraging the [Execute pipeline activity](../data-factory/control-flow-execute-pipeline-activity.md). In that case, do not configure the parameters in the inner pipeline but pass them as parameters from the outer pipeline as shown in the following image:
 >
 > :::image type="content" source="./media/how-to-use-batch-adf/pipeline-run.png" alt-text="Screenshot of the pipeline parameters expected for the resulting pipeline when invoked from another pipeline.":::
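In JSON terms, the tip amounts to an `ExecutePipeline` activity in the outer pipeline that references `Run-BatchEndpoint` and forwards its own parameters. A hedged sketch of that activity definition, rendered here as a Python dict; the activity name and the `endpoint_uri` parameter are hypothetical, and real names must match those declared by the inner pipeline:

```python
# Sketch of an Execute Pipeline activity as it would appear in the
# outer pipeline's JSON definition. Parameter names are placeholders.
execute_pipeline_activity = {
    "name": "Invoke-Run-BatchEndpoint",
    "type": "ExecutePipeline",
    "typeProperties": {
        "pipeline": {
            "referenceName": "Run-BatchEndpoint",
            "type": "PipelineReference",
        },
        # Block the outer pipeline until the batch job finishes.
        "waitOnCompletion": True,
        "parameters": {
            # Forward the outer pipeline's own parameter via an expression.
            "endpoint_uri": {
                "value": "@pipeline().parameters.endpoint_uri",
                "type": "Expression",
            },
        },
    },
}
```

Keeping the parameters unset in the inner pipeline and supplying them here is what makes the template reusable across callers.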