articles/machine-learning/how-to-access-data-batch-endpoints-jobs.md
9 additions, 9 deletions
@@ -276,17 +276,17 @@ Batch endpoints provide a durable API that consumers can use to create batch job
Batch endpoints support two types of inputs:

-- [Data inputs](#data-inputs): Pointers to a specific storage location or Azure Machine Learning asset.
-- [Literal inputs](#literal-inputs): Literal values like numbers or strings that you want to pass to the job.
+- [Data inputs](#explore-data-inputs): Pointers to a specific storage location or Azure Machine Learning asset.
+- [Literal inputs](#explore-literal-inputs): Literal values like numbers or strings that you want to pass to the job.
The number and type of inputs and outputs depend on the [type of batch deployment](concept-endpoints-batch.md#batch-deployments). Model deployments always require one data input and produce one data output. Literal inputs aren't supported. However, pipeline component deployments provide a more general construct to build endpoints and allow you to specify any number of inputs (data and literal) and outputs.
The following table summarizes the inputs and outputs for batch deployments:
| Deployment type | Number of inputs | Supported input types | Number of outputs | Supported output types |
| --------------- | ---------------- | --------------------- | ----------------- | ---------------------- |
| [Model deployment](concept-endpoints-batch.md#model-deployment) | 1 | [Data inputs](#explore-data-inputs) | 1 | [Data outputs](#explore-data-outputs) |
| [Pipeline component deployment](concept-endpoints-batch.md#pipeline-component-deployment) | [0..N] | [Data inputs](#explore-data-inputs) and [literal inputs](#explore-literal-inputs) | [0..N] | [Data outputs](#explore-data-outputs) |
> [!TIP]
> Inputs and outputs are always named. The names serve as keys to identify the data and pass the actual value during invocation. Because model deployments always require one input and output, the name is ignored during invocation. You can assign the name that best describes your use case, such as "sales_estimation."
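Because inputs and outputs are keyed by name, an invocation request is essentially a map from input names to values. The following is a minimal sketch of such a REST payload built with the Python standard library; the field names (`properties`, `InputData`, `JobInputType`, `Uri`) follow the batch endpoints REST API as I understand it, and the input name, data store, and path are illustrative, so verify the exact schema against the REST samples for your API version.

```python
import json

# Hypothetical invocation payload for a batch endpoint job.
# Each input is keyed by its name -- here "sales_estimation", as in the tip above.
payload = {
    "properties": {
        "InputData": {
            "sales_estimation": {
                "JobInputType": "UriFolder",  # a folder of input files
                "Uri": "azureml://datastores/workspaceblobstore/paths/sales/",
            }
        }
    }
}

# Serialize the payload for the HTTP request body.
body = json.dumps(payload, indent=2)
print(body)
```

For a model deployment, only one entry would appear under `InputData`, and its name would be ignored during invocation.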
@@ -297,8 +297,8 @@ Data inputs refer to inputs that point to a location where data is placed. Becau
Batch endpoints support reading files located in the following storage options:
-- [Azure Machine Learning Data Assets](#input-data-from-a-data-asset), including Folder (`uri_folder`) and File (`uri_file`).
-- [Azure Machine Learning Data Stores](#input-data-from-data-stores), including Azure Blob Storage, Azure Data Lake Storage Gen1, and Azure Data Lake Storage Gen2.
+- [Azure Machine Learning Data Assets](#use-input-data-from-data-asset), including Folder (`uri_folder`) and File (`uri_file`).
+- [Azure Machine Learning Data Stores](#use-input-data-from-data-stores), including Azure Blob Storage, Azure Data Lake Storage Gen1, and Azure Data Lake Storage Gen2.
- [Azure Storage Accounts](#input-data-from-azure-storage-accounts), including Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, and Azure Blob Storage.
- Local data folders/files (Azure Machine Learning CLI or Azure Machine Learning SDK for Python). However, that operation results in the local data being uploaded to the default Azure Machine Learning Data Store of the workspace you're working on.
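Each of these storage options maps to a different input reference format. The sketch below shows two common `azureml:` URI shapes as I understand them (a registered data asset by name/version, and a path inside a registered data store); the asset, data store, and path names are made up for illustration.

```python
def data_asset_ref(name, version=None):
    """Reference a registered Azure Machine Learning data asset by name.

    With no version, fall back to the '@latest' form.
    """
    return f"azureml:{name}:{version}" if version else f"azureml:{name}@latest"


def datastore_path(datastore, path):
    """Reference a folder or file inside a registered data store."""
    return f"azureml://datastores/{datastore}/paths/{path}"


# Hypothetical names for illustration:
print(data_asset_ref("heart-dataset", "1"))  # azureml:heart-dataset:1
print(datastore_path("workspaceblobstore", "heart-disease-uci/data"))
```

Azure Storage Account inputs, by contrast, are typically referenced with their native `https://` or `abfss://` URLs rather than an `azureml:` URI.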
@@ -325,9 +325,9 @@ Data outputs refer to the location where the results of a batch job should be pl
## Create jobs with data inputs
-The following examples show how to create jobs, taking data inputs from [data assets](#input-data-from-a-data-asset), [data stores](#input-data-from-data-stores), and [Azure Storage Accounts](#input-data-from-azure-storage-accounts).
+The following examples show how to create jobs, taking data inputs from [data assets](#use-input-data-from-data-asset), [data stores](#use-input-data-from-data-stores), and [Azure Storage Accounts](#input-data-from-azure-storage-accounts).

-### Input data from data asset
+### Use input data from data asset
Azure Machine Learning data assets (formerly known as datasets) are supported as inputs for jobs. Follow these steps to run a batch endpoint job by using data stored in a registered data asset in Azure Machine Learning.
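With the Azure CLI, invoking the endpoint from a registered data asset reduces to a single `az ml batch-endpoint invoke` call. The sketch below assembles that command with the Python standard library; the endpoint and asset names are hypothetical, and it assumes your resource group and workspace are configured as CLI defaults.

```python
import shlex

# Hypothetical endpoint and data asset names -- substitute your own.
endpoint = "heart-classifier-batch"
asset_ref = "azureml:heart-dataset@latest"  # registered data asset, latest version

cmd = [
    "az", "ml", "batch-endpoint", "invoke",
    "--name", endpoint,
    "--input", asset_ref,
]
# Render the command as a shell-safe string.
print(shlex.join(cmd))
```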
@@ -491,7 +491,7 @@ Azure Machine Learning data assets (formerly known as datasets) are supported as
Content-Type: application/json
```
-### Input data from data stores
+### Use input data from data stores
You can directly reference data from Azure Machine Learning registered data stores with batch deployment jobs. In this example, you first upload some data to the default data store in the Azure Machine Learning workspace and then run a batch deployment on it. Follow these steps to run a batch endpoint job using data stored in a data store.
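A comparable CLI sketch for a data-store input, assuming the data was already uploaded to a folder in the workspace's default blob data store; the endpoint name, data store, and path are illustrative, and the `--input-type uri_folder` flag reflects the Azure CLI `ml` extension as I understand it, so check it against your installed CLI version.

```python
import shlex

# Hypothetical folder previously uploaded to the workspace's default data store.
input_path = "azureml://datastores/workspaceblobstore/paths/heart-disease-uci/data"

command = shlex.join([
    "az", "ml", "batch-endpoint", "invoke",
    "--name", "heart-classifier-batch",  # hypothetical endpoint name
    "--input", input_path,
    "--input-type", "uri_folder",        # the input points at a folder, not a file
])
print(command)
```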