articles/machine-learning/service/reference-pipeline-yaml.md
Steps define a computational environment, along with the files to run on the environment.

| YAML key | Description |
| ----- | ----- |
| `script_name` | The name of the U-SQL script (relative to `source_directory`). |
| `compute_target` | The Azure Data Lake compute target to use for this step. |
| `parameters` | [Parameters](#parameters) to the pipeline. |
| `inputs` | TBD |
| `outputs` | TBD |
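As an illustrative sketch of how these keys might be combined in a U-SQL step (the `steps` nesting, the step name, and the `type` value are assumptions, not taken from this reference):

```yaml
# Hypothetical U-SQL (Azure Data Lake) step; nesting and type name are assumed.
steps:
  process_data:
    type: "AdlaStep"                  # assumed type identifier for a U-SQL step
    script_name: "extract.usql"       # relative to source_directory
    compute_target: adla-compute      # Azure Data Lake compute target
    parameters:
      process_count: 2                # example pipeline parameter
```

The `inputs` and `outputs` keys are omitted here because their format is still TBD in the table above.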
| YAML key | Description |
| ----- | ----- |
| `compute_target` | The Azure Batch compute target to use for this step. |
| `source_directory` | Directory that contains the module binaries, executable, assemblies, etc. |
| `executable` | Name of the command/executable that will be run as part of this job. |
| `create_pool` | Boolean flag to indicate whether to create the pool before running the job. |
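A minimal sketch of an Azure Batch step using these keys (the surrounding `steps` structure, step name, and `type` value are assumptions for illustration):

```yaml
# Hypothetical Azure Batch step; nesting and type name are assumed.
steps:
  batch_job:
    type: "AzureBatchStep"            # assumed type identifier
    compute_target: batch-compute     # Azure Batch compute target
    source_directory: ./batch_module  # binaries, executable, assemblies
    executable: "run_job.exe"         # command run as part of this job
    create_pool: true                 # create the pool before running the job
```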
| YAML key | Description |
| ----- | ----- |
| `compute_target` | The Azure Databricks compute target to use for this step. |
| `run_name` | The name in Databricks for this run. |
| `source_directory` | Directory that contains the script and other files. |
| `num_workers` | The static number of workers for the Databricks run cluster. |
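These keys might appear together in a Databricks step as follows (step nesting and the `type` value are assumptions, not part of this reference):

```yaml
# Hypothetical Azure Databricks step; nesting and type name are assumed.
steps:
  databricks_etl:
    type: "DatabricksStep"                 # assumed type identifier
    compute_target: databricks-compute     # Azure Databricks compute target
    run_name: "nightly-etl"                # name shown in Databricks for this run
    source_directory: ./databricks_scripts # script and supporting files
    num_workers: 4                         # static worker count for the run cluster
```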
| YAML key | Description |
| ----- | ----- |
| `compute_target` | The Azure Data Factory compute target to use for this step. |
| `source_data_reference` | Input connection that serves as the source of data transfer operations. Supported values are TBD. |
| `destination_data_reference` | Input connection that serves as the destination of data transfer operations. Supported values are TBD. |
| `allow_reuse` | Determines whether the step should reuse previous results when re-run with the same settings. |
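A data transfer step might combine these keys as shown below (the `steps` nesting, the `type` value, and the data reference names are illustrative assumptions; supported data reference values are still TBD above):

```yaml
# Hypothetical Azure Data Factory transfer step; nesting and type name are assumed.
steps:
  transfer_step:
    type: "DataTransferStep"              # assumed type identifier
    compute_target: adf-compute           # Azure Data Factory compute target
    source_data_reference: blob_source    # hypothetical source connection
    destination_data_reference: adls_dest # hypothetical destination connection
    allow_reuse: true                     # reuse prior results when settings match
```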
| YAML key | Description |
| ----- | ----- |
| `compute_target` | The compute target to use for this step. The compute target can be an Azure Machine Learning Compute, Virtual Machine (such as the Data Science VM), or HDInsight. |
| `script_name` | The name of the Python script (relative to `source_directory`). |
| `source_directory` | Directory that contains the script, Conda environment, etc. |
| `runconfig` | The path to a `.runconfig` file. This file is a YAML representation of the [RunConfiguration](https://docs.microsoft.com/python/api/azureml-core/azureml.core.runconfiguration?view=azure-ml-py) class. For more information on the structure of this file, see [TBD]. |
| `allow_reuse` | Determines whether the step should reuse previous results when re-run with the same settings. |
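A Python script step using these keys might be sketched as follows (the `steps` nesting, step name, and `type` value are assumptions for illustration):

```yaml
# Hypothetical Python script step; nesting and type name are assumed.
steps:
  train_step:
    type: "PythonScriptStep"        # assumed type identifier
    compute_target: aml-compute     # AML Compute, a VM, or HDInsight
    script_name: "train.py"         # relative to source_directory
    source_directory: ./scripts     # script, Conda environment, etc.
    runconfig: "train.runconfig"    # YAML form of the RunConfiguration class
    allow_reuse: true               # reuse prior results when settings match
```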
## Inputs

| YAML key | Description |
| ----- | ----- |
| `type` | The type of input. Valid values are `mount` and `download`. |
| `path_on_compute` | For `download` mode, the local path the step will read the data from. |
| `overwrite` | For `download` mode, indicates whether to overwrite existing data. |
| `source` | The data source. This can refer to [Parameters](#parameters). |
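An input entry using these keys might look like the following sketch (the input name, the nesting under a step, and the `source` value are illustrative assumptions):

```yaml
# Hypothetical step input; the name and nesting are assumed.
inputs:
  training_data:
    type: download                  # mount or download
    path_on_compute: /data/training # local read path, used in download mode
    overwrite: false                # don't overwrite existing downloaded data
    source: input_dataset           # hypothetical data source reference
```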