articles/machine-learning/how-to-create-component-pipeline-python.md (16 additions, 16 deletions)
@@ -32,13 +32,13 @@ The example pipeline trains a small [Keras](https://keras.io/) convolutional neu
In this article, you complete the following tasks:

> [!div class="checklist"]
-> * Prepare input data for the pipeline job
-> * Create three components to prepare the data, train an image, and score the model
-> * Build a pipeline from the components
-> * Get access to a workspace with compute
-> * Submit the pipeline job
-> * Review the output of the components and the trained neural network
-> * (Optional) Register the component for further reuse and sharing within the workspace
+> * Prepare input data for the pipeline job.
+> * Create three components to prepare the data, train an image classification model, and score the model.
+> * Build a pipeline from the components.
+> * Get access to a workspace that has compute.
+> * Submit the pipeline job.
+> * Review the output of the components and the trained neural network.
+> * (Optional) Register the component for further reuse and sharing within the workspace.
If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://azure.microsoft.com/free/) today.
@@ -95,11 +95,11 @@ For each component, you need to complete these steps:
The next section shows how to create the components in two ways. For the first two components, you use a Python function. For the third component, you use a YAML definition.
-### Create the data-preparation component
+### Create the data preparation component
The first component in this pipeline converts the compressed data files of `fashion_ds` into two .csv files, one for training and the other for scoring. You use a Python function to define this component.
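The split itself can be sketched with standard-library tools. The following is a minimal illustration, not the repo's `prep_component.py`: it assumes the compressed input is a gzipped CSV of labeled rows (the real `fashion_ds` files are MNIST-format archives that need extra decoding), and it simply writes a training file and a scoring file.

```python
import csv
import gzip
import os

def prep_data(input_gz: str, output_dir: str, train_fraction: float = 0.8) -> None:
    """Split a gzipped CSV of labeled rows into train.csv and score.csv."""
    # Read all rows from the compressed input in text mode.
    with gzip.open(input_gz, mode="rt", newline="") as f:
        rows = list(csv.reader(f))
    cut = int(len(rows) * train_fraction)
    os.makedirs(output_dir, exist_ok=True)
    # Write the first slice for training, the remainder for scoring.
    for name, chunk in (("train.csv", rows[:cut]), ("score.csv", rows[cut:])):
        with open(os.path.join(output_dir, name), "w", newline="") as out:
            csv.writer(out).writerows(chunk)
```

In the real component, the input and output paths arrive through the component's input and output ports rather than as plain function arguments.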
-If you're following along with the example in the [Azure Machine Learning examples repo](https://github.com/Azure/azureml-examples/tree/main/sdk/python/jobs/pipelines/2e_image_classification_keras_minist_convnet), the source files are already available in the `prep/` folder. This folder contains two files to construct the component: `prep_component.py`, which defines the component, and `conda.yaml`, which defines the runtime environment of the component.
+If you're following along with the example in the [Azure Machine Learning examples repo](https://github.com/Azure/azureml-examples/tree/main/sdk/python/jobs/pipelines/2e_image_classification_keras_minist_convnet), the source files are already available in the `prep` folder. This folder contains two files to construct the component: `prep_component.py`, which defines the component, and `conda.yaml`, which defines the runtime environment of the component.
#### Define the component by using a Python function
@@ -130,9 +130,9 @@ This is what a component looks like in the studio UI:
:::image type="content" source="./media/how-to-create-component-pipeline-python/prep-data-component.png" alt-text="Screenshot of the Prep Data component in the UI and code." lightbox="./media/how-to-create-component-pipeline-python/prep-data-component.png":::

-Now, you've prepared all source files for the `Prep Data` component.
+You've now prepared all source files for the `Prep Data` component.

-### Create the train model component
+### Create the model training component
In this section, you'll create a component for training the image classification model in a Python function, as you did with the `Prep Data` component.
@@ -144,7 +144,7 @@ The source files for this component are in the `train` folder in the [Azure Mach
* `train_component.py` defines the interface of the component and imports the function that's in `train.py`.
* `conda.yaml` defines the runtime environment of the component.

-#### Get a script that contains execution logic
+#### Get a script that contains the logic

The `train.py` file contains a normal Python function that performs the logic for training a Keras neural network for image classification. To view the code, see the [train.py file on GitHub](https://github.com/Azure/azureml-examples/tree/main/sdk/python/jobs/pipelines/2e_image_classification_keras_minist_convnet/train/train.py).
@@ -181,7 +181,7 @@ The `score.py` file contains a normal Python function that performs the training
-The code in score.py takes three command-line arguments: `input_data`, `input_model`, and `output_result`. The program scores the input model by using input data and then outputs the result.
+The code in `score.py` takes three command-line arguments: `input_data`, `input_model`, and `output_result`. The program scores the input model by using input data and then outputs the result.
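The argument handling described above can be sketched with `argparse`. This is an illustrative fragment, not the repo's actual `score.py` (which also loads the model and runs predictions):

```python
import argparse

def parse_args(argv=None):
    """Parse the three command-line arguments the scoring script expects."""
    parser = argparse.ArgumentParser(description="Score a trained model on input data.")
    parser.add_argument("--input_data", type=str, help="path to the data to score")
    parser.add_argument("--input_model", type=str, help="path to the trained model")
    parser.add_argument("--output_result", type=str, help="path to write the scoring result")
    return parser.parse_args(argv)

# Simulate the command line the pipeline would pass to the script.
args = parse_args(
    ["--input_data", "./data", "--input_model", "./model", "--output_result", "./result"]
)
print(args.input_model)  # prints ./model
```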
#### Define the component via YAML
@@ -199,7 +199,7 @@ In this section, you'll learn how to create a component specification in the val
* The `command` section specifies the command to execute when the component runs.
* The `environment` section contains a Docker image and a conda YAML file. The source file is in the [sample repository](https://github.com/Azure/azureml-examples/blob/v2samplesreorg/sdk/python/jobs/pipelines/2e_image_classification_keras_minist_convnet/score/conda.yaml).
-You now have all the source files for model scoring component.
+You now have all the source files for the model scoring component.
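A component specification of this shape might look roughly as follows. The field names follow the command-component YAML schema, but the values here are assumptions for illustration, not the repo's exact scoring spec:

```yaml
# Illustrative sketch of a command-component spec; values are assumptions.
$schema: https://azuremlschemas.azureedge.net/latest/commandComponent.schema.json
type: command
name: score_image_classification_keras
display_name: Score Image Classification Keras
inputs:
  input_data:
    type: uri_folder
  input_model:
    type: uri_folder
outputs:
  output_result:
    type: uri_folder
code: ./
command: >-
  python score.py
  --input_data ${{inputs.input_data}}
  --input_model ${{inputs.input_model}}
  --output_result ${{outputs.output_result}}
environment:
  conda_file: ./conda.yaml
  image: mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04
```

The `command` line maps the component's input and output ports onto the script's command-line arguments, and the `environment` section pairs the Docker image with the conda file.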
## Load the components to build a pipeline

@@ -265,7 +265,7 @@ Create an `MLClient` object to manage Azure Machine Learning services. If you us

#### Submit the pipeline job to the workspace
-Now you have a handle to your workspace, you can submit your pipeline job:
+Now that you have a handle to your workspace, you can submit your pipeline job:

@@ -284,7 +284,7 @@ You can monitor the pipeline run by selecting the link. Or you can block it unti

> [!IMPORTANT]
> The first pipeline run takes about 15 minutes. All dependencies are downloaded, a Docker image is created, and the Python environment is provisioned and created. Running the pipeline again takes significantly less time because those resources are reused instead of created. However, total runtime for the pipeline depends on the workload of your scripts and the processes that run in each pipeline step.

-### Checkout outputs and debug your pipeline in the UI
+### Check outputs and debug your pipeline in the UI
You can select `Link to Azure Machine Learning studio`, which opens the job detail page of your pipeline. You'll see the pipeline graph: