
Commit a09b15a (1 parent: 10bd8c8)

Fix typo: python -> Python

1 file changed: +3 −3 lines changed


articles/machine-learning/tutorial-pipeline-python-sdk.md

```diff
@@ -174,7 +174,7 @@ Azure ML pipelines are reusable ML workflows that usually consist of several com
 
 ## Create component 1: data prep (using programmatic definition)
 
-Let's start by creating the first component. This component handles the preprocessing of the data. The preprocessing task is performed in the *data_prep.py* python file.
+Let's start by creating the first component. This component handles the preprocessing of the data. The preprocessing task is performed in the *data_prep.py* Python file.
 
 First create a source folder for the data_prep component:
 
```

```diff
@@ -228,9 +228,9 @@ Now that both your components are defined and registered, you can start implemen
 
 Here, you'll use *input data*, *split ratio* and *registered model name* as input variables. Then call the components and connect them via their inputs/outputs identifiers. The outputs of each step can be accessed via the `.outputs` property.
 
-The python functions returned by `load_component()` work as any regular python function that we'll use within a pipeline to call each step.
+The Python functions returned by `load_component()` work as any regular Python function that we'll use within a pipeline to call each step.
 
-To code the pipeline, you use a specific `@dsl.pipeline` decorator that identifies the Azure ML pipelines. In the decorator, we can specify the pipeline description and default resources like compute and storage. Like a python function, pipelines can have inputs. You can then create multiple instances of a single pipeline with different inputs.
+To code the pipeline, you use a specific `@dsl.pipeline` decorator that identifies the Azure ML pipelines. In the decorator, we can specify the pipeline description and default resources like compute and storage. Like a Python function, pipelines can have inputs. You can then create multiple instances of a single pipeline with different inputs.
 
 Here, we used *input data*, *split ratio* and *registered model name* as input variables. We then call the components and connect them via their inputs/outputs identifiers. The outputs of each step can be accessed via the `.outputs` property.
 
```
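The diffed paragraphs describe a pattern worth making concrete: a decorator marks a plain Python function as a pipeline definition, component calls return step objects, and later steps consume earlier ones through an `.outputs` property. The sketch below is a toy emulation of that pattern in pure Python, not the azure-ai-ml SDK itself; the names `make_component`, `pipeline`, `Step`, and the pipeline parameters are invented for illustration.

```python
from types import SimpleNamespace


class Step:
    """One pipeline step; `.outputs` exposes names a later step can consume."""
    def __init__(self, name, inputs, output_names):
        self.name = name
        self.inputs = inputs
        # Attribute-style access, mirroring the `.outputs` property in the prose.
        self.outputs = SimpleNamespace(**{o: f"{name}.{o}" for o in output_names})


def make_component(name, output_names):
    """Stand-in for load_component(): returns an ordinary Python callable
    that, when called with keyword inputs, records a Step."""
    def component(**inputs):
        return Step(name, inputs, output_names)
    return component


def pipeline(description=""):
    """Stand-in for the @dsl.pipeline decorator: attaches metadata (here just
    a description) to the decorated pipeline function."""
    def wrap(fn):
        fn.description = description
        return fn
    return wrap


# Hypothetical components, analogous to the data-prep and train steps.
data_prep = make_component("data_prep", ["train_data", "test_data"])
train = make_component("train", ["model"])


@pipeline(description="toy credit-default pipeline")
def credit_pipeline(input_data, split_ratio, registered_model_name):
    # Wire steps via their inputs/outputs identifiers, as in the tutorial text.
    prep = data_prep(data=input_data, test_train_ratio=split_ratio)
    trained = train(train_data=prep.outputs.train_data,
                    model_name=registered_model_name)
    return {"model": trained.outputs.model}


result = credit_pipeline("credit.csv", 0.25, "credit_model")
print(result["model"])  # → train.model
```

Because the pipeline definition is just a function, calling it again with different inputs yields a new, independently wired instance — the same property the diff attributes to `@dsl.pipeline`-decorated functions.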

0 commit comments