
Commit c3e09be

Acrolinx fixes
1 parent 52eab64 commit c3e09be

1 file changed: +2 -4 lines

articles/machine-learning/how-to-submit-spark-jobs.md

Lines changed: 2 additions & 4 deletions
@@ -50,7 +50,7 @@ These prerequisites cover the submission of a Spark job from Azure Machine Learn
 ---
 
 > [!NOTE]
-> - For more infomration about resource access while using Azure Machine Learning serverless Spark compute and attached Synapse Spark pool, visit [Ensuring resource access for Spark jobs](apache-spark-environment-configuration.md#ensuring-resource-access-for-spark-jobs).
+> - For more information about resource access while using Azure Machine Learning serverless Spark compute and attached Synapse Spark pool, visit [Ensuring resource access for Spark jobs](apache-spark-environment-configuration.md#ensuring-resource-access-for-spark-jobs).
 > - Azure Machine Learning provides a [shared quota](how-to-manage-quotas.md#azure-machine-learning-shared-quota) pool, from which all users can access compute quota to perform testing for a limited time. When you use the serverless Spark compute, Azure Machine Learning allows you to access this shared quota for a short time.
 
 
@@ -131,13 +131,11 @@ df.to_csv(args.wrangled_data, index_col="PassengerId")
 > [!NOTE]
 > This Python code sample uses `pyspark.pandas`. Only the Spark runtime version 3.2 or later supports this.
 
-This script takes two arguments
+This script takes two arguments, which pass the path of input data and output folder, respectively:
 
 - `--titanic_data`
 - `--wrangled_data`
 
-which pass the path of input data and output folder respectively.
-
 # [Azure CLI](#tab/cli)
 [!INCLUDE [cli v2](includes/machine-learning-cli-v2.md)]
 
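The reworded sentence in this hunk describes a Python data-wrangling script that takes `--titanic_data` and `--wrangled_data`. As a point of reference, here is a minimal sketch of such a script, assuming `argparse` for argument parsing; only the two argument names and the final `to_csv` call (visible in the hunk header) come from the diff, and the rest is illustrative:

```python
# Minimal illustrative sketch; not the tutorial's actual wrangling script.
import argparse

import pyspark.pandas as pd  # requires Spark runtime version 3.2 or later

parser = argparse.ArgumentParser()
parser.add_argument("--titanic_data", help="path of the input data")
parser.add_argument("--wrangled_data", help="path of the output folder")
args = parser.parse_args()

# Read the input CSV into a pyspark.pandas DataFrame (assumed input format).
df = pd.read_csv(args.titanic_data, index_col="PassengerId")

# ... data wrangling steps would go here ...

# Write the wrangled data to the output folder, matching the call shown
# in the hunk header above.
df.to_csv(args.wrangled_data, index_col="PassengerId")
```

Because the sketch uses `pyspark.pandas`, it carries the same Spark runtime 3.2 or later requirement that the note in this hunk calls out.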