
Commit 8599daf

Merge pull request #219391 from v-lanjli/property
add property for python code reference
2 parents e50d7b3 + e06f2a5

File tree

1 file changed: +1 addition, -1 deletion


articles/synapse-analytics/quickstart-transform-data-using-spark-job-definition.md

Lines changed: 1 addition & 1 deletion
@@ -86,7 +86,7 @@ On this panel, you can reference to the Spark job definition to run.
 |Main class name| The fully qualified identifier or the main class that is in the main definition file. <br> Sample: `WordCount`|
 |Command-line arguments| You can add command-line arguments by clicking the **New** button. It should be noted that adding command-line arguments will override the command-line arguments defined by the Spark job definition. <br> *Sample: `abfss://…/path/to/shakespeare.txt` `abfss://…/path/to/result`* <br> |
 |Apache Spark pool| You can select Apache Spark pool from the list.|
-|Python code reference| Additional python code files used for reference in the main definition file. |
+|Python code reference| Additional python code files used for reference in the main definition file. <br> It supports passing files (.py, .py3, .zip) to the "pyFiles" property. It will override the "pyFiles" property defined in Spark job definition. <br>|
 |Reference files | Additional files used for reference in the main definition file. |
 |Dynamically allocate executors| This setting maps to the dynamic allocation property in Spark configuration for Spark Application executors allocation.|
 |Min executors| Min number of executors to be allocated in the specified Spark pool for the job.|
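The new wording points additional .py, .py3, and .zip files at Spark's "pyFiles" property. Spark places each pyFiles entry on the Python path of the driver and executors, which is why modules inside a referenced .zip become importable from the main definition file. Below is a minimal stdlib-only sketch of that mechanism; the module name `word_utils` and its contents are hypothetical, invented for illustration.

```python
import os
import sys
import tempfile
import zipfile

# A made-up helper module, standing in for code you would ship
# alongside the main definition file via the pyFiles property.
helper_source = (
    "def count_words(line):\n"
    "    return len(line.split())\n"
)

# Package the helper into a .zip, as you would before uploading it
# as a Python code reference for the Spark job definition.
tmp_dir = tempfile.mkdtemp()
zip_path = os.path.join(tmp_dir, "deps.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("word_utils.py", helper_source)

# Spark adds each pyFiles entry to sys.path on the driver and
# executors; simulate that locally so the module resolves.
sys.path.insert(0, zip_path)

import word_utils  # imported from inside deps.zip

print(word_utils.count_words("to be or not to be"))  # → 6
```

The same path-injection idea is what makes `spark-submit --py-files deps.zip` work, so a .zip that imports cleanly this way locally should also import inside the job.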
