Commit 309ffec

Merge pull request #224020 from v-lanjunli/addnewfeature

update

2 parents 79e16ae + da2d97e
2 files changed: +2 −1 lines changed
articles/synapse-analytics/quickstart-transform-data-using-spark-job-definition.md

Lines changed: 2 additions & 1 deletion
@@ -83,6 +83,7 @@ On this panel, you can reference the Spark job definition to run.
 | Property | Description |
 | ----- | ----- |
 |Main definition file| The main file used for the job. Select a PY/JAR/ZIP file from your storage. You can select **Upload file** to upload the file to a storage account. <br> Sample: `abfss://…/path/to/wordcount.jar`|
+| References from subfolders | Subfolders of the main definition file's root folder are scanned, and the files found there are added as reference files. Folders named "jars", "pyFiles", "files", or "archives" are scanned; the folder names are case-sensitive. |
 |Main class name| The fully qualified identifier of the main class in the main definition file. <br> Sample: `WordCount`|
 |Command-line arguments| You can add command-line arguments by selecting the **New** button. Note that added command-line arguments override the command-line arguments defined in the Spark job definition. <br> *Sample: `abfss://…/path/to/shakespeare.txt` `abfss://…/path/to/result`* <br> |
 |Apache Spark pool| You can select an Apache Spark pool from the list.|
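
The sample paths above suggest a classic word-count job that takes two command-line arguments: an input path and an output path. As a rough illustration of what such a main definition file does with those arguments, here is a minimal pure-Python sketch (a hypothetical stand-in, not the actual `wordcount.jar` and not PySpark):

```python
import re
import sys
from collections import Counter

def word_count(text: str) -> Counter:
    """Count occurrences of each lowercase word in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

def main(input_path: str, output_path: str) -> None:
    # input_path plays the role of the shakespeare.txt sample argument,
    # output_path the role of the result sample argument.
    with open(input_path, encoding="utf-8") as f:
        counts = word_count(f.read())
    with open(output_path, "w", encoding="utf-8") as f:
        for word, n in counts.most_common():
            f.write(f"{word}\t{n}\n")

if __name__ == "__main__" and len(sys.argv) >= 3:
    main(sys.argv[1], sys.argv[2])
```

In the real job the two paths would be `abfss://` URIs resolved by the Spark runtime rather than local files.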
@@ -92,7 +93,7 @@ On this panel, you can reference the Spark job definition to run.
 |Min executors| Minimum number of executors to be allocated in the specified Spark pool for the job.|
 |Max executors| Maximum number of executors to be allocated in the specified Spark pool for the job.|
 |Driver size| Number of cores and amount of memory to be used for the driver in the specified Apache Spark pool for the job.|
+|Spark configuration| Specify values for the Spark configuration properties listed in the topic Spark Configuration - Application properties. You can use the default configuration or a customized configuration. |

 ![spark job definition pipline settings](media/quickstart-transform-data-using-spark-job-definition/spark-job-definition-pipline-settings.png)
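
As an illustrative sketch of the "customized configuration" option (the specific properties and values below are assumptions for this example, not taken from the article), a customized Spark configuration might override application properties such as:

```
spark.executor.memory        4g
spark.executor.cores         2
spark.sql.shuffle.partitions 200
```

Each entry is a standard Spark application property name paired with a value; anything not overridden falls back to the pool's default configuration.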
