Commit 91629a3: "add optional" (1 parent: cce341a)

File tree

1 file changed: +9, −9 lines

articles/synapse-analytics/synapse-notebook-activity.md

Lines changed: 9 additions & 9 deletions
@@ -27,19 +27,19 @@ You can create a Synapse notebook activity directly from the Synapse pipeline ca
Drag and drop **Synapse notebook** under **Activities** onto the Synapse pipeline canvas. Select the Synapse notebook activity box and configure the notebook content for the current activity in the **Settings** tab. You can select an existing notebook from the current workspace or add a new one.

-You can also select an Apache Spark pool in the settings. Note that the Apache Spark pool set here replaces the Apache Spark pool used in the notebook. If no Apache Spark pool is selected in the activity settings, the Apache Spark pool selected in the notebook is used to run.
+(Optional) You can also reconfigure the Spark pool, executor size, dynamic executor allocation, min executors, max executors, and driver size in **Settings**. Note that the settings reconfigured here replace the settings of the Configure session in the notebook. If nothing is set in the settings of the current activity, the notebook runs with its own Configure session settings.
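The override behavior described in the added line can be sketched as a simple precedence rule: any value set on the activity replaces the notebook's own Configure session value, and unset fields fall back to the notebook. A minimal sketch, where the field names (`sparkPool`, `executorSize`, and so on) are illustrative assumptions, not the actual pipeline schema:

```python
# Sketch of the settings precedence described above.
# Field names are illustrative assumptions, not the real pipeline JSON schema.

notebook_session = {         # values from the notebook's "Configure session"
    "sparkPool": "notebookPool",
    "executorSize": "Small",
    "minExecutors": 1,
    "maxExecutors": 4,
}

activity_settings = {        # values set on the notebook activity (None = not set)
    "sparkPool": "activityPool",
    "executorSize": None,    # not set -> fall back to the notebook's value
    "minExecutors": 2,
    "maxExecutors": None,
}

# Activity settings replace the notebook's session settings; unset fields fall back.
effective = {
    key: activity_settings.get(key)
    if activity_settings.get(key) is not None
    else value
    for key, value in notebook_session.items()
}

print(effective)
# {'sparkPool': 'activityPool', 'executorSize': 'Small', 'minExecutors': 2, 'maxExecutors': 4}
```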

![Screenshot showing how to create a Synapse notebook activity](./media/synapse-notebook-activity/create-synapse-notebook-activity.png)

-| Property | Description |
-| ----- | ----- |
-|Spark pool| Reference to the Spark pool. You can select an Apache Spark pool from the list. If this setting is empty, the activity runs in the Spark pool of the notebook itself.|
-|Executor size| Number of cores and memory to be used for executors allocated in the specified Apache Spark pool for the session.|
-|Dynamically allocate executors| Maps to the dynamic allocation property in the Spark configuration for Spark application executor allocation.|
-|Min executors| Minimum number of executors to be allocated in the specified Spark pool for the job.|
-|Max executors| Maximum number of executors to be allocated in the specified Spark pool for the job.|
-|Driver size| Number of cores and memory to be used for the driver in the specified Apache Spark pool for the job.|
+| Property | Description | Required |
+| ----- | ----- | ----- |
+|Spark pool| Reference to the Spark pool. You can select an Apache Spark pool from the list. If this setting is empty, the activity runs in the Spark pool of the notebook itself.| No |
+|Executor size| Number of cores and memory to be used for executors allocated in the specified Apache Spark pool for the session.| No |
+|Dynamically allocate executors| Maps to the dynamic allocation property in the Spark configuration for Spark application executor allocation.| No |
+|Min executors| Minimum number of executors to be allocated in the specified Spark pool for the job.| No |
+|Max executors| Maximum number of executors to be allocated in the specified Spark pool for the job.| No |
+|Driver size| Number of cores and memory to be used for the driver in the specified Apache Spark pool for the job.| No |
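As a rough illustration of how the properties in the table might appear in a pipeline's JSON definition, here is a sketch built as a Python dict. The property names (`sparkPool`, `executorSize`, `conf`, and so on) are assumptions based on the table above, not an authoritative schema; export a real pipeline from Synapse Studio to see the exact JSON. The `spark.dynamicAllocation.*` keys, however, are standard Spark configuration names:

```python
import json

# Illustrative sketch of a Synapse notebook activity definition.
# Property names are assumptions, not an authoritative schema.
notebook_activity = {
    "name": "RunNotebook",
    "type": "SynapseNotebook",
    "typeProperties": {
        "notebook": {"referenceName": "MyNotebook", "type": "NotebookReference"},
        "sparkPool": {"referenceName": "MySparkPool", "type": "BigDataPoolReference"},
        "executorSize": "Medium",   # cores and memory per executor
        "driverSize": "Medium",     # cores and memory for the driver
        "conf": {
            # "Dynamically allocate executors" maps to these standard Spark conf keys.
            "spark.dynamicAllocation.enabled": True,
            "spark.dynamicAllocation.minExecutors": 2,
            "spark.dynamicAllocation.maxExecutors": 10,
        },
    },
}

print(json.dumps(notebook_activity, indent=2))
```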
> [!NOTE]
> Parallel Spark notebook runs in Azure Synapse pipelines are queued and executed in a FIFO manner. Jobs are ordered in the queue by submission time, and a job expires from the queue after 3 days. Note that the notebook queue only works in Synapse pipelines.
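The queueing behavior in the note can be modeled as a plain FIFO queue with a 3-day expiry. This is a toy sketch of the described semantics, not Synapse's actual scheduler:

```python
from collections import deque
from datetime import datetime, timedelta

EXPIRY = timedelta(days=3)   # a queued job expires after 3 days

queue = deque()              # FIFO: jobs run in submission order

def submit(job_name, now):
    queue.append((job_name, now))

def next_runnable(now):
    """Pop jobs in FIFO order, discarding any queued longer than EXPIRY."""
    while queue:
        job_name, submitted = queue.popleft()
        if now - submitted <= EXPIRY:
            return job_name  # oldest non-expired job runs first
    return None

t0 = datetime(2024, 1, 1)
submit("notebook-A", t0)
submit("notebook-B", t0 + timedelta(hours=1))

first = next_runnable(t0 + timedelta(days=2))   # notebook-A: FIFO head, still fresh
second = next_runnable(t0 + timedelta(days=4))  # None: notebook-B sat > 3 days
print(first, second)
```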
