
Commit f755818

Merge pull request #248400 from v-lanjunli/addcannotchangevariable
add a note
2 parents bb53a97 + 18b94c3 commit f755818

File tree: 1 file changed (+16, −11 lines)


articles/synapse-analytics/spark/apache-spark-development-using-notebooks.md

Lines changed: 16 additions & 11 deletions
@@ -52,6 +52,11 @@ We provide rich operations to develop notebooks:
 + [Collapse a cell output](#collapse-a-cell-output)
 + [Notebook outline](#notebook-outline)
 
+> [!NOTE]
+>
+> In the notebooks, there is a SparkSession automatically created for you, stored in a variable called `spark`. Also there is a variable for SparkContext which is called `sc`. Users can access these variables directly and should not change the values of these variables.
+
+
 <h3 id="add-a-cell">Add a cell</h3>
 
 There are multiple ways to add a new cell to your notebook.
@@ -161,7 +166,7 @@ Select the **Undo** / **Redo** button or press **Z** / **Shift+Z** to revoke the
 ![Screenshot of Synapse undo cells of aznb](./media/apache-spark-development-using-notebooks/synapse-undo-cells-aznb.png)
 
 Supported undo cell operations:
-+ Insert/Delete cell: You could revoke the delete operations by selecting **Undo**, the text content will be kept along with the cell.
++ Insert/Delete cell: You could revoke the delete operations by selecting **Undo**, the text content is kept along with the cell.
 + Reorder cell.
 + Toggle parameter.
 + Convert between Code cell and Markdown cell.
@@ -271,7 +276,7 @@ Select the **Cancel All** button to cancel the running cells or cells waiting in
 
 ### Notebook reference
 
-You can use ```%run <notebook path>``` magic command to reference another notebook within current notebook's context. All the variables defined in the reference notebook are available in the current notebook. ```%run``` magic command supports nested calls but not support recursive calls. You will receive an exception if the statement depth is larger than **five**.
+You can use ```%run <notebook path>``` magic command to reference another notebook within current notebook's context. All the variables defined in the reference notebook are available in the current notebook. ```%run``` magic command supports nested calls but not support recursive calls. You receive an exception if the statement depth is larger than **five**.
 
 Example:
 ``` %run /<path>/Notebook1 { "parameterInt": 1, "parameterFloat": 2.5, "parameterBool": true, "parameterString": "abc" } ```.
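The reference semantics described above (shared variables, parameter injection, nesting capped at five levels) can be emulated in plain Python. This is an illustrative analogy only, not the actual `%run` implementation; the notebook path and its contents are invented:

```python
# Illustrative analogy: %run executes the referenced notebook's code in the
# caller's context, shares its variables, and limits nesting depth to five.
MAX_DEPTH = 5  # Synapse raises an exception beyond this statement depth

# Hypothetical stand-in for a notebook store: path -> source code.
NOTEBOOKS = {
    "/folder/Notebook1": "x = parameterInt + 1",
}

def run_notebook(path, params, namespace, depth=1):
    """Emulate `%run <path> { ... }`: inject parameters, execute in a shared namespace."""
    if depth > MAX_DEPTH:
        raise RuntimeError("%run nesting exceeds the five-level limit")
    namespace.update(params)          # parameters become variables in the shared context
    exec(NOTEBOOKS[path], namespace)  # variables defined there stay visible to the caller

ns = {}
run_notebook("/folder/Notebook1", {"parameterInt": 1}, ns)
print(ns["x"])  # the referenced notebook's variable is visible in the current context
```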
@@ -289,7 +294,7 @@ Notebook reference works in both interactive mode and Synapse pipeline.
 
 ### Variable explorer
 
-Synapse notebook provides a built-in variables explorer for you to see the list of the variables name, type, length, and value in the current Spark session for PySpark (Python) cells. More variables will show up automatically as they are defined in the code cells. Clicking on each column header will sort the variables in the table.
+Synapse notebook provides a built-in variables explorer for you to see the list of the variables name, type, length, and value in the current Spark session for PySpark (Python) cells. More variables show up automatically as they are defined in the code cells. Clicking on each column header sorts the variables in the table.
 
 You can select the **Variables** button on the notebook command bar to open or hide the variable explorer.
 

@@ -382,7 +387,7 @@ Parameterized session configuration allows you to replace the value in %%configu
 }
 ```
 
-Notebook will use default value if run a notebook in interactive mode directly or no parameter that match "activityParameterName" is given from Pipeline Notebook activity.
+Notebook uses default value if run a notebook in interactive mode directly or no parameter that match "activityParameterName" is given from Pipeline Notebook activity.
 
 During the pipeline run mode, you can configure pipeline Notebook activity settings as below:
 ![Screenshot of parameterized session configuration](./media/apache-spark-development-using-notebooks/parameterized-session-config.png)
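The fallback rule in that paragraph can be sketched in a few lines of Python. The field and parameter names below are invented for illustration, mirroring the `"activityParameterName"` / default-value keys the `%%configure` parameterization uses:

```python
def resolve_config_value(field, pipeline_params):
    """Pick the pipeline-supplied value if its activityParameterName matches;
    otherwise fall back to the default (interactive run, or no matching parameter)."""
    return pipeline_params.get(field["activityParameterName"], field["defaultValue"])

# Hypothetical parameterized field from a %%configure cell.
driver_cores = {"activityParameterName": "driverCoresFromNotebookActivity", "defaultValue": 4}

print(resolve_config_value(driver_cores, {}))                                      # interactive run: default used
print(resolve_config_value(driver_cores, {"driverCoresFromNotebookActivity": 8}))  # pipeline supplies a value
```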
@@ -450,7 +455,7 @@ You can access data in the primary storage account directly. There's no need to
 
 ## IPython Widgets
 
-Widgets are eventful Python objects that have a representation in the browser, often as a control like a slider, textbox etc. IPython Widgets only works in Python environment, it's not supported in other languages (e.g. Scala, SQL, C#) yet.
+Widgets are eventful Python objects that have a representation in the browser, often as a control like a slider, textbox etc. IPython Widgets only works in Python environment, it's not supported in other languages (for example, Scala, SQL, C#) yet.
 
 ### To use IPython Widget
 1. You need to import `ipywidgets` module first to use the Jupyter Widget framework.
@@ -468,11 +473,11 @@ Widgets are eventful Python objects that have a representation in the browser, o
    slider
    ```
 
-3. Run the cell, the widget will display at the output area.
+3. Run the cell, the widget displays at the output area.
 
    ![Screenshot of ipython widgets slider](./media/apache-spark-development-using-notebooks/ipython-widgets-slider.png)
 
-4. You can use multiple `display()` calls to render the same widget instance multiple times, but they will remain in sync with each other.
+4. You can use multiple `display()` calls to render the same widget instance multiple times, but they remain in sync with each other.
 
    ```python
    slider = widgets.IntSlider()
@@ -514,7 +519,7 @@ Widgets are eventful Python objects that have a representation in the browser, o
 |`widgets.jslink()`|You can use `widgets.link()` function to link two similar widgets.|
 |`FileUpload` widget| Not support yet.|
 
-2. Global `display` function provided by Synapse does not support displaying multiple widgets in 1 call (i.e. `display(a, b)`), which is different from IPython `display` function.
+2. Global `display` function provided by Synapse does not support displaying multiple widgets in one call (that is, `display(a, b)`), which is different from IPython `display` function.
 3. If you close a notebook that contains IPython Widget, you will not be able to see or interact with it until you execute the corresponding cell again.
 
 
@@ -568,7 +573,7 @@ Available cell magics:
 <h2 id="reference-unpublished-notebook">Reference unpublished notebook</h2>
 
 
-Reference unpublished notebook is helpful when you want to debug "locally", when enabling this feature, notebook run will fetch the current content in web cache, if you run a cell including a reference notebooks statement, you will reference the presenting notebooks in the current notebook browser instead of a saved versions in cluster, that means the changes in your notebook editor can be referenced immediately by other notebooks without having to be published(Live mode) or committed(Git mode), by leveraging this approach you can easily avoid common libraries getting polluted during developing or debugging process.
+Reference unpublished notebook is helpful when you want to debug "locally", when enabling this feature, notebook run fetches the current content in web cache, if you run a cell including a reference notebooks statement, you reference the presenting notebooks in the current notebook browser instead of a saved versions in cluster, that means the changes in your notebook editor can be referenced immediately by other notebooks without having to be published(Live mode) or committed(Git mode), by leveraging this approach you can easily avoid common libraries getting polluted during developing or debugging process.
 
 You can enable Reference unpublished notebook from Properties panel:
 
@@ -607,7 +612,7 @@ You can reuse your notebook sessions conveniently now without having to start ne
 
 ![Screenshot of notebook-manage-sessions](./media/apache-spark-development-using-notebooks/synapse-notebook-manage-sessions.png)
 
-In the **Active sessions** list you can see the session information and the corresponding notebook that is currently attached to the session. You can operate Detach with notebook, Stop the session, and View in monitoring from here. Moreover, you can easily connect your selected notebook to an active session in the list started from another notebook, the session will be detached from the previous notebook (if it's not idle) then attach to the current one.
+In the **Active sessions**, list you can see the session information and the corresponding notebook that is currently attached to the session. You can operate Detach with notebook, Stop the session, and View in monitoring from here. Moreover, you can easily connect your selected notebook to an active session in the list started from another notebook, the session is detached from the previous notebook (if it's not idle) then attach to the current one.
 
 ![Screenshot of notebook-sessions-list](./media/apache-spark-development-using-notebooks/synapse-notebook-sessions-list.png)
 
@@ -664,7 +669,7 @@ To parameterize your notebook, select the ellipses (...) to access the **more co
 
 ---
 
-Azure Data Factory looks for the parameters cell and treats this cell as defaults for the parameters passed in at execution time. The execution engine will add a new cell beneath the parameters cell with input parameters in order to overwrite the default values.
+Azure Data Factory looks for the parameters cell and treats this cell as defaults for the parameters passed in at execution time. The execution engine adds a new cell beneath the parameters cell with input parameters in order to overwrite the default values.
 
 
 ### Assign parameters values from a pipeline
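The override mechanism described for the parameters cell can be sketched as follows. This is an emulation with invented variable names, not the actual execution engine; the point is that the injected cell runs after the parameters cell and overwrites only the values the pipeline actually passed in:

```python
# Sketch: the parameters cell defines defaults; at execution time the engine
# appends a new cell that overwrites them with pipeline-supplied values.
parameters_cell = "rundate = '1900-01-01'\nlimit = 10"   # tagged parameters cell (defaults)
injected_cell = "rundate = '2024-01-15'"                 # cell added beneath it by the engine

ns = {}
exec(parameters_cell, ns)  # defaults are set first
exec(injected_cell, ns)    # pipeline inputs overwrite matching defaults only
print(ns["rundate"], ns["limit"])
```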
