articles/hdinsight/spark/apache-spark-jupyter-notebook-install-locally.md
9 additions & 5 deletions
@@ -7,7 +7,7 @@ ms.reviewer: jasonh
ms.service: hdinsight
ms.custom: hdinsightactive
ms.topic: conceptual
-ms.date: 06/06/2019
+ms.date: 11/07/2019
---
# Install Jupyter notebook on your computer and connect to Apache Spark on HDInsight
@@ -25,9 +25,9 @@ For more information about the custom kernels and the Spark magic available for
## Prerequisites
-The prerequisites listed here are not for installing Jupyter. These are for connecting the Jupyter notebook to an HDInsight cluster once the notebook is installed.
+* An Apache Spark cluster on HDInsight. For instructions, see [Create Apache Spark clusters in Azure HDInsight](apache-spark-jupyter-spark-sql.md). This is a prerequisite for connecting the Jupyter notebook to an HDInsight cluster once the notebook is installed.
-* An Apache Spark cluster on HDInsight. For instructions, see [Create Apache Spark clusters in Azure HDInsight](apache-spark-jupyter-spark-sql.md).
+* Familiarity with using Jupyter Notebooks with Spark on HDInsight.
## Install Jupyter notebook on your computer
@@ -41,7 +41,7 @@ Download the [Anaconda installer](https://www.anaconda.com/download/) for your p
|Cluster version | Install command |
|---|---|
-|v3.6 and v3.5 |`pip install sparkmagic==0.12.7`|
+|v3.6 and v3.5 |`pip install sparkmagic==0.13.1`|
|v3.4|`pip install sparkmagic==0.2.3`|
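The cluster-version-to-package-pin mapping in the table above can be sketched as a small helper. This is illustrative only: the dictionary and function names are invented here, and only the version pins come from the article.

```python
# A sketch of the cluster-version -> sparkmagic pin mapping from the table.
# SPARKMAGIC_BY_CLUSTER and install_command are hypothetical names.
SPARKMAGIC_BY_CLUSTER = {
    "3.6": "0.13.1",
    "3.5": "0.13.1",
    "3.4": "0.2.3",
}

def install_command(cluster_version: str) -> str:
    """Return the pip command for a supported HDInsight cluster version."""
    try:
        pin = SPARKMAGIC_BY_CLUSTER[cluster_version]
    except KeyError:
        raise ValueError(f"unsupported HDInsight cluster version: {cluster_version}")
    return f"pip install sparkmagic=={pin}"

print(install_command("3.6"))  # pip install sparkmagic==0.13.1
```

Pinning an exact `sparkmagic` version matters here because the kernel wire protocol must match what the cluster's Livy endpoint expects.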
1. Ensure `ipywidgets` is properly installed by running the following command:
@@ -111,6 +111,10 @@ In this section, you configure the Spark magic that you installed earlier to con
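As a sketch of what that configuration involves: sparkmagic reads `~/.sparkmagic/config.json` to find the cluster's Livy endpoint and credentials. The block below builds such a file's contents in Python. `CLUSTERNAME`, the username, and the password are placeholders, and the field names should be checked against the example config shipped with your sparkmagic version.

```python
import json
import pathlib

# Hedged sketch of a ~/.sparkmagic/config.json that points the Python
# kernel at a cluster's Livy endpoint. CLUSTERNAME and the credentials
# are placeholders, not real values.
config = {
    "kernel_python_credentials": {
        "username": "admin",
        "password": "<cluster login password>",
        "url": "https://CLUSTERNAME.azurehdinsight.net/livy",
    }
}

config_path = pathlib.Path.home() / ".sparkmagic" / "config.json"
# Uncomment to actually write the file:
# config_path.parent.mkdir(exist_ok=True)
# config_path.write_text(json.dumps(config, indent=2))
print(json.dumps(config, indent=2))
```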
@@ -165,7 +169,7 @@ There can be a number of reasons why you might want to install Jupyter on your c
* With the notebooks available locally, you can connect to different Spark clusters based on your application requirement.
* You can use GitHub to implement a source control system and have version control for the notebooks. You can also have a collaborative environment where multiple users can work with the same notebook.
* You can work with notebooks locally without even having a cluster up. You only need a cluster to test your notebooks against, not to manually manage your notebooks or a development environment.
-* It may be easier to configure your own local development environment than it is to configure the Jupyter installation on the cluster. You can take advantage of all the software you have installed locally without configuring one or more remote clusters.
+* It may be easier to configure your own local development environment than it is to configure the Jupyter installation on the cluster. You can take advantage of all the software you've installed locally without configuring one or more remote clusters.
> [!WARNING]
> With Jupyter installed on your local computer, multiple users can run the same notebook on the same Spark cluster at the same time. In such a situation, multiple Livy sessions are created. If you run into an issue and want to debug that, it will be a complex task to track which Livy session belongs to which user.