articles/hdinsight/spark/apache-spark-python-package-installation.md
You can search the [package index](https://pypi.python.org/pypi) for the complete list of packages that are available. You can also get a list of available packages from other sources. For example, you can install packages made available through [conda-forge](https://conda-forge.org/feedstocks/).
Use the command below if you would like to install a library with its latest version:

- `seaborn` is the package name that you would like to install.
- `-n py35new` specifies the virtual environment that was just created. Make sure to change the name to match your own virtual environment.
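Putting those pieces together, the command would look something like the following. This is a sketch that reuses the `/usr/bin/anaconda/bin/conda` path shown later in this article; whether `sudo` is needed depends on your cluster configuration.

```shell
# Install the latest available version of seaborn into the py35new environment
sudo /usr/bin/anaconda/bin/conda install seaborn -n py35new
```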
Use the command below if you would like to install a library with a specific version:
Or use a conda channel:

- `numpy=1.16.1` is the package name and version that you would like to install.
- `-n py35new` specifies the virtual environment that was just created. Make sure to change the name to match your own virtual environment.
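For a version-pinned install, the command would look something like the sketch below. The conda path is the one shown later in this article; `sudo`, and `conda-forge` as the example channel (mentioned earlier in this article), are illustrative assumptions.

```shell
# Install a pinned version of numpy into the py35new environment
sudo /usr/bin/anaconda/bin/conda install numpy=1.16.1 -n py35new

# Or pull the pinned package from a specific channel, for example conda-forge
sudo /usr/bin/anaconda/bin/conda install numpy=1.16.1 -c conda-forge -n py35new
```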
If you don't know the virtual environment name, you can SSH to the head node of the cluster and run `/usr/bin/anaconda/bin/conda info -e` to show all virtual environments.
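For example, run the following on the head node; conda prints each environment name alongside its path (the names and paths shown in the comment are illustrative):

```shell
# List every conda environment on this node with its install location
/usr/bin/anaconda/bin/conda info -e

# Expected shape of the output (environment names/paths will vary):
# # conda environments:
# #
# base        *  /usr/bin/anaconda
# py35new        /usr/bin/anaconda/envs/py35new
```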