
Commit dc8fc1a

Update for score
1 parent 913dfa9 commit dc8fc1a

1 file changed: +17 -17 lines


articles/synapse-analytics/spark/apache-spark-manage-session-packages.md

Lines changed: 17 additions & 17 deletions
@@ -14,12 +14,12 @@ ms.subservice: spark

In addition to pool level packages, you can also specify session-scoped libraries at the beginning of a notebook session. Session-scoped libraries let you specify and use Python, jar, and R packages within a notebook session.

-When using session-scoped libraries, it is important to keep the following points in mind:
+When using session-scoped libraries, it's important to keep the following points in mind:

- When you install session-scoped libraries, only the current notebook has access to the specified libraries.
-- These libraries will not impact other sessions or jobs using the same Spark pool.
-- These libraries are installed on top of the base runtime and pool level libraries, and take thr highest precedence.
-- Session-scoped libraries do not persist across sessions.
+- These libraries have no impact on other sessions or jobs using the same Spark pool.
+- These libraries install on top of the base runtime and pool level libraries, and take the highest precedence.
+- Session-scoped libraries don't persist across sessions.

## Session-scoped Python packages

@@ -29,18 +29,18 @@ To specify session-scoped Python packages:

1. Navigate to the selected Spark pool and ensure that you have enabled session-level libraries. You can enable this setting by navigating to the **Manage** > **Apache Spark pool** > **Packages** tab.
   :::image type="content" source="./media/apache-spark-azure-portal-add-libraries/enable-session-packages.png" alt-text="Screenshot of enabling session packages." lightbox="./media/apache-spark-azure-portal-add-libraries/enable-session-packages.png":::
-2. Once the setting has been applied, you can open a notebook and select **Configure Session**> **Packages**.
+2. Once the setting is applied, you can open a notebook and select **Configure Session** > **Packages**.
   ![Screenshot of specifying session packages.](./media/apache-spark-azure-portal-add-libraries/update-session-notebook.png "Update session configuration")
   ![Screenshot of uploading Yml file.](./media/apache-spark-azure-portal-add-libraries/upload-session-notebook-yml.png)
-3. Here, you can upload a Conda *environment.yml* file to install or upgrade packages within a session. Once you start your session, the specified libraries will be installed. Once your session ends, these libraries will no longer be available as they are specific to your session.
+3. Here, you can upload a Conda *environment.yml* file to install or upgrade packages within a session. The specified libraries are installed once the session starts, and are no longer available after the session ends.

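For illustration, a minimal Conda *environment.yml* uploaded in step 3 might look like the sketch below. The channel, package names, and version pins are hypothetical placeholders, not taken from the article; replace them with the packages your session needs.

```yaml
# Illustrative session environment; names and versions are placeholders
name: session-env
channels:
  - conda-forge
dependencies:
  - pip
  - pip:
      - vega_datasets==0.9.0
```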
### Manage session-scoped Python packages through *%pip* and *%conda* commands

-You can leverage the popular *%pip* and *%conda* commands to install additional 3rd party libraries or your custom libraries during your Apache Spark notebook session. In this section, we will use *%pip* commands to demonstrate several common scenarios.
+You can use the popular *%pip* and *%conda* commands to install additional third party libraries or your custom libraries during your Apache Spark notebook session. In this section, we use *%pip* commands to demonstrate several common scenarios.

-### Install a third-party package
+### Install a third party package

-You can easily install an Python library from [PyPI](https://pypi.org/).
+You can easily install a Python library from [PyPI](https://pypi.org/).

```python
# Install vega_datasets
@@ -92,7 +92,7 @@ You can use the following command to see what's the built-in version of certain
%pip show pandas
```

-The result will be as following:
+The result is shown in the following log:

```markdown
Name: pandas
@@ -110,7 +110,7 @@ You can use the following command to switch *pandas* to another version, let's s

### Uninstall a session-scoped library

-If you want to uninstall a package which was installed on this notebook session, you may refer to following commands. However, you cannot uninstall the built-in packages.
+If you want to uninstall a package that was installed in this notebook session, refer to the following commands. However, you can't uninstall the built-in packages.

```python
%pip uninstall altair vega_datasets --yes
@@ -146,7 +146,7 @@ To specify session-scoped Java or Scala packages, you can use the ```%%configure
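The ```%%configure``` cell magic referenced in the context line above takes a JSON body; a minimal sketch is shown below. The `spark.jars.packages` key is a standard Spark setting, but the Maven coordinate here is a hypothetical placeholder, not a package from the article.

```
%%configure -f
{
    "conf": {
        "spark.jars.packages": "com.example:my-library_2.12:1.0.0"
    }
}
```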

## Session-scoped R packages (Preview)

-Azure Synapse Analytics pools include many popular R libraries out-of-the-box. You can also install additional 3rd party libraries during your Apache Spark notebook session.
+Azure Synapse Analytics pools include many popular R libraries out-of-the-box. You can also install extra third party libraries during your Apache Spark notebook session.

> [!NOTE]
>
@@ -162,7 +162,7 @@ You can easily install an R library from [CRAN](https://cran.r-project.org/).
install.packages(c("nycflights13", "Lahman"))
```

-You can also leverage CRAN snapshots as the repository to ensure that the same package version is downloaded each time.
+You can also use CRAN snapshots as the repository to ensure that the same package version is downloaded each time.

```r
install.packages("highcharter", repos = "https://cran.microsoft.com/snapshot/2021-07-16/")
@@ -215,22 +215,22 @@ packageVersion("caesar")

### Remove an R package from a session

-You can use the ```detach``` function to remove a library from the namespace. These libraries will stay on disk until they are loaded again.
+You can use the ```detach``` function to remove a library from the namespace. These libraries stay on disk until they're loaded again.

```r
# detach a library

detach("package: caesar")
```

-To remove a session-scoped package from a notebook, use the ```remove.packages()``` command. This will not impact other sessions on the same cluster. Users cannot uninstall or remove libraries that are installed as part of the default Azure Synapse Analytics runtime.
+To remove a session-scoped package from a notebook, use the ```remove.packages()``` command. This library change has no impact on other sessions on the same cluster. Users can't uninstall or remove built-in libraries of the default Azure Synapse Analytics runtime.

```r
remove.packages("caesar")
```

> [!NOTE]
-> You cannot remove core packages like SparkR, SparklyR, or R.
+> You can't remove core packages like SparkR, SparklyR, or R.

### Session-scoped R libraries and SparkR

@@ -256,7 +256,7 @@ spark.lapply(docs, str_length_function)

### Session-scoped R libraries and SparklyR

-With spark_apply() in SparklyR, you can use any R package inside Spark. By default, in sparklyr::spark_apply(), the packages argument is set to FALSE. This copies libraries in the current libPaths to the workers, allowing you to import and use them on workers. For example, you can run the following to generate a caesar-encrypted message with sparklyr::spark_apply():
+With spark_apply() in SparklyR, you can use any R package inside Spark. In sparklyr::spark_apply(), the packages argument defaults to FALSE. This copies libraries in the current libPaths to the workers, allowing you to import and use them on workers. For example, you can run the following to generate a caesar-encrypted message with sparklyr::spark_apply():

```r
install.packages("caesar", repos = "https://cran.microsoft.com/snapshot/2021-07-16/")
