articles/machine-learning/how-to-data-prep-synapse-spark-pool.md (+2 −2)
@@ -96,7 +96,7 @@ env.register(workspace=ws)
To begin data preparation with the Apache Spark pool and your custom environment, specify the Apache Spark pool name and which environment to use during the Apache Spark session. Furthermore, you can provide your subscription ID, the machine learning workspace resource group, and the name of the machine learning workspace.
> [!IMPORTANT]
- > Make sure to [Allow session level packages](../synapse-analytics/spark/apache-spark-manage-python-packages.md#session-scoped-packages) is enabled in the linked Synapse workspace.
+ > Make sure [Allow session level packages](../synapse-analytics/spark/apache-spark-manage-session-packages.md#session-scoped-python-packages) is enabled in the linked Synapse workspace.
- For more infomation about `run_config.spark.configuration`and general Spark configuration, see [SparkConfiguration Class](/python/api/azureml-core/azureml.core.runconfig.sparkconfiguration) and [Apache Spark's configuration documentation](https://spark.apache.org/docs/latest/configuration.html).
+ For more information about `run_config.spark.configuration` and general Spark configuration, see [SparkConfiguration Class](/python/api/azureml-core/azureml.core.runconfig.sparkconfiguration) and [Apache Spark's configuration documentation](https://spark.apache.org/docs/latest/configuration.html).
Once your `ScriptRunConfig` object is set up, you can submit the run.
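To tie these steps together, here's a minimal sketch, assuming an attached Synapse Spark pool named `my-synapse-compute`, a registered environment named `my-custom-env`, and a hypothetical `dataprep.py` script; verify the exact properties against the SparkConfiguration reference above.

```python
from azureml.core import Environment, Experiment, ScriptRunConfig, Workspace
from azureml.core.runconfig import RunConfiguration

ws = Workspace.from_config()

# Point the run at the attached Synapse Spark pool and the custom
# environment (both names are placeholders).
run_config = RunConfiguration(framework="pyspark")
run_config.target = "my-synapse-compute"
run_config.environment = Environment.get(ws, name="my-custom-env")

# Session-level Spark settings, surfaced through run_config.spark.configuration.
run_config.spark.configuration["spark.driver.memory"] = "2g"
run_config.spark.configuration["spark.executor.instances"] = 2

# Wrap the data-preparation script and submit the run.
script_run_config = ScriptRunConfig(
    source_directory="./scripts",   # hypothetical folder containing dataprep.py
    script="dataprep.py",
    run_config=run_config,
)
run = Experiment(ws, "synapse-dataprep").submit(script_run_config)
run.wait_for_completion(show_output=True)
```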
articles/synapse-analytics/spark/apache-spark-azure-portal-add-libraries.md (+25 −11)
@@ -4,9 +4,8 @@ description: Learn how to add and manage libraries used by Apache Spark in Azure
author: shuaijunye
ms.service: synapse-analytics
ms.topic: how-to
- ms.date: 06/08/2022
+ ms.date: 07/07/2022
ms.author: shuaijunye
- ms.reviewer: sngun
ms.subservice: spark
ms.custom: kr2b-contr-experiment
---
@@ -23,23 +22,29 @@ You might need to update your serverless Apache Spark pool environment for vario
- Your team has built a custom package that you need available in your Apache Spark pool.
To make third-party or locally built code available to your applications, install a library onto one of your serverless Apache Spark pools or into a notebook session.
+
+ > [!IMPORTANT]
+ >
+ > - There are three levels of package installation in Synapse Analytics: the default level, the Spark pool level, and the session level.
+ > - Apache Spark in Azure Synapse Analytics has a full Anaconda install plus extra libraries as the default-level installation, which is fully managed by Synapse. Spark pool-level packages can be used by all running artifacts, such as notebooks and Spark job definitions, that attach the corresponding Spark pool. A session-level installation creates an environment for the specific notebook session; changes to session-level libraries aren't persisted between sessions.
+ > - You can upload custom libraries and a specific version of an open-source library that you would like to use in your Azure Synapse Analytics workspace. The workspace packages can be installed in your Spark pools.
+ > - Pool-level library management can take a certain amount of time, depending on the size of the packages and the complexity of the required dependencies. Session-level installation is suggested for experimental and quick iterative scenarios.
## Default Installation
- Apache Spark in Azure Synapse Analytics has a full Anaconda install plus extra libraries. The full libraries list can be found at [Apache Spark version support](apache-spark-version-support.md).
+ Default packages include a full Anaconda install plus extra commonly used libraries. The full libraries list can be found at [Apache Spark version support](apache-spark-version-support.md).
When a Spark instance starts, these libraries are included automatically. More packages can be added at the Spark pool level or session level.
## Workspace packages
When your team develops custom applications or models, you might develop various code artifacts like *.whl* or *.jar* files to package your code.
- In Synapse, workspace packages can be custom or private *.whl* or *.jar* files. You can upload these packages to your workspace and later assign them to a specific Spark pool. Once assigned, these workspace packages are installed automatically on all Spark pool sessions.
+ In Synapse, workspace packages can be custom or private *.whl* or *.jar* files. You can upload these packages to your workspace and later assign them to a specific serverless Apache Spark pool. Once assigned, these workspace packages are installed automatically on all Spark pool sessions.
- To learn more about how to manage workspace libraries, see the following articles:
+ To learn more about how to manage workspace libraries, see the following article:
- - [Python workspace packages:](./apache-spark-manage-python-packages.md#install-wheel-files) Upload Python *.whl* files as a workspace package and later add these packages to specific serverless Apache Spark pools.
- - [Scala/Java workspace packages:](./apache-spark-manage-scala-packages.md#workspace-packages) Upload Scala and Java *.jar* files as a workspace package and later add these packages to specific serverless Apache Spark pools.
@@ -49,7 +54,7 @@ Using the Azure Synapse Analytics pool management capabilities, you can configur
Currently, pool management is only supported for Python. For Python, Synapse Spark pools use Conda to install and manage Python package dependencies. When specifying your pool-level libraries, you can now provide a *requirements.txt* or an *environment.yml* file. This environment configuration file is used every time a Spark instance is created from that Spark pool.
- To learn more about these capabilities, see [Python pool management](./apache-spark-manage-python-packages.md#pool-libraries).
+ To learn more about these capabilities, see [Manage Spark pool packages](./apache-spark-manage-pool-packages.md).
> [!IMPORTANT]
>
@@ -65,9 +70,18 @@ Session-scoped packages allow users to define package dependencies at the start
To learn more about how to manage session-scoped packages, see the following articles:
- - [Python session packages:](./apache-spark-manage-python-packages.md) At the start of a session, provide a Conda *environment.yml* to install more Python packages from popular repositories.
- - [Scala/Java session packages:](./apache-spark-manage-scala-packages.md) At the start of your session, provide a list of *.jar* files to install using `%%configure`.
+ - [Python session packages:](./apache-spark-manage-session-packages.md#session-scoped-python-packages) At the start of a session, provide a Conda *environment.yml* to install more Python packages from popular repositories.
+ - [Scala/Java session packages:](./apache-spark-manage-session-packages.md#session-scoped-java-or-scala-packages) At the start of your session, provide a list of *.jar* files to install using `%%configure`; see the sketch after this list.
- ## Next steps
+ ## Manage your packages outside Synapse Analytics UI
+ If your team wants to manage libraries without visiting the package management UIs, you can manage the workspace packages and pool-level package updates through Azure PowerShell cmdlets or REST APIs for Synapse Analytics.
+
+ To learn more about Azure PowerShell cmdlets and package management REST APIs, see the following articles:
+
+ - Azure PowerShell cmdlets for Synapse Analytics: [Manage your Spark pool libraries through Azure PowerShell cmdlets](apache-spark-manage-packages-outside-ui.md#manage-packages-through-azure-powershell-cmdlets)
+ - Package management REST APIs: [Manage your Spark pool libraries through REST APIs](apache-spark-manage-packages-outside-ui.md#manage-packages-through-rest-apis)
+
+ ## Next steps
- View the default libraries: [Apache Spark version support](apache-spark-version-support.md)
- Now that you've verified your custom channel, you can use the [Python pool management](./apache-spark-manage-python-packages.md) process to update the libraries on your Apache Spark pool.
+ Now that you've verified your custom channel, you can use the [Python pool management](./apache-spark-manage-pool-packages.md#manage-packages-from-synapse-studio-or-azure-portal) process to update the libraries on your Apache Spark pool.
## Next steps
- View the default libraries: [Apache Spark version support](apache-spark-version-support.md)
articles/synapse-analytics/spark/apache-spark-manage-packages-outside-ui.md (new file)
+ ---
+ title: Manage packages outside Synapse Analytics Studio UIs
+ description: Learn how to manage packages using Azure PowerShell cmdlets or REST APIs
+ author: shuaijunye
+ ms.service: synapse-analytics
+ ms.topic: conceptual
+ ms.date: 07/07/2022
+ ms.author: shuaijunye
+ ms.subservice: spark
+ ---
+
+ # Manage packages outside Synapse Analytics Studio UIs
+
+ You may want to manage your libraries for your serverless Apache Spark pools without going into the Synapse Analytics UI pages. For example, you may find that:
+
+ - You develop a custom package, want to upload it to your workspace and use it in your Spark pool, and want to finish the steps with your local tools without visiting the package management UIs.
+ - You're updating your packages through the CI/CD process.
+
+ In this article, we'll provide a general guide to help you manage libraries through Azure PowerShell cmdlets or REST APIs.
+
+ ## Manage packages through Azure PowerShell cmdlets
+
+ ### Add new libraries
+ 1. The [New-AzSynapseWorkspacePackage](https://docs.microsoft.com/powershell/module/az.synapse/new-azsynapseworkspacepackage) command can be used to **upload new libraries to the workspace**.
+ 2. The combination of the [New-AzSynapseWorkspacePackage](https://docs.microsoft.com/powershell/module/az.synapse/new-azsynapseworkspacepackage) and [Update-AzSynapseSparkPool](https://docs.microsoft.com/powershell/module/az.synapse/update-azsynapsesparkpool) commands can be used to **upload new libraries to the workspace** and **attach a library to a Spark pool**; see the sketch later in this section.
+ 3. If you want to attach an **existing workspace library** to your Spark pool, refer to the command combination of [Get-AzSynapseWorkspacePackage](https://docs.microsoft.com/powershell/module/az.synapse/get-azsynapseworkspacepackage) and [Update-AzSynapseSparkPool](https://docs.microsoft.com/powershell/module/az.synapse/update-azsynapsesparkpool).
+
+ ### Remove libraries
+
+ 1. To **remove an installed package** from your Spark pool, refer to the command combination of [Get-AzSynapseWorkspacePackage](https://docs.microsoft.com/powershell/module/az.synapse/get-azsynapseworkspacepackage) and [Update-AzSynapseSparkPool](https://docs.microsoft.com/powershell/module/az.synapse/update-azsynapsesparkpool).
+ 2. You can also retrieve a Spark pool and **remove all attached workspace libraries** from the pool by calling the [Get-AzSynapseSparkPool](https://docs.microsoft.com/powershell/module/az.synapse/get-azsynapsesparkpool) and [Update-AzSynapseSparkPool](https://docs.microsoft.com/powershell/module/az.synapse/update-azsynapsesparkpool) commands.
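As a rough sketch of the upload-and-attach combination described in step 2 of the add list (the workspace, pool, and package names are placeholders; verify parameters against the linked cmdlet reference):

```powershell
# Upload a custom wheel to the workspace as a workspace package (placeholder names).
$package = New-AzSynapseWorkspacePackage -WorkspaceName "contoso-workspace" `
    -Package "C:\packages\contosolib-1.0.0-py3-none-any.whl"

# Attach the uploaded package to a serverless Apache Spark pool.
Update-AzSynapseSparkPool -WorkspaceName "contoso-workspace" -Name "contosopool" `
    -PackageAction Add -Package $package
```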
+ For more Azure PowerShell cmdlet capabilities, refer to [Azure PowerShell cmdlets for Azure Synapse Analytics](https://docs.microsoft.com/powershell/module/az.synapse).
+
+ ## Manage packages through REST APIs
+
+ ### Manage the workspace packages
+ With REST APIs, you can add or delete packages and list all uploaded files in your workspace. For the full set of supported APIs, refer to [Overview of workspace library APIs](https://docs.microsoft.com/rest/api/synapse/data-plane/library).
+
+ ### Manage the Spark pool packages
+ You can use the [Spark pool REST API](https://docs.microsoft.com/rest/api/synapse/big-data-pools/create-or-update) to attach custom or open-source libraries to your Spark pools, or to remove them.
+
+ 1. For custom libraries, specify the list of custom files as the **customLibraries** property in the request body; see the sketch after the following example.
+ 2. You can also update your Spark pool libraries by specifying the **libraryRequirements** property in the request body:
+
+ ```json
+ "libraryRequirements": {
+     "content": "",
+     "filename": "requirements.txt"
+ }
+ ```
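For the **customLibraries** property in step 1, the request-body fragment might look like the following sketch; the field values are placeholders, and the exact schema should be checked against the Spark pool REST API reference linked above:

```json
"customLibraries": [
    {
        "name": "contosolib-1.0.0-py3-none-any.whl",
        "path": "contoso-workspace/libraries/contosolib-1.0.0-py3-none-any.whl",
        "containerName": "prep",
        "type": "whl"
    }
]
```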
+
+ ## Next steps
+ - View the default libraries: [Apache Spark version support](apache-spark-version-support.md)
+ - Manage Spark pool level packages through Synapse Studio portal: [Python package management on Notebook Session](./apache-spark-manage-session-packages.md#session-scoped-python-packages)