
Commit bbefab7

edit pass: apache-spark-azure-portal-add-libraries
1 parent 918c1e1 commit bbefab7

File tree

1 file changed: +3 −3 lines changed


articles/synapse-analytics/spark/apache-spark-azure-portal-add-libraries.md

Lines changed: 3 additions & 3 deletions
@@ -54,7 +54,7 @@ By using the pool management capabilities of Azure Synapse Analytics, you can co
 
 Currently, pool management is supported only for Python. For Python, Azure Synapse Spark pools use Conda to install and manage Python package dependencies.
 
-When you're specifying pool-level libraries, you can now provide a *requirements.txt* or an *environment.yml* file. This environment configuration file is used every time a Spark instance is created from that Spark pool.
+When you're specifying pool-level libraries, you can now provide a *requirements.txt* or *environment.yml* file. This environment configuration file is used every time a Spark instance is created from that Spark pool.
 
 To learn more about these capabilities, see [Manage Spark pool packages](./apache-spark-manage-pool-packages.md).
 
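For context on the changed line above, the two pool-level file formats it names look roughly like the following; every package name and version here is an illustrative placeholder, not taken from the article. A *requirements.txt* is plain `package==version` lines, while an *environment.yml* is a Conda environment specification:

```yaml
# environment.yml (illustrative placeholder contents, not from the article)
name: synapse-pool-env
channels:
  - conda-forge
dependencies:
  - numpy=1.21.0          # Conda packages use a single '='
  - pip:
      - azure-storage-blob==12.8.1   # pip packages use '=='
```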
@@ -72,7 +72,7 @@ If you're having trouble identifying required dependencies, follow these steps:
 1. Run the following script to set up a local Python environment that's the same as the Azure Synapse Spark environment. The script requires [Synapse-Python38-CPU.yml](https://github.com/Azure-Samples/Synapse/blob/main/Spark/Python/Synapse-Python38-CPU.yml), which is the list of libraries shipped in the default Python environment in Azure Synapse Spark.
 
 ```powershell
-# One-time synapse Python setup
+# One-time Azure Synapse Python setup
 wget Synapse-Python38-CPU.yml
 sudo bash Miniforge3-Linux-x86_64.sh -b -p /usr/lib/miniforge3
 export PATH="/usr/lib/miniforge3/bin:$PATH"
@@ -82,7 +82,7 @@ If you're having trouble identifying required dependencies, follow these steps:
 ```
 
 1. Run the following script to identify the required dependencies.
-The script can be used to pass your *requirement.txt* file, which has all the packages and versions that you intend to install in the Spark 3.1 or Spark 3.2 pool. It will print the names of the *new* wheel files/dependencies for your input library requirements.
+The script can be used to pass your *requirements.txt* file, which has all the packages and versions that you intend to install in the Spark 3.1 or Spark 3.2 pool. It will print the names of the *new* wheel files/dependencies for your input library requirements.
 
 ```python
 # Command to list wheels needed for your input libraries.
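The diff truncates the article's Python script after its first comment line, so its body stays elided here. As a hedged sketch only (not the article's actual script), the core idea it describes, listing which requirements are *new* relative to the base Synapse environment, can be approximated by comparing package names; the function names and the sample data below are hypothetical:

```python
# Hypothetical sketch: report which requirements are new relative to a
# base environment list (package names only; the article's real script
# also resolves transitive wheel dependencies).

def parse_requirements(lines):
    """Return {name: version} from 'name==version' style lines,
    skipping blanks and comments."""
    pkgs = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, version = line.partition("==")
        pkgs[name.lower()] = version
    return pkgs

def new_packages(base_lines, requirement_lines):
    """Names in the requirements that are absent from the base environment."""
    base = parse_requirements(base_lines)
    wanted = parse_requirements(requirement_lines)
    return sorted(name for name in wanted if name not in base)

if __name__ == "__main__":
    # Stand-ins for Synapse-Python38-CPU.yml and your requirements.txt.
    base = ["numpy==1.19.4", "pandas==1.2.3"]
    reqs = ["pandas==1.2.3", "azure-storage-blob==12.8.1"]
    print(new_packages(base, reqs))  # only azure-storage-blob is new
```

This only flags top-level names; the point of the article's wheel-listing script is that it also surfaces new transitive dependencies, which a plain name comparison cannot do.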
