articles/synapse-analytics/spark/synapse-spark-sql-pool-import-export.md
Transferring data between Spark pools and SQL pools can be done using JDBC. However, given two distributed systems such as Spark and SQL pools, JDBC tends to be a bottleneck with serial data transfer.
The Spark pools to SQL Analytics Connector is a data source implementation for Apache Spark. It uses Azure Data Lake Storage Gen2 and PolyBase in SQL pools to efficiently transfer data between the Spark cluster and the SQL Analytics instance.
The import statements are not required; they are pre-imported for the notebook experience.
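As a minimal sketch of the pattern, assuming the `sqlanalytics` read/write methods and `Constants` exposed by the connector (the database, schema, and table names below are hypothetical):

```scala
// These imports are pre-imported in Synapse notebooks; shown here for completeness.
import com.microsoft.spark.sqlanalytics.utils.Constants
import org.apache.spark.sql.SqlAnalyticsConnector._

// Write a Spark DataFrame to an internal (managed) table in the SQL pool.
// "mySqlPool.dbo.MyTable" is a placeholder <database>.<schema>.<table> name.
df.write.sqlanalytics("mySqlPool.dbo.MyTable", Constants.INTERNAL)

// Read the table back into a Spark DataFrame.
val readDf = spark.read.sqlanalytics("mySqlPool.dbo.MyTable")
```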
### Transferring data to or from a SQL pool in the Logical Server (DW Instance) attached to the workspace
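A hedged sketch of the read scenario — read the data using Scala, transform it in Spark, and write the result into a table (all table names are hypothetical placeholders):

```scala
// Read from a SQL pool table into a Spark DataFrame.
val sourceDf = spark.read.sqlanalytics("mySqlPool.dbo.SourceTable")

// Transform in Spark, then write the result back as an internal table.
val summary = sourceDf.groupBy("Region").count()
summary.write.sqlanalytics("mySqlPool.dbo.RegionCounts", Constants.INTERNAL)
```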
## Allowing other users to use the DW Connector in your workspace
To alter missing permissions for others, you need to be the Storage Blob Data Owner on the ADLS Gen2 storage account connected to the workspace. Ensure the user has access to the workspace and permissions to run notebooks.
### Option 1
- Make the user a Storage Blob Data Contributor/Owner
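For example, the role can be granted with the Azure CLI; a sketch assuming `az role assignment create`, with placeholder assignee and scope values:

```shell
# Sketch: grant Storage Blob Data Contributor on the workspace's ADLS Gen2
# storage account. The assignee and scope below are placeholders.
az role assignment create \
  --assignee "user@contoso.com" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```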
### Option 2
Specify the following ACLs on the folder structure:
- You should be able to ACL all folders from "synapse" and downward from the Azure portal. To ACL the root "/" folder, follow the instructions below.
- Connect to the storage account connected with the workspace from Storage Explorer using AAD
- Select your account and provide the ADLS Gen2 URL and default file system for the workspace
- Once you can see the storage account listed, right-click on the listed workspace and select "Manage Access"
- Add the user to the "/" folder with "Execute" access permission, then select "Ok"
**Make sure you don't select "Default" unless you intend to set default ACLs.**