articles/synapse-analytics/spark/synapse-spark-sql-pool-import-export.md (+1 -1)
@@ -160,7 +160,7 @@ Similarly, in the read scenario, read the data using Scala and write it into a t
 ## Allowing other users to use the DW Connector in your workspace
 
-You need to be Storage Blob Data Owner on the ADLS Gen 2 storage account connected to the workspace to alter missing permissions for others. Please ensure the user has access to the workspace and permissions to run notebooks.
+To alter missing permissions for others, you need to be the Storage Blob Data Owner on the ADLS Gen2 storage account connected to the workspace. Please ensure the user has access to the workspace and permissions to run notebooks.
articles/synapse-analytics/sql/connect-overview.md (+3 -2)
@@ -19,12 +19,13 @@ Get connected to the Synapse SQL capability in Azure Synapse Analytics.
 [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio) is fully supported starting from version 1.18.0. SSMS is partially supported starting from version 18.5, you can use it to connect and query only.
 
 > [!NOTE]
-> If AAD login has connection open for more than 1 hour at time of query execution, any query that relies on AAD will fail. This includes querying storage using AAD pass-through and statements that interact with AAD (like CREATE EXTERNAL PROVIDER). This affects every tool that keeps connection open, like in query editor in SSMS and ADS. Tools that open new connection to execute query are not affected, like Synapse Studio.
+> If an AAD login has a connection open for more than 1 hour at the time of query execution, any query that relies on AAD will fail. This includes querying storage using AAD pass-through and statements that interact with AAD (like CREATE EXTERNAL PROVIDER). This affects every tool that keeps connections open, like the query editor in SSMS and ADS. Tools that open new connections to execute a query, like Synapse Studio, are not affected.
+>
 > You can restart SSMS or connect and disconnect in ADS to mitigate this issue.
 
 ## Find your server name
 
-The server name for SQL Pool in the following example is: showdemoweu.sql.azuresynapse.net.
+The server name for SQL pool in the following example is: showdemoweu.sql.azuresynapse.net.
 
 The server name for SQL on-demand in the following example is: showdemoweu-ondemand.sql.azuresynapse.net.
# Create and use external tables in SQL on-demand (preview) using Azure Synapse Analytics
 In this section, you'll learn how to create and use external tables in SQL on-demand (preview). External tables are useful when you want to control access to external data in SQL on-demand and if you want to use tools, such as Power BI, in conjunction with SQL on-demand. External tables can access two types of storage:
-- Public storage where user access public storage files.
-- Protected storage where user access storage files using SAS credential, Azure AD identity, or Managed Identity of Synapse workspace.
+- Public storage where users access public storage files.
+- Protected storage where users access storage files using a SAS credential, an Azure AD identity, or the Managed Identity of the Synapse workspace.
 
 ## Prerequisites
 
-Your first step is to create database where the tables will be created and initialize the objects by executing [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql) on that database. This setup script will create the following objects that are used in this sample:
+Your first step is to create a database where the tables will be created. Then initialize the objects by executing the [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql) on that database. This setup script will create the following objects that are used in this sample:
 
 - DATABASE SCOPED CREDENTIAL `sqlondemand` that enables access to the SAS-protected `https://sqlondemandstorage.blob.core.windows.net` Azure storage account.
 
 ```sql
 CREATE DATABASE SCOPED CREDENTIAL [sqlondemand]
 WITH IDENTITY='SHARED ACCESS SIGNATURE',
@@ -51,7 +52,7 @@ The queries in this article will be executed on your sample database and use the
 ## Create an external table on protected data
 
-You can create external tables that access data on Azure storage account that allows access to users with some Azure AD identity or SAS key. You can create external tables the same way you create regular SQL Server external tables. The query below creates an external table that reads *population.csv* file from SynapseSQL demo Azure storage account that is referenced using `sqlondemanddemo` data source and protected with database scoped credential called `sqlondemand`. Data source and database scoped credential are created in [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql).
+You can create external tables that access data on an Azure storage account that allows access to users with an Azure AD identity or a SAS key. You can create external tables the same way you create regular SQL Server external tables. The query below creates an external table that reads the *population.csv* file from the SynapseSQL demo Azure storage account, which is referenced using the `sqlondemanddemo` data source and protected with a database scoped credential called `sqlondemand`. The data source and database scoped credential are created in the [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql).
 
 > [!NOTE]
 > Change the first line in the query, i.e., [mydbname], so you're using the database you created.
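The query the paragraph above describes is not shown in this hunk; as a hedged sketch of what such a definition might look like (the column list and the `QuotedCSVWithHeaderFormat` file format name are assumptions about what the setup script creates, not confirmed by this diff):

```sql
-- Hypothetical sketch: an external table over the SAS-protected population.csv.
-- Assumes the sqlondemanddemo data source and a QuotedCSVWithHeaderFormat file
-- format were created by the setup script; column names are illustrative.
USE [mydbname];
GO
CREATE EXTERNAL TABLE populationExternalTable (
    [country_code] VARCHAR(5),
    [country_name] VARCHAR(100),
    [year] SMALLINT,
    [population] BIGINT
)
WITH (
    LOCATION = 'csv/population/population.csv',
    DATA_SOURCE = sqlondemanddemo,
    FILE_FORMAT = QuotedCSVWithHeaderFormat
);
```

Because the table references a data source protected by the `sqlondemand` credential, any user querying it needs access through that credential rather than their own storage permissions.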
@@ -76,7 +77,7 @@ GO
 ## Create an external table on public data
 
-You can create external tables that reads data from the files placed on publicly available Azure storage. [Setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql) will create public external data source and Parquet file format definition that is used in the following query:
+You can create external tables that read data from files placed on publicly available Azure storage. This [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql) will create the public external data source and the Parquet file format definition that are used in the following query:
 
 ```sql
 CREATE EXTERNAL TABLE Taxi (
@@ -99,7 +100,7 @@ CREATE EXTERNAL TABLE Taxi (
 You can use external tables in your queries the same way you use them in SQL Server queries.
 
-The following query demonstrates using the *population* external table we created in previous section. It returns country names with their population in 2019 in descending order.
+The following query demonstrates this using the *population* external table we created in the previous section. It returns country names with their population in 2019 in descending order.
 
 > [!NOTE]
 > Change the first line in the query, i.e., [mydbname], so you're using the database you created.
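The query itself falls outside this hunk; a sketch of the kind of statement described above, assuming the *population* external table from the earlier section (the table and column names here are illustrative, not confirmed by this diff):

```sql
-- Illustrative sketch: country names with their 2019 population, descending.
-- Assumes the populationExternalTable created earlier with these column names.
USE [mydbname];
GO
SELECT [country_name], [population]
FROM populationExternalTable
WHERE [year] = 2019
ORDER BY [population] DESC;
```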
articles/synapse-analytics/sql/develop-openrowset.md (+16 -16)
@@ -13,23 +13,23 @@ ms.reviewer: jrasnick
 # How to use OPENROWSET with SQL on-demand (preview)
 
-The `OPENROWSET(BULK...)` function allows you to access files in Azure Storage. `OPENROWSET` function reads content of some remote data source (for example file) and returns the content as a set of rows. Within the SQL on-demand (preview) resource, the OPENROWSET bulk rowset provider is accessed by calling the OPENROWSET function and specifying the BULK option.
+The `OPENROWSET(BULK...)` function allows you to access files in Azure Storage. The `OPENROWSET` function reads the content of a remote data source (for example, a file) and returns the content as a set of rows. Within the SQL on-demand (preview) resource, the OPENROWSET bulk rowset provider is accessed by calling the OPENROWSET function and specifying the BULK option.
 
 The `OPENROWSET` function can be referenced in the `FROM` clause of a query as if it were a table name. It supports bulk operations through a built-in BULK provider that enables data from a file to be read and returned as a rowset.
 
 ## Data source
 
-OPENROWSET function in Synapse SQL reads content of the file(s) from a data source. Data source is Azure storage account and it can be explicitly referenced in `OPENROWSET` function or can be dynamically inferred from URL of the files that you want to read.
-The `OPENROWSET` function can optionally contain `DATA_SOURCE` parameter that specify data source that contains files.
-`OPENROWSET` without `DATA_SOURCE` can be used to directly read content of the files form the URL location specified as `BULK` option:
+The OPENROWSET function in Synapse SQL reads the content of the file(s) from a data source. The data source is an Azure storage account, and it can be explicitly referenced in the `OPENROWSET` function or dynamically inferred from the URL of the files that you want to read.
+The `OPENROWSET` function can optionally contain a `DATA_SOURCE` parameter to specify the data source that contains files.
+`OPENROWSET` without `DATA_SOURCE` can be used to directly read the contents of the files from the URL location specified as the `BULK` option:
 
 ```sql
 SELECT *
 FROM OPENROWSET(BULK 'http://storage..../container/folder/*.parquet',
                 TYPE = 'PARQUET') AS file
 ```
 
-This is quick and easy way to read the content of the files without some pre-configuration. This option enables you to use basic authentication option to access the storage (Azure AD passthrough for Azure AD logins and SAS token for SQL logins).
+This is a quick and easy way to read the content of the files without pre-configuration. This option enables you to use the basic authentication option to access the storage (Azure AD passthrough for Azure AD logins and a SAS token for SQL logins).
 
 `OPENROWSET` with `DATA_SOURCE` can be used to access files on a specified storage account:
@@ -40,19 +40,19 @@ FROM OPENROWSET(BULK '/folder/*.parquet',
                 TYPE = 'PARQUET') AS file
 ```
 
-This option enables you to configure location of storage account in data source and specify authentication method that should be used to access storage.
+This option enables you to configure the location of the storage account in the data source and specify the authentication method that should be used to access storage.
 
 > [!IMPORTANT]
 > `OPENROWSET` without `DATA_SOURCE` provides a quick and easy way to access the storage files but offers limited authentication options. As an example, an Azure AD principal can access files only using their [Azure AD identity](develop-storage-files-storage-access-control.md#user-identity) and cannot access publicly available files. If you need more powerful authentication options, use the `DATA_SOURCE` option and define the credential that you want to use to access storage.
 
 ## Security
 
-Database user must have `ADMINISTER BULK OPERATIONS` permission to use `OPENROWSET` function.
+A database user must have the `ADMINISTER BULK OPERATIONS` permission to use the `OPENROWSET` function.
 
-Storage administrator also must enable user to access the files by providing valid SAS token or enabling Azure AD principal to access storage files. Learn more about storage access control in [this article](develop-storage-files-storage-access-control.md).
+The storage administrator must also enable a user to access the files by providing a valid SAS token or enabling an Azure AD principal to access storage files. Learn more about storage access control in [this article](develop-storage-files-storage-access-control.md).
 
 `OPENROWSET` uses the following rules to determine how to authenticate to storage:
-- In `OPENROWSET` with `DATA_SOURCE` authentication mechanism depends on caller type.
+- In `OPENROWSET` without `DATA_SOURCE`, the authentication mechanism depends on the caller type.
   - AAD logins can access files only using their own [Azure AD identity](develop-storage-files-storage-access-control.md#user-identity) if Azure storage allows the Azure AD user to access the underlying files (for example, if the caller has the Storage Reader permission on storage) and if you [enable Azure AD passthrough authentication](develop-storage-files-storage-access-control.md#force-azure-ad-pass-through) on the Synapse SQL service.
   - SQL logins can also use `OPENROWSET` without `DATA_SOURCE` to access publicly available files, files protected using a SAS token, or the Managed Identity of the Synapse workspace. You would need to [create a server-scoped credential](develop-storage-files-storage-access-control.md#examples) to allow access to storage files.
 - In `OPENROWSET` with `DATA_SOURCE`, the authentication mechanism is defined in the database scoped credential assigned to the referenced data source. This option enables you to access publicly available storage, or access storage using a SAS token, the Managed Identity of the workspace, or the [Azure AD identity of the caller](develop-storage-files-storage-access-control.md#user-identity) (if the caller is an Azure AD principal). If `DATA_SOURCE` references Azure storage that is not public, you would need to [create a database-scoped credential](develop-storage-files-storage-access-control.md#examples) and reference it in `DATA_SOURCE` to allow access to storage files.
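The `DATA_SOURCE` rule above can be sketched end to end as follows. This is a hedged illustration, not the article's own example: `MyCredential`, `MyDataSource`, the storage URL, and the SAS token are all placeholders.

```sql
-- Hypothetical sketch of the DATA_SOURCE pattern: a database scoped credential
-- supplies the authentication, and the external data source references it.
CREATE DATABASE SCOPED CREDENTIAL [MyCredential]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=...placeholder-sas-token...';
GO
CREATE EXTERNAL DATA SOURCE MyDataSource
WITH (
    LOCATION = 'https://mystorage.blob.core.windows.net/container',
    CREDENTIAL = MyCredential
);
GO
-- OPENROWSET now authenticates via the credential bound to the data source.
SELECT TOP 10 *
FROM OPENROWSET(BULK '/folder/*.parquet',
                DATA_SOURCE = 'MyDataSource',
                TYPE = 'PARQUET') AS [rows];
```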
@@ -99,9 +99,9 @@ You have two choices for input files that contain the target data for querying.
 **'unstructured_data_path'**
 
-The unstructured_data_path that establishes a path to the data may have absolute or relative path:
-- Absolute path in the format '\<prefix>://\<storage_account_path>/\<storage_path>' enables user to directly read the files.
-- Relative path in the format '<storage_path>' that must be used with `DATA_SOURCE` parameter and describes file pattern within <storage_account_path> location defined in `EXTERNAL DATA SOURCE`.
+The unstructured_data_path that establishes a path to the data may be an absolute or relative path:
+- An absolute path in the format '\<prefix>://\<storage_account_path>/\<storage_path>' enables a user to directly read the files.
+- A relative path in the format '\<storage_path>' must be used with the `DATA_SOURCE` parameter and describes the file pattern within the \<storage_account_path> location defined in `EXTERNAL DATA SOURCE`.
 
 Below you'll find the relevant \<storage_account_path> values that will link to your particular external data source.
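The two path styles above can be contrasted in a short sketch; the storage account, container, and `MyDataSource` data source name are hypothetical:

```sql
-- Absolute path: prefix, storage account, and storage path all inline,
-- so no DATA_SOURCE is needed.
SELECT TOP 10 *
FROM OPENROWSET(BULK 'https://mystorage.blob.core.windows.net/container/folder/*.parquet',
                TYPE = 'PARQUET') AS [a];

-- Relative path: only the storage path; the storage account comes from the
-- location defined in the external data source.
SELECT TOP 10 *
FROM OPENROWSET(BULK '/folder/*.parquet',
                DATA_SOURCE = 'MyDataSource',
                TYPE = 'PARQUET') AS [r];
```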
@@ -229,10 +229,10 @@ FROM
 ) AS [r]
 ```
 
-If you are getting the error saying that the files cannot be listed, you need to enable Synapse SQL on-demand to access public storage:
-- If you are using SQL login you need to [create server-scoped credential that allows access to public storage](develop-storage-files-storage-access-control.md#examples).
-- If you are using Azure AD principal to access public storage, you would need to [create server-scoped credential that allows access to public storage](develop-storage-files-storage-access-control.md#examples) and disable [Azure AD passthrough authentication](develop-storage-files-storage-access-control.md#disable-forcing-azure-ad-pass-through).
+If you are getting an error saying that the files cannot be listed, you need to enable access to public storage in Synapse SQL on-demand:
+- If you are using a SQL login, you need to [create a server-scoped credential that allows access to public storage](develop-storage-files-storage-access-control.md#examples).
+- If you are using an Azure AD principal to access public storage, you need to [create a server-scoped credential that allows access to public storage](develop-storage-files-storage-access-control.md#examples) and disable [Azure AD passthrough authentication](develop-storage-files-storage-access-control.md#disable-forcing-azure-ad-pass-through).
 ## Next steps
 
-For more samples, go to [quickstarts](query-data-storage.md) to learn how to use `OPENROWSET to read [CSV](query-single-csv-file.md), [PARQUET](query-parquet-files.md), and [JSON](query-json-files.md) file formats. You can also learn how to save the results of your query to Azure Storage using [CETAS](develop-tables-cetas.md).
+For more samples, see the [query data storage quickstart](query-data-storage.md) to learn how to use `OPENROWSET` to read [CSV](query-single-csv-file.md), [PARQUET](query-parquet-files.md), and [JSON](query-json-files.md) file formats. You can also learn how to save the results of your query to Azure Storage using [CETAS](develop-tables-cetas.md).
0 commit comments