Commit a6fd188

Merge pull request #115097 from jovanpop-msft/patch-167
Update develop-storage-files-storage-access-control.md
2 parents fe27289 + a52f30c commit a6fd188

File tree

1 file changed (+61 / -30 lines changed)

articles/synapse-analytics/sql/develop-storage-files-storage-access-control.md

Lines changed: 61 additions & 30 deletions
@@ -21,7 +21,7 @@ This article describes the types of credentials you can use and how credential l

 ## Supported storage authorization types

-A user that has logged into a SQL on-demand resource must be authorized to access and query the files in Azure Storage. Three authorization types are supported:
+A user that has logged into a SQL on-demand resource must be authorized to access and query the files in Azure Storage if the files are not publicly available. Three authorization types are supported:

 - [User Identity](?tabs=user-identity)
 - [Shared access signature](?tabs=shared-access-signature)

@@ -42,6 +42,8 @@ You can get an SAS token by navigating to the **Azure portal -> Storage Account
 >
 > SAS token: ?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-18T20:42:12Z&st=2019-04-18T12:42:12Z&spr=https&sig=lQHczNvrk1KoYLCpFdSsMANd0ef9BrIPBNJ3VYEIq78%3D

+You need to create a database-scoped or server-scoped credential to enable access using the SAS token.
+
 ### [User Identity](#tab/user-identity)

 **User Identity**, also known as "pass-through", is an authorization type where the identity of the Azure AD user that logged into
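
For the shared access signature step in the hunk above, a minimal sketch of the database-scoped form is shown here (the server-scoped form appears later in the article). The credential name `SasToken` and the master key password are illustrative assumptions; the secret stores the sample SAS token without its leading `?`:

```sql
-- One-off per database: a master key is required before creating a database scoped credential
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
GO

-- Hypothetical credential name; the secret is the SAS token without the leading '?'
CREATE DATABASE SCOPED CREDENTIAL SasToken
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-18T20:42:12Z&st=2019-04-18T12:42:12Z&spr=https&sig=lQHczNvrk1KoYLCpFdSsMANd0ef9BrIPBNJ3VYEIq78%3D';
GO
```
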
@@ -94,7 +96,7 @@ Before accessing the data, the Azure Storage administrator must grant permission

 ### [Anonymous access](#tab/public-access)

-You can access publicly available files placed on Azure storage accounts that allow anonymous access.
+You can access publicly available files placed on Azure storage accounts that [allow anonymous access](/azure/storage/blobs/storage-manage-access-to-resources.md).

 ---

@@ -124,30 +126,14 @@ To query a file located in Azure Storage, your SQL on-demand end point needs a c
 - Server-level CREDENTIAL is used for ad-hoc queries executed using `OPENROWSET` function. Credential name must match the storage URL.
 - DATABASE SCOPED CREDENTIAL is used for external tables. External table references `DATA SOURCE` with the credential that should be used to access storage.

-A credential is added by running [CREATE CREDENTIAL](/sql/t-sql/statements/create-credential-transact-sql?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json&view=azure-sqldw-latest). You'll need to provide a CREDENTIAL NAME argument. It must match either part of the path or the whole path to data in Storage (see below).
-
-> [!NOTE]
-> The FOR CRYPTOGRAPHIC PROVIDER argument is not supported.
-
-For all supported authorization types, credentials can point to an account or a container.
-
-Server-level CREDENTIAL name must match the full path to the storage account (and optionally container) in the following format: `<prefix>://<storage_account_path>/<storage_path>`
-
-| External Data Source | Prefix | Storage account path |
-| -------------------------- | ------ | --------------------------------------------------- |
-| Azure Blob Storage | https | <storage_account>.blob.core.windows.net |
-| Azure Data Lake Storage Gen1 | https | <storage_account>.azuredatalakestore.net/webhdfs/v1 |
-| Azure Data Lake Storage Gen2 | https | <storage_account>.dfs.core.windows.net |
-
-> [!NOTE]
-> There is special server-level CREDENTIAL `UserIdentity` that [forces Azure AD pass-through](#force-azure-ad-pass-through).
-
-Optionally, to allow a user to create or drop a credential, admin can GRANT/DENY ALTER ANY CREDENTIAL permission to a user:
+To allow a user to create or drop a credential, an admin can GRANT/DENY the ALTER ANY CREDENTIAL permission to a user:

 ```sql
 GRANT ALTER ANY CREDENTIAL TO [user_name];
 ```

+Database users who access external storage must have permission to use credentials.
+
 ### Grant permissions to use credential

 To use the credential, a user must have `REFERENCES` permission on a specific credential. To grant a `REFERENCES` permission ON a storage_credential for a specific_user, execute:
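
A minimal sketch of the statement that the paragraph above asks you to execute, using the paragraph's own placeholder names:

```sql
-- Grant REFERENCES on a specific credential to a specific database user
GRANT REFERENCES ON CREDENTIAL::[storage_credential] TO [specific_user];
GO
```
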
@@ -164,11 +150,28 @@ GRANT REFERENCES ON CREDENTIAL::[UserIdentity] TO [public];

 ## Server-scoped credential

-Server-scoped credentials are used when SQL login calls `OPENROWSET` function without `DATA_SOURCE` to read files on some storage account. The name of server-scoped credential **must** match the URL of Azure storage.
+Server-scoped credentials are used when a SQL login calls the `OPENROWSET` function without `DATA_SOURCE` to read files on some storage account. The name of the server-scoped credential **must** match the URL of Azure storage. A credential is added by running [CREATE CREDENTIAL](/sql/t-sql/statements/create-credential-transact-sql?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json&view=azure-sqldw-latest). You'll need to provide a CREDENTIAL NAME argument. It must match either part of the path or the whole path to data in Storage (see below).
+
+> [!NOTE]
+> The FOR CRYPTOGRAPHIC PROVIDER argument is not supported.
+
+Server-level CREDENTIAL name must match the full path to the storage account (and optionally container) in the following format: `<prefix>://<storage_account_path>/<storage_path>`. Storage account paths are described in the following table:
+
+| External Data Source | Prefix | Storage account path |
+| -------------------------- | ------ | --------------------------------------------------- |
+| Azure Blob Storage | https | <storage_account>.blob.core.windows.net |
+| Azure Data Lake Storage Gen1 | https | <storage_account>.azuredatalakestore.net/webhdfs/v1 |
+| Azure Data Lake Storage Gen2 | https | <storage_account>.dfs.core.windows.net |
+
+> [!NOTE]
+> There is a special server-level CREDENTIAL `UserIdentity` that [forces Azure AD pass-through](?tabs=user-identity#force-azure-ad-pass-through).
+
+Server-scoped credentials enable access to Azure storage using the following authentication types:

 ### [Shared access signature](#tab/shared-access-signature)

-The following script creates a server-level credential that can be used by `OPENROWSET` function to access any file on Azure storage using SAS token. Create this credential to enable SQL principal that executes `OPENROWSET` function to read files protected with SAS key on the Azure storage that matches URL in credential name.
+The following script creates a server-level credential that can be used by the `OPENROWSET` function to access any file on Azure storage using a SAS token. Create this credential to enable the SQL principal that executes the `OPENROWSET` function to read files protected with the SAS key on the Azure storage that matches the URL in the credential name.

 Exchange <*mystorageaccountname*> with your actual storage account name, and <*mystorageaccountcontainername*> with the actual container name:

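A minimal sketch of the script that the paragraph above refers to, assuming the server-level credential is named after the storage URL and stores the sample SAS token shown earlier (replace the placeholders before running):

```sql
-- The credential name must match the URL of the storage account (and, optionally, the container)
CREATE CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-04-18T20:42:12Z&st=2019-04-18T12:42:12Z&spr=https&sig=lQHczNvrk1KoYLCpFdSsMANd0ef9BrIPBNJ3VYEIq78%3D';
GO
```
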
@@ -181,7 +184,7 @@ GO

 ### [User Identity](#tab/user-identity)

-The following script creates a server-level credential that enables user to impersonate using his Azure AD identity.
+The following script creates a server-level credential that enables a user to impersonate using their Azure AD identity.

 ```sql
 CREATE CREDENTIAL [UserIdentity]
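
The hunk above stops at the first line of that script. A minimal sketch of how such a pass-through credential is declared, assuming the `'User Identity'` identity string:

```sql
-- Server-level credential that forces Azure AD pass-through for the logged-in user
CREATE CREDENTIAL [UserIdentity]
WITH IDENTITY = 'User Identity';
GO
```
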
@@ -215,6 +218,7 @@ GO

 Database-scoped credentials are used when any principal calls `OPENROWSET` function with `DATA_SOURCE` or selects data from [external table](develop-tables-external-tables.md) that don't access public files. The database scoped credential doesn't need to match the name of storage account because it will be explicitly used in DATA SOURCE that defines the location of storage.

+Database-scoped credentials enable access to Azure storage using the following authentication types:

 ### [Shared access signature](#tab/shared-access-signature)

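As one example of those authentication types, a minimal sketch of a database-scoped credential for the workspace managed identity; the name `WorkspaceIdentity` matches the commented data source option later in the diff, while the `'Managed Identity'` identity string is an assumption here:

```sql
-- Requires a database master key (see the master key comment in the later example)
CREATE DATABASE SCOPED CREDENTIAL WorkspaceIdentity
WITH IDENTITY = 'Managed Identity';
GO
```
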
@@ -254,11 +258,20 @@ Database scoped credential is not required to allow access to publicly available

 ---

+Database scoped credentials are used in external data sources to specify what authentication method will be used to access this storage:
+
+```sql
+CREATE EXTERNAL DATA SOURCE mysample
+WITH ( LOCATION = 'https://*******.blob.core.windows.net/samples',
+       CREDENTIAL = <name of database scoped credential>
+)
+```
+
 ## Examples

-**External table that access publicly available data source**
+**Accessing publicly available data source**

-Use the following script to create a table that access publicly available data source.
+Use the following script to create a table that accesses a publicly available data source.

 ```sql
 CREATE EXTERNAL FILE FORMAT [SynapseParquetFormat] WITH ( FORMAT_TYPE = PARQUET)

@@ -268,12 +281,21 @@ WITH ( LOCATION = 'https://****.blob.core.windows.net/public-access' )
 GO

 CREATE EXTERNAL TABLE dbo.userPublicData ( [id] int, [first_name] varchar(8000), [last_name] varchar(8000) )
-WITH ( LOCATION = 'parquet/user-data/userdata.parquet', DATA_SOURCE = [publicData], FILE_FORMAT = [SynapseParquetFormat] )
+WITH ( LOCATION = 'parquet/user-data/*.parquet', DATA_SOURCE = [publicData], FILE_FORMAT = [SynapseParquetFormat] )
+```
+
+A database user can read the content of the files from the data source using an external table or the [OPENROWSET](develop-openrowset.md) function that references the data source:
+
+```sql
+SELECT TOP 10 * FROM dbo.userPublicData;
+GO
+SELECT TOP 10 * FROM OPENROWSET(BULK 'parquet/user-data/*.parquet', DATA_SOURCE = [publicData], FORMAT=PARQUET) as rows;
+GO
 ```

-**External table that access data source using credential**
+**Accessing data source using credential**

-Modify the following script to create an external table that access Azure storage using SAS token, Azure AD identity of user, or managed identity of workspace.
+Modify the following script to create an external table that accesses Azure storage using a SAS token, the Azure AD identity of a user, or the managed identity of the workspace.

 ```sql
 -- Create master key in databases with some password (one-off per database)

@@ -294,7 +316,7 @@ CREATE EXTERNAL FILE FORMAT [SynapseParquetFormat] WITH ( FORMAT_TYPE = PARQUET)
 GO

 CREATE EXTERNAL DATA SOURCE mysample
-WITH ( LOCATION = 'https://*******.blob.core.windows.net/samples',
+WITH ( LOCATION = 'https://*******.blob.core.windows.net/samples'
 -- Uncomment one of these options depending on authentication method that you want to use to access data source:
 --,CREDENTIAL = MyIdentity
 --,CREDENTIAL = WorkspaceIdentity

@@ -306,6 +328,15 @@ WITH ( LOCATION = 'parquet/user-data/*.parquet', DATA_SOURCE = [mysample], FILE_

 ```

+A database user can read the content of the files from the data source using an [external table](develop-tables-external-tables.md) or the [OPENROWSET](develop-openrowset.md) function that references the data source:
+
+```sql
+SELECT TOP 10 * FROM dbo.userdata;
+GO
+SELECT TOP 10 * FROM OPENROWSET(BULK 'parquet/user-data/*.parquet', DATA_SOURCE = [mysample], FORMAT=PARQUET) as rows;
+GO
+```
+
 ## Next steps

 The articles listed below will help you learn how query different folder types, file types, and create and use views:
