
Commit 9900460

Merge pull request #107906 from Max-Meng/patch-8
Applied a few changes in the credential section
2 parents e267028 + a6878fb commit 9900460

7 files changed: +31 -17 lines changed

articles/synapse-analytics/sql/develop-storage-files-storage-access-control.md

Lines changed: 24 additions & 10 deletions
@@ -236,25 +236,39 @@ To query a file located in Azure Storage, your serverless SQL pool end point nee
- Server-level CREDENTIAL is used for ad-hoc queries executed using `OPENROWSET` function. Credential name must match the storage URL.
- DATABASE SCOPED CREDENTIAL is used for external tables. External table references `DATA SOURCE` with the credential that should be used to access storage.

-To allow a user to create or drop a credential, admin can GRANT/DENY ALTER ANY CREDENTIAL permission to a user:
+To allow a user to create or drop a server-level credential, admin can GRANT ALTER ANY CREDENTIAL permission to the user:

```sql
GRANT ALTER ANY CREDENTIAL TO [user_name];
```
+To allow a user to create or drop a database scoped credential, admin can GRANT CONTROL permission on the database to the user:
+
+```sql
+GRANT CONTROL ON DATABASE::[database_name] TO [user_name];
+```
+

Database users who access external storage must have permission to use credentials.

### Grant permissions to use credential

-To use the credential, a user must have `REFERENCES` permission on a specific credential. To grant a `REFERENCES` permission ON a storage_credential for a specific_user, execute:
+To use the credential, a user must have `REFERENCES` permission on a specific credential.
+
+To grant a `REFERENCES` permission ON a server-level credential for a specific_user, execute:

```sql
-GRANT REFERENCES ON CREDENTIAL::[storage_credential] TO [specific_user];
+GRANT REFERENCES ON CREDENTIAL::[server-level_credential] TO [specific_user];
```

-## Server-scoped credential
+To grant a `REFERENCES` permission ON a DATABASE SCOPED CREDENTIAL for a specific_user, execute:

-Server-scoped credentials are used when SQL login calls `OPENROWSET` function without `DATA_SOURCE` to read files on some storage account. The name of server-scoped credential **must** match the base URL of Azure storage (optionally followed by a container name). A credential is added by running [CREATE CREDENTIAL](/sql/t-sql/statements/create-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true). You'll need to provide a CREDENTIAL NAME argument.
+```sql
+GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[database-scoped_credential] TO [specific_user];
+```
+
+## Server-level credential
+
+Server-level credentials are used when SQL login calls `OPENROWSET` function without `DATA_SOURCE` to read files on some storage account. The name of server-level credential **must** match the base URL of Azure storage (optionally followed by a container name). A credential is added by running [CREATE CREDENTIAL](/sql/t-sql/statements/create-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true). You'll need to provide a CREDENTIAL NAME argument.

> [!NOTE]
> The `FOR CRYPTOGRAPHIC PROVIDER` argument is not supported.
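
For context on the workflow this hunk documents, here is a minimal sketch (not part of the commit) of creating a server-level credential and then reading a file ad hoc with `OPENROWSET`. The storage account, container, path, and SAS token are placeholder values, and SAS authentication is only one of the supported options:

```sql
-- Hypothetical example, assuming SAS authentication and placeholder values.
-- The credential name must match the storage URL (optionally including the container).
CREATE CREDENTIAL [https://<storage_account>.dfs.core.windows.net/<container>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas_token>';
GO

-- An ad-hoc OPENROWSET query without DATA_SOURCE resolves the matching server-level credential.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<storage_account>.dfs.core.windows.net/<container>/<path>/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;
```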
@@ -267,7 +281,7 @@ Server-level CREDENTIAL name must match the full path to the storage account (an
| Azure Data Lake Storage Gen1 | https | <storage_account>.azuredatalakestore.net/webhdfs/v1 |
| Azure Data Lake Storage Gen2 | https | <storage_account>.dfs.core.windows.net |

-Server-scoped credentials enable access to Azure storage using the following authentication types:
+Server-level credentials enable access to Azure storage using the following authentication types:

### [User Identity](#tab/user-identity)

@@ -314,7 +328,7 @@ Optionally, you can use just the base URL of the storage account, without contai

### [Public access](#tab/public-access)

-Database scoped credential isn't required to allow access to publicly available files. Create [data source without database scoped credential](develop-tables-external-tables.md?tabs=sql-ondemand#example-for-create-external-data-source) to access publicly available files on Azure storage.
+Server-level credential isn't required to allow access to publicly available files. Create [data source without credential](develop-tables-external-tables.md?tabs=sql-ondemand#example-for-create-external-data-source) to access publicly available files on Azure storage.

---

@@ -392,7 +406,7 @@ The database scoped credential doesn't need to match the name of storage account

### [Public access](#tab/public-access)

-Database scoped credential isn't required to allow access to publicly available files. Create [data source without database scoped credential](develop-tables-external-tables.md?tabs=sql-ondemand#example-for-create-external-data-source) to access publicly available files on Azure storage.
+Database scoped credential isn't required to allow access to publicly available files. Create [data source without credential](develop-tables-external-tables.md?tabs=sql-ondemand#example-for-create-external-data-source) to access publicly available files on Azure storage.

```sql
CREATE EXTERNAL DATA SOURCE mysample
@@ -421,7 +435,7 @@ CREATE EXTERNAL FILE FORMAT [SynapseParquetFormat]
WITH ( FORMAT_TYPE = PARQUET)
GO
CREATE EXTERNAL DATA SOURCE publicData
-WITH ( LOCATION = 'https://<storage_account>.dfs.core.windows.net/<public_container>/<path>' )
+WITH ( LOCATION = 'https://<storage_account>.dfs.core.windows.net/<public_container>/<path>' )
GO

CREATE EXTERNAL TABLE dbo.userPublicData ( [id] int, [first_name] varchar(8000), [last_name] varchar(8000) )
@@ -468,7 +482,7 @@ CREATE EXTERNAL FILE FORMAT [SynapseParquetFormat] WITH ( FORMAT_TYPE = PARQUET)
GO

CREATE EXTERNAL DATA SOURCE mysample
-WITH ( LOCATION = 'https://<storage_account>.dfs.core.windows.net/<container>/<path>'
+WITH ( LOCATION = 'https://<storage_account>.dfs.core.windows.net/<container>/<path>'
-- Uncomment one of these options depending on authentication method that you want to use to access data source:
--,CREDENTIAL = WorkspaceIdentity
--,CREDENTIAL = SasCredential
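
For context on the commented `CREDENTIAL` options in the hunk above, here is a minimal sketch (not part of the commit) of how database scoped credentials named `SasCredential` and `WorkspaceIdentity` might be created before an external data source references them; the master key password and SAS token are placeholders:

```sql
-- Hypothetical setup: a database master key is required before creating database scoped credentials.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong_password>';
GO

-- Credential backed by a shared access signature token (placeholder secret).
CREATE DATABASE SCOPED CREDENTIAL SasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas_token>';
GO

-- Credential backed by the workspace managed identity.
CREATE DATABASE SCOPED CREDENTIAL WorkspaceIdentity
WITH IDENTITY = 'Managed Identity';
GO
```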

articles/synapse-analytics/sql/overview-features.md

Lines changed: 2 additions & 2 deletions
@@ -83,12 +83,12 @@ Synapse SQL pools enable you to use built-in security features to secure your da
| **SQL username/password authentication**| Yes | Yes, users can access serverless SQL pool using their usernames and passwords. |
| **Azure Active Directory (Azure AD) authentication**| Yes, Azure AD users | Yes, Azure AD logins and users can access serverless SQL pools using their Azure AD identities. |
| **Storage Azure Active Directory (Azure AD) passthrough authentication** | Yes | Yes, [Azure AD passthrough authentication](develop-storage-files-storage-access-control.md?tabs=user-identity#supported-storage-authorization-types) is applicable to Azure AD logins. The identity of the Azure AD user is passed to the storage if a credential is not specified. Azure AD passthrough authentication is not available for the SQL users. |
-| **Storage shared access signature (SAS) token authentication** | No | Yes, using [DATABASE SCOPED CREDENTIAL](/sql/t-sql/statements/create-database-scoped-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true) with [shared access signature token](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#database-scoped-credential) in [EXTERNAL DATA SOURCE](/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true) or instance-level [CREDENTIAL](/sql/t-sql/statements/create-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true) with [shared access signature](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-scoped-credential). |
+| **Storage shared access signature (SAS) token authentication** | No | Yes, using [DATABASE SCOPED CREDENTIAL](/sql/t-sql/statements/create-database-scoped-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true) with [shared access signature token](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#database-scoped-credential) in [EXTERNAL DATA SOURCE](/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true) or instance-level [CREDENTIAL](/sql/t-sql/statements/create-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true) with [shared access signature](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-level-credential). |
| **Storage Access Key authentication** | Yes, using [DATABASE SCOPED CREDENTIAL](/sql/t-sql/statements/create-database-scoped-credential-transact-sql?view=azure-sqldw-latest&preserve-view=true) in [EXTERNAL DATA SOURCE](/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true) | No, [use SAS token](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#database-scoped-credential) instead of storage access key. |
| **Storage [Managed Identity](../../data-factory/data-factory-service-identity.md?context=/azure/synapse-analytics/context/context&tabs=synapse-analytics) authentication** | Yes, using [Managed Service Identity Credential](/azure/azure-sql/database/vnet-service-endpoint-rule-overview?preserve-view=true&toc=%2fazure%2fsynapse-analytics%2ftoc.json&view=azure-sqldw-latest&preserve-view=true) | Yes, The query can access the storage using the workspace [Managed Identity](develop-storage-files-storage-access-control.md?tabs=managed-identity#database-scoped-credential) credential. |
| **Storage Application identity/Service principal (SPN) authentication** | [Yes](/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest&preserve-view=true) | Yes, you can create a [credential](develop-storage-files-storage-access-control.md?tabs=service-principal#database-scoped-credential) with a [service principal application ID](develop-storage-files-storage-access-control.md?tabs=service-principal#supported-storage-authorization-types) that will be used to authenticate on the storage. |
| **Server roles** | No | Yes, sysadmin, public, and other server-roles are supported. |
-| **SERVER SCOPED CREDENTIAL** | No | Yes, the [server scoped credentials](develop-storage-files-storage-access-control.md?tabs=user-identity#server-scoped-credential) are used by the `OPENROWSET` function that do not uses explicit data source. |
+| **SERVER LEVEL CREDENTIAL** | No | Yes, the [server level credentials](develop-storage-files-storage-access-control.md?tabs=user-identity#server-level-credential) are used by the `OPENROWSET` function that do not uses explicit data source. |
| **Permissions - [Server-level](/sql/relational-databases/security/authentication-access/server-level-roles)** | No | Yes, for example, `CONNECT ANY DATABASE` and `SELECT ALL USER SECURABLES` enable a user to read data from any databases. |
| **Database roles** | Yes | Yes, you can use `db_owner`, `db_datareader` and `db_ddladmin` roles. |
| **DATABASE SCOPED CREDENTIAL** | Yes, used in external data sources. | Yes, database scoped credentials can be used in external data sources to [define storage authentication method](develop-storage-files-storage-access-control.md?tabs=user-identity.md#database-scoped-credential). |

articles/synapse-analytics/sql/query-delta-lake-format.md

Lines changed: 1 addition & 1 deletion
@@ -62,7 +62,7 @@ To improve the performance of your queries, consider specifying explicit types i
> The serverless Synapse SQL pool uses schema inference to automatically determine columns and their types. The rules for schema inference are the same used for Parquet files.
> For Delta Lake type mapping to SQL native type check [type mapping for Parquet](develop-openrowset.md#type-mapping-for-parquet).

-Make sure you can access your file. If your file is protected with SAS key or custom Azure identity, you will need to set up a [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-scoped-credential).
+Make sure you can access your file. If your file is protected with SAS key or custom Azure identity, you will need to set up a [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-level-credential).

> [!IMPORTANT]
> Ensure you are using a UTF-8 database collation (for example `Latin1_General_100_BIN2_UTF8`) because string values in Delta Lake files are encoded using UTF-8 encoding.
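
For context, a minimal sketch (not part of the commit) of the kind of Delta Lake query this article describes, with placeholder storage account, container, and folder values; it assumes the data is public or a matching credential is already in place:

```sql
-- Hypothetical example: read a Delta Lake folder directly with OPENROWSET.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://<storage_account>.dfs.core.windows.net/<container>/<delta_folder>/',
    FORMAT = 'DELTA'
) AS rows;
```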

articles/synapse-analytics/sql/query-json-files.md

Lines changed: 1 addition & 1 deletion
@@ -52,7 +52,7 @@ from openrowset(
) with (doc nvarchar(max)) as rows
```

-The JSON document in the preceding sample query includes an array of objects. The query returns each object as a separate row in the result set. Make sure that you can access this file. If your file is protected with SAS key or custom identity, you would need to set up [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-scoped-credential).
+The JSON document in the preceding sample query includes an array of objects. The query returns each object as a separate row in the result set. Make sure that you can access this file. If your file is protected with SAS key or custom identity, you would need to set up [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-level-credential).

### Data source usage

articles/synapse-analytics/sql/query-parquet-files.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ from openrowset(
format = 'parquet') as rows
```

-Make sure that you can access this file. If your file is protected with SAS key or custom Azure identity, you would need to set up [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-scoped-credential).
+Make sure that you can access this file. If your file is protected with SAS key or custom Azure identity, you would need to set up [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-level-credential).

> [!IMPORTANT]
> Ensure you are using a UTF-8 database collation (for example `Latin1_General_100_BIN2_UTF8`) because string values in PARQUET files are encoded using UTF-8 encoding.

articles/synapse-analytics/sql/query-single-csv-file.md

Lines changed: 1 addition & 1 deletion
@@ -39,7 +39,7 @@ from openrowset(
firstrow = 2 ) as rows
```

-Option `firstrow` is used to skip the first row in the CSV file that represents header in this case. Make sure that you can access this file. If your file is protected with SAS key or custom identity, your would need to setup [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-scoped-credential).
+Option `firstrow` is used to skip the first row in the CSV file that represents header in this case. Make sure that you can access this file. If your file is protected with SAS key or custom identity, your would need to setup [server level credential for sql login](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-level-credential).

> [!IMPORTANT]
> If your CSV file contains UTF-8 characters, make sure that you are using a UTF-8 database collation (for example `Latin1_General_100_CI_AS_SC_UTF8`).

articles/synapse-analytics/sql/resources-self-help-sql-on-demand.md

Lines changed: 1 addition & 1 deletion
@@ -61,7 +61,7 @@ If you moved a subscription to another Azure AD tenant, you might experience som

If you get errors while you try to access files in Azure storage, make sure that you have permission to access data. You should be able to access publicly available files. If you try to access data without credentials, make sure that your Azure Active Directory (Azure AD) identity can directly access the files.

-If you have a shared access signature key that you should use to access files, make sure that you created a [server-level](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-scoped-credential) or [database-scoped](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#database-scoped-credential) credential that contains that credential. The credentials are required if you need to access data by using the workspace [managed identity](develop-storage-files-storage-access-control.md?tabs=managed-identity#database-scoped-credential) and custom [service principal name (SPN)](develop-storage-files-storage-access-control.md?tabs=service-principal#database-scoped-credential).
+If you have a shared access signature key that you should use to access files, make sure that you created a [server-level](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#server-level-credential) or [database-scoped](develop-storage-files-storage-access-control.md?tabs=shared-access-signature#database-scoped-credential) credential that contains that credential. The credentials are required if you need to access data by using the workspace [managed identity](develop-storage-files-storage-access-control.md?tabs=managed-identity#database-scoped-credential) and custom [service principal name (SPN)](develop-storage-files-storage-access-control.md?tabs=service-principal#database-scoped-credential).

### Can't read, list, or access files in Azure Data Lake Storage
