Commit ec63f47

Merge pull request #228840 from jess-hu-340/0228-add-sas-auth: [Supportability] Add SAS auth to ADLS Gen2 doc

2 parents 5922b7c + 4a0e951

File tree: 1 file changed (+80 −1 lines)


articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 80 additions & 1 deletion
```diff
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 09/01/2022
+ms.date: 02/28/2023
 ---

 # Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory or Azure Synapse Analytics
```
```diff
@@ -81,6 +81,7 @@ The following sections provide information about properties that are used to def
 The Azure Data Lake Storage Gen2 connector supports the following authentication types. See the corresponding sections for details:

 - [Account key authentication](#account-key-authentication)
+- [Shared access signature authentication](#shared-access-signature-authentication)
 - [Service principal authentication](#service-principal-authentication)
 - [System-assigned managed identity authentication](#managed-identity)
 - [User-assigned managed identity authentication](#user-assigned-managed-identity-authentication)
```
@@ -124,6 +125,84 @@ To use storage account key authentication, the following properties are supported (the following lines are all additions, shown as clean markdown):
### Shared access signature authentication

A shared access signature provides delegated access to resources in your storage account. You can use a shared access signature to grant a client limited permissions to objects in your storage account for a specified time.

You don't have to share your account access keys. The shared access signature is a URI that encompasses in its query parameters all the information necessary for authenticated access to a storage resource. To access storage resources with the shared access signature, the client only needs to pass the shared access signature to the appropriate constructor or method.

For more information about shared access signatures, see [Shared access signatures: Understand the shared access signature model](../storage/common/storage-sas-overview.md).

> [!NOTE]
> - The service now supports both *service shared access signatures* and *account shared access signatures*. For more information, see [Grant limited access to Azure Storage resources using shared access signatures](../storage/common/storage-sas-overview.md).
> - In later dataset configurations, the folder path is the absolute path starting from the container level. Configure a folder path that aligns with the path in your SAS URI.
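Concretely, all of the authorization information rides in the URI's query string. A minimal stdlib-Python sketch, using a made-up account, container, and signature, that pulls those parameters apart:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical SAS URI for illustration; the parameter names (sv, st, se,
# sr, sp, sig) are the standard SAS query parameters for service version,
# start time, expiry time, resource type, permissions, and signature.
sas_uri = (
    "https://myaccount.blob.core.windows.net/mycontainer"
    "?sv=2021-08-06&st=2023-02-28T00%3A00%3A00Z&se=2023-03-31T00%3A00%3A00Z"
    "&sr=c&sp=rl&spr=https&sig=redacted"
)

parts = urlsplit(sas_uri)
# parse_qs returns lists and percent-decodes values; flatten to single values.
params = {k: v[0] for k, v in parse_qs(parts.query).items()}

print(parts.netloc + parts.path)   # storage endpoint and container path
print(params["se"])                # expiry time (UTC), percent-decoded
print(params["sr"], params["sp"])  # resource type (c = container), permissions
```

Everything the client needs is in the endpoint plus these parameters; no account key changes hands.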
The following properties are supported for using shared access signature authentication:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The `type` property must be set to `AzureBlobFS` (suggested). | Yes |
| sasUri | Specify the shared access signature URI to the Storage resources such as blob or container. <br/>Mark this field as `SecureString` to store it securely. You can also put the SAS token in Azure Key Vault to use auto-rotation and remove the token portion. For more information, see the following samples and [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
> [!NOTE]
> If you're using the `AzureStorage` type linked service, it's still supported as is. But we suggest that you use the new `AzureDataLakeStorageGen2` linked service type going forward.
**Example:**

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "sasUri": {
                "type": "SecureString",
                "value": "<SAS URI of the Azure Storage resource e.g. https://<accountname>.blob.core.windows.net/?sv=<storage version>&st=<start time>&se=<expire time>&sr=<resource>&sp=<permissions>&sip=<ip range>&spr=<protocol>&sig=<signature>>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```
**Example: store the SAS token in Azure Key Vault**

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "sasUri": {
                "type": "SecureString",
                "value": "<SAS URI of the Azure Storage resource without token e.g. https://<accountname>.blob.core.windows.net/>"
            },
            "sasToken": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Azure Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secretName with value of SAS token e.g. ?sv=<storage version>&st=<start time>&se=<expire time>&sr=<resource>&sp=<permissions>&sip=<ip range>&spr=<protocol>&sig=<signature>>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```
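The split this pattern relies on, keeping the token-free resource URI in `sasUri` and the query-string token in the Key Vault secret, can be sketched in stdlib Python. The helper name and URI below are illustrative, not part of the service:

```python
from urllib.parse import urlsplit, urlunsplit

def split_sas_uri(sas_uri: str) -> tuple[str, str]:
    """Split a full SAS URI into (token-free URI, SAS token starting with '?')."""
    parts = urlsplit(sas_uri)
    # Rebuild the URI without its query string or fragment.
    base = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return base, "?" + parts.query

# Hypothetical account name and token values.
base, token = split_sas_uri(
    "https://myaccount.blob.core.windows.net/"
    "?sv=2021-08-06&se=2023-03-31T00%3A00%3A00Z&sig=abc"
)
print(base)   # value for the sasUri property
print(token)  # value for the Key Vault secret referenced by sasToken
```

Storing only `token` as the secret is what lets Key Vault rotation replace the token without touching the linked service definition.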
When you create a shared access signature URI, consider the following points:

- Set appropriate read/write permissions on objects based on how the linked service is used (read, write, or read/write).
- Set **Expiry time** appropriately. Make sure that access to Storage objects doesn't expire within the active period of the pipeline.
- Create the URI at the right container or blob based on the need. A shared access signature URI to a blob allows the data factory or Synapse pipeline to access that particular blob. A shared access signature URI to a Blob storage container allows the data factory or Synapse pipeline to iterate through blobs in that container. To provide access to more or fewer objects later, or to update the shared access signature URI, remember to update the linked service with the new URI.
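The expiry and permission checks above can be automated before a run. A hedged stdlib-Python sketch, where the helper `sas_covers` is illustrative and assumes `se` uses the full ISO 8601 UTC timestamp form, that verifies the token outlasts the pipeline's active window and carries every needed permission letter in `sp`:

```python
from datetime import datetime, timezone
from urllib.parse import urlsplit, parse_qs

def sas_covers(sas_uri: str, pipeline_end: datetime, required_perms: str) -> bool:
    """Return True if the SAS expiry (se) is after the pipeline's active window
    and every required permission letter appears in the sp parameter."""
    params = {k: v[0] for k, v in parse_qs(urlsplit(sas_uri).query).items()}
    expiry = datetime.strptime(
        params["se"], "%Y-%m-%dT%H:%M:%SZ"
    ).replace(tzinfo=timezone.utc)
    perms = params.get("sp", "")
    return expiry > pipeline_end and all(p in perms for p in required_perms)

# Hypothetical container SAS with read + list permissions, expiring 2023-03-31.
uri = "https://myaccount.blob.core.windows.net/c?se=2023-03-31T00%3A00%3A00Z&sp=rl&sig=x"
print(sas_covers(uri, datetime(2023, 3, 1, tzinfo=timezone.utc), "rl"))  # True
print(sas_covers(uri, datetime(2023, 3, 1, tzinfo=timezone.utc), "rw"))  # False: no write
```

A check like this in a pre-run validation step catches tokens that would expire mid-pipeline.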
### Service principal authentication