Commit 12b04d7

Merge pull request #228629 from jess-hu-340/0227-add-anonymous-auth
[Supportability] Update anonymous auth to blob doc
2 parents db86995 + 352af12

File tree

2 files changed: +40 −1 lines


articles/data-factory/connector-azure-blob-storage.md

Lines changed: 40 additions & 1 deletion
@@ -80,6 +80,7 @@ The following sections provide details about properties that are used to define
This Blob storage connector supports the following authentication types. See the corresponding sections for details.

- [Anonymous authentication](#anonymous-authentication)
- [Account key authentication](#account-key-authentication)
- [Shared access signature authentication](#shared-access-signature-authentication)
- [Service principal authentication](#service-principal-authentication)
@@ -93,6 +94,44 @@ This Blob storage connector supports the following authentication types. See the
>[!NOTE]
>Azure HDInsight and Azure Machine Learning activities only support authentication that uses Azure Blob Storage account keys.
### Anonymous authentication

The following properties are supported for anonymous authentication in Azure Data Factory or Synapse pipelines:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The `type` property must be set to `AzureBlobStorage` (suggested) or `AzureStorage` (see the following notes). | Yes |
| containerUri | Specify the URI of the Azure Blob container that has anonymous read access enabled, in the format `https://<AccountName>.blob.core.windows.net/<ContainerName>`. See [Configure anonymous public read access for containers and blobs](/azure/storage/blobs/anonymous-read-access-configure#set-the-public-access-level-for-a-container). | Yes |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
**Example:**

```json
{
    "name": "AzureBlobStorageAnonymous",
    "properties": {
        "annotations": [],
        "type": "AzureBlobStorage",
        "typeProperties": {
            "containerUri": "https://<accountname>.blob.core.windows.net/<containername>",
            "authenticationType": "Anonymous"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```
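As an editor's aside (not part of the connector doc): before wiring a container URI into the linked service, you can sanity-check it with a small helper that builds the URI and, optionally, probes it with an unauthenticated List Blobs call (`?restype=container&comp=list` in the Blob service REST API). The function names here are illustrative.

```python
from urllib.parse import quote
from urllib.request import urlopen
from urllib.error import HTTPError


def build_container_uri(account_name: str, container_name: str) -> str:
    """Build the containerUri value expected by the linked service."""
    return f"https://{account_name}.blob.core.windows.net/{quote(container_name)}"


def allows_anonymous_read(container_uri: str) -> bool:
    """Probe the container with an unauthenticated List Blobs request.

    Returns True when the container is publicly readable; an HTTP error
    (for example 403 or 404) means anonymous access is disabled or the
    URI is wrong. Requires network access.
    """
    try:
        with urlopen(f"{container_uri}?restype=container&comp=list") as resp:
            return resp.status == 200
    except HTTPError:
        return False


uri = build_container_uri("pandemicdatalake", "public")
print(uri)  # https://pandemicdatalake.blob.core.windows.net/public
```

If the probe returns `False` for a URI you expect to be public, check the container's public access level in the linked storage-account settings.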
128+
**UI example:**

This sample uses an Azure Open Dataset as the source. To read the open [dataset bing_covid-19_data.csv](https://pandemicdatalake.blob.core.windows.net/public/curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv), choose **Anonymous** as the **Authentication type** and fill in the Container URI with `https://pandemicdatalake.blob.core.windows.net/public`.

:::image type="content" source="media/connector-azure-blob-storage/anonymous-ui.png" alt-text="Screenshot of configuration for Anonymous examples UI.":::
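The same open dataset can also be read anonymously outside the pipeline, which is a quick way to confirm that the container URI and blob path are correct. A minimal stdlib sketch (the `blob_url` helper is illustrative, and `read_header` requires network access):

```python
import csv
import io
from urllib.request import urlopen

CONTAINER_URI = "https://pandemicdatalake.blob.core.windows.net/public"
BLOB_PATH = "curated/covid-19/bing_covid-19_data/latest/bing_covid-19_data.csv"


def blob_url(container_uri: str, blob_path: str) -> str:
    """Join the container URI and the blob path into a full blob URL."""
    return f"{container_uri.rstrip('/')}/{blob_path.lstrip('/')}"


def read_header(url: str) -> list[str]:
    """Download the blob anonymously and return its CSV header row."""
    with urlopen(url) as resp:
        reader = csv.reader(io.TextIOWrapper(resp, encoding="utf-8"))
        return next(reader)


url = blob_url(CONTAINER_URI, BLOB_PATH)
print(url)
```

If the URL resolves in a browser or via `read_header`, the same container URI will work in the **Anonymous** linked-service configuration above.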
### Account key authentication

The following properties are supported for storage account key authentication in Azure Data Factory or Synapse pipelines:
@@ -849,7 +888,7 @@ To learn details about the properties, check [Delete activity](delete-activity.m
## Change data capture

Azure Data Factory can get new or changed files only from Azure Blob Storage by enabling **Enable change data capture** in the mapping data flow source transformation. With this connector option, you can read only new or updated files and apply transformations before loading the transformed data into destination datasets of your choice. Refer to [Change Data Capture](concepts-change-data-capture.md) for details.
