
Commit 703b0a6

Merge pull request #210750 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents 530103b + 3afd960 commit 703b0a6

File tree

6 files changed (+34 −26 lines)


articles/active-directory/saas-apps/infor-cloudsuite-provisioning-tutorial.md

Lines changed: 8 additions & 0 deletions
@@ -68,6 +68,14 @@ Before configuring and enabling automatic user provisioning, you should decide w
 5. To generate the bearer token, copy the **User Identifier** and **SCIM Password**. Paste them into Notepad++ separated by a colon. Encode the string value by navigating to **Plugins > MIME Tools > Base64 Encode**.

    :::image type="content" source="media/infor-cloudsuite-provisioning-tutorial/token.png" alt-text="Screenshot of a Notepad++ document. In the Plugins menu, MIME tools is highlighted. In the MIME tools menu, Base64 encode is highlighted." border="false":::
+
+   To generate the bearer token using PowerShell instead of Notepad++, use the following commands:
+   ```powershell
+   $Identifier = "<User Identifier>"
+   $SCIMPassword = "<SCIM Password>"
+   $bytes = [System.Text.Encoding]::UTF8.GetBytes("$($Identifier):$($SCIMPassword)")
+   [Convert]::ToBase64String($bytes)
+   ```

 3. Copy the bearer token. This value will be entered in the Secret Token field in the Provisioning tab of your Infor CloudSuite application in the Azure portal.
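
As a quick sanity check on the PowerShell route added above, decoding the Base64 value should round-trip to the original `<User Identifier>:<SCIM Password>` string — a minimal sketch reusing the variables from the added snippet:

```powershell
# Assumes $bytes from the snippet above; prints "<User Identifier>:<SCIM Password>"
# if the encoding round-trips correctly.
$token = [Convert]::ToBase64String($bytes)
[System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($token))
```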

articles/azure-web-pubsub/concept-service-internals.md

Lines changed: 1 addition & 1 deletion
@@ -239,7 +239,7 @@ The server is by nature an authorized user. With the help of the *event handler

 It can also grant or revoke publish/join permissions for a PubSub client:
 - Grant publish/join permissions to some specific group or to all groups
-- Revoke publish/joinh permissions for some specific group or for all groups
+- Revoke publish/join permissions for some specific group or for all groups
 - Check if the client has permission to join or publish to some specific group or to all groups

 The service provides REST APIs for the server to do connection management.

articles/data-factory/connector-azure-blob-storage.md

Lines changed: 22 additions & 22 deletions
@@ -1,5 +1,5 @@
 ---
-title: Copy and transform data in Azure Blob storage
+title: Copy and transform data in Azure Blob Storage
 titleSuffix: Azure Data Factory & Azure Synapse
 description: Learn how to copy data to and from Blob storage, and transform data in Blob storage using Azure Data Factory or Azure Synapse Analytics.
 ms.author: jianleishen
@@ -11,22 +11,22 @@ ms.custom: synapse
 ms.date: 08/24/2022
 ---

-# Copy and transform data in Azure Blob storage by using Azure Data Factory or Azure Synapse Analytics
+# Copy and transform data in Azure Blob Storage by using Azure Data Factory or Azure Synapse Analytics

 > [!div class="op_single_selector" title1="Select the version of Data Factory service you're using:"]
 > - [Version 1](v1/data-factory-azure-blob-connector.md)
 > - [Current version](connector-azure-blob-storage.md)

 [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]

-This article outlines how to use the Copy activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to Azure Blob storage. It also describes how to use the Data Flow activity to transform data in Azure Blob storage. To learn more read the [Azure Data Factory](introduction.md) and the [Azure Synapse Analytics](..\synapse-analytics\overview-what-is.md) introduction articles.
+This article outlines how to use the Copy activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to Azure Blob Storage. It also describes how to use the Data Flow activity to transform data in Azure Blob Storage. To learn more, read the [Azure Data Factory](introduction.md) and the [Azure Synapse Analytics](..\synapse-analytics\overview-what-is.md) introduction articles.

 >[!TIP]
 >To learn about a migration scenario for a data lake or a data warehouse, see the article [Migrate data from your data lake or data warehouse to Azure](data-migration-guidance-overview.md).

 ## Supported capabilities

-This Azure Blob storage connector is supported for the following capabilities:
+This Azure Blob Storage connector is supported for the following capabilities:

 | Supported capabilities|IR | Managed private endpoint|
 |---------| --------| --------|
@@ -91,7 +91,7 @@ This Blob storage connector supports the following authentication types. See the
 >- When you use PolyBase or COPY statement to load data into Azure Synapse Analytics, if your source or staging Blob storage is configured with an Azure Virtual Network endpoint, you must use managed identity authentication as required by Azure Synapse. See the [Managed identity authentication](#managed-identity) section for more configuration prerequisites.

 >[!NOTE]
->Azure HDInsight and Azure Machine Learning activities only support authentication that uses Azure Blob storage account keys.
+>Azure HDInsight and Azure Machine Learning activities only support authentication that uses Azure Blob Storage account keys.

 ### Account key authentication
@@ -244,17 +244,17 @@ To use service principal authentication, follow these steps:
     - Application key
     - Tenant ID

-2. Grant the service principal proper permission in Azure Blob storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
+2. Grant the service principal proper permission in Azure Blob Storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).

     - **As source**, in **Access control (IAM)**, grant at least the **Storage Blob Data Reader** role.
     - **As sink**, in **Access control (IAM)**, grant at least the **Storage Blob Data Contributor** role.

-These properties are supported for an Azure Blob storage linked service:
+These properties are supported for an Azure Blob Storage linked service:

 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The **type** property must be set to **AzureBlobStorage**. | Yes |
-| serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
+| serviceEndpoint | Specify the Azure Blob Storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
 | accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using the Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when the account kind is empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
 | servicePrincipalId | Specify the application's client ID. | Yes |
 | servicePrincipalCredentialType | The credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes |
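
To see the renamed properties in context, here's a minimal, hypothetical sketch of a service principal linked service definition deployed with the Az.DataFactory PowerShell module. The `servicePrincipalCredential` and `tenant` fields follow the connector's documented JSON shape but don't appear in the table excerpt above, and all `<...>` values are placeholders:

```powershell
# Minimal sketch: an AzureBlobStorage linked service using service principal
# authentication, deployed with Az.DataFactory. All <...> values are placeholders.
$definition = @'
{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<accountName>.blob.core.windows.net/",
            "accountKind": "StorageV2",
            "servicePrincipalId": "<application (client) ID>",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalCredential": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant ID>"
        }
    }
}
'@
Set-Content -Path .\AzureBlobStorageLinkedService.json -Value $definition

Set-AzDataFactoryV2LinkedService -ResourceGroupName "<resource group>" `
    -DataFactoryName "<factory name>" `
    -Name "AzureBlobStorageLinkedService" `
    -DefinitionFile .\AzureBlobStorageLinkedService.json
```
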
@@ -304,17 +304,17 @@ For general information about Azure Storage authentication, see [Authenticate ac

 1. [Retrieve system-assigned managed identity information](data-factory-service-identity.md#retrieve-managed-identity) by copying the value of the system-assigned managed identity object ID generated along with your factory or Synapse workspace.

-2. Grant the managed identity permission in Azure Blob storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
+2. Grant the managed identity permission in Azure Blob Storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).

     - **As source**, in **Access control (IAM)**, grant at least the **Storage Blob Data Reader** role.
     - **As sink**, in **Access control (IAM)**, grant at least the **Storage Blob Data Contributor** role.

-These properties are supported for an Azure Blob storage linked service:
+These properties are supported for an Azure Blob Storage linked service:

 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The **type** property must be set to **AzureBlobStorage**. | Yes |
-| serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
+| serviceEndpoint | Specify the Azure Blob Storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
 | accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using the Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when the account kind is empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
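
As a hedged illustration of how little the payload needs beyond `serviceEndpoint` when the factory's system-assigned managed identity supplies the credential, a minimal definition might look like this (placeholders throughout; deploy it the same way as the service principal sketch above):

```powershell
# Minimal sketch: with system-assigned managed identity, no explicit credential
# properties are required in the linked service definition.
$definition = @'
{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<accountName>.blob.core.windows.net/"
        }
    }
}
'@
Set-Content -Path .\AzureBlobStorageLinkedService.json -Value $definition
```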

@@ -342,20 +342,20 @@ A data factory can be assigned with one or multiple [user-assigned managed ident

 For general information about Azure storage authentication, see [Authenticate access to Azure Storage using Azure Active Directory](../storage/blobs/authorize-access-azure-active-directory.md). To use user-assigned managed identity authentication, follow these steps:

-1. [Create one or multiple user-assigned managed identities](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md) and grant permission in Azure Blob storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
+1. [Create one or multiple user-assigned managed identities](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md) and grant permission in Azure Blob Storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).

     - **As source**, in **Access control (IAM)**, grant at least the **Storage Blob Data Reader** role.
     - **As sink**, in **Access control (IAM)**, grant at least the **Storage Blob Data Contributor** role.

 2. Assign one or multiple user-assigned managed identities to your data factory and [create credentials](credentials.md) for each user-assigned managed identity.

-These properties are supported for an Azure Blob storage linked service:
+These properties are supported for an Azure Blob Storage linked service:

 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The **type** property must be set to **AzureBlobStorage**. | Yes |
-| serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
+| serviceEndpoint | Specify the Azure Blob Storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
 | accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using the Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when the account kind is empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
 | credentials | Specify the user-assigned managed identity as the credential object. | Yes |
 | connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
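
To show where the credential object sits, here's a minimal sketch under the same assumptions; `<credential name>` refers to a credential created in step 2, and the `credential`/`CredentialReference` shape follows the connector's documented JSON samples:

```powershell
# Minimal sketch: a user-assigned managed identity is passed as a reference to
# a credential created beforehand in the factory.
$definition = @'
{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<accountName>.blob.core.windows.net/",
            "credential": {
                "referenceName": "<credential name>",
                "type": "CredentialReference"
            }
        }
    }
}
'@
Set-Content -Path .\AzureBlobStorageLinkedService.json -Value $definition
```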
@@ -400,7 +400,7 @@ For a full list of sections and properties available for defining datasets, see

 [!INCLUDE [data-factory-v2-file-formats](includes/data-factory-v2-file-formats.md)]

-The following properties are supported for Azure Blob storage under `location` settings in a format-based dataset:
+The following properties are supported for Azure Blob Storage under `location` settings in a format-based dataset:

 | Property | Description | Required |
 | ---------- | ------------------------------------------------------------ | -------- |
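
As a concrete, hypothetical example of the `location` block these properties describe, a delimited-text dataset pointing at a blob folder could be defined like this, reusing the linked service name from the sketches above (all names are placeholders):

```powershell
# Minimal sketch: a DelimitedText dataset whose location points at an
# Azure Blob Storage container and folder.
$dataset = @'
{
    "name": "DelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "<container>",
                "folderPath": "<folder>/<subfolder>"
            }
        }
    }
}
'@
Set-Content -Path .\DelimitedTextDataset.json -Value $dataset
```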
@@ -444,7 +444,7 @@ For a full list of sections and properties available for defining activities, se

 [!INCLUDE [data-factory-v2-file-formats](includes/data-factory-v2-file-formats.md)]

-The following properties are supported for Azure Blob storage under `storeSettings` settings in a format-based copy source:
+The following properties are supported for Azure Blob Storage under `storeSettings` settings in a format-based copy source:

 | Property | Description | Required |
 | ------------------------ | ------------------------------------------------------------ | --------------------------------------------- |
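
For orientation, a hedged sketch of how `storeSettings` appears inside a copy activity source; the `AzureBlobStorageReadSettings` type and `recursive`/`wildcardFileName` names follow the connector's documented schema, and everything else is placeholder:

```powershell
# Minimal sketch: the source side of a copy activity, with storeSettings of
# type AzureBlobStorageReadSettings reading all CSV files recursively.
$source = @'
{
    "source": {
        "type": "DelimitedTextSource",
        "storeSettings": {
            "type": "AzureBlobStorageReadSettings",
            "recursive": true,
            "wildcardFileName": "*.csv"
        }
    }
}
'@
```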
@@ -515,7 +515,7 @@ The following properties are supported for Azure Blob storage under `storeSettin

 [!INCLUDE [data-factory-v2-file-sink-formats](includes/data-factory-v2-file-sink-formats.md)]

-The following properties are supported for Azure Blob storage under `storeSettings` settings in a format-based copy sink:
+The following properties are supported for Azure Blob Storage under `storeSettings` settings in a format-based copy sink:

 | Property | Description | Required |
 | ------------------------ | ------------------------------------------------------------ | -------- |
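
And the matching sink side, again as a hedged sketch; `AzureBlobStorageWriteSettings` and `copyBehavior` are the documented names, while the specific values are illustrative:

```powershell
# Minimal sketch: the sink side of a copy activity, writing with storeSettings
# of type AzureBlobStorageWriteSettings and flattening the folder hierarchy.
$sink = @'
{
    "sink": {
        "type": "DelimitedTextSink",
        "storeSettings": {
            "type": "AzureBlobStorageWriteSettings",
            "copyBehavior": "FlattenHierarchy"
        }
    }
}
'@
```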
@@ -610,11 +610,11 @@ This section describes the resulting behavior of the Copy operation for differen

 ## Preserving metadata during copy

-When you copy files from Amazon S3, Azure Blob storage, or Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2 or Azure Blob storage, you can choose to preserve the file metadata along with data. Learn more from [Preserve metadata](copy-activity-preserve-metadata.md#preserve-metadata).
+When you copy files from Amazon S3, Azure Blob Storage, or Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2 or Azure Blob Storage, you can choose to preserve the file metadata along with data. Learn more from [Preserve metadata](copy-activity-preserve-metadata.md#preserve-metadata).

 ## Mapping data flow properties

-When you're transforming data in mapping data flows, you can read and write files from Azure Blob storage in the following formats:
+When you're transforming data in mapping data flows, you can read and write files from Azure Blob Storage in the following formats:

 - [Avro](format-avro.md#mapping-data-flow-properties)
 - [Delimited text](format-delimited-text.md#mapping-data-flow-properties)
@@ -627,7 +627,7 @@ Format specific settings are located in the documentation for that format. For m

 ### Source transformation

-In source transformation, you can read from a container, folder, or individual file in Azure Blob storage. Use the **Source options** tab to manage how the files are read.
+In source transformation, you can read from a container, folder, or individual file in Azure Blob Storage. Use the **Source options** tab to manage how the files are read.

 :::image type="content" source="media/data-flow/sourceOptions1.png" alt-text="Source options":::
@@ -690,7 +690,7 @@ In this case, all files that were sourced under `/data/sales` are moved to `/bac

 ### Sink properties

-In the sink transformation, you can write to either a container or a folder in Azure Blob storage. Use the **Settings** tab to manage how the files get written.
+In the sink transformation, you can write to either a container or a folder in Azure Blob Storage. Use the **Settings** tab to manage how the files get written.

 :::image type="content" source="media/data-flow/file-sink-settings.png" alt-text="Sink options":::
@@ -745,7 +745,7 @@ To learn details about the properties, check [Delete activity](delete-activity.m
 "properties": {
     "type": "AzureBlob",
     "linkedServiceName": {
-        "referenceName": "<Azure Blob storage linked service name>",
+        "referenceName": "<Azure Blob Storage linked service name>",
         "type": "LinkedServiceReference"
     },
     "typeProperties": {

articles/static-web-apps/custom-domain.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ The following table includes links to articles that demonstrate how to configure

 ## About domains

-Setting up an apex domain is a common scenario to configure once your domain name is set up. Creating an apex domain is achieved by configuring an `ALIAS` or `ANAME` record or through `CNAME` flattening. Some domain registrars like GoDaddy and Google don't support these DNS records. If your domain registrar doesn't support the all the DNS records you need, consider using [Azure DNS to configure your domain](custom-domain-azure-dns.md).
+Setting up an apex domain is a common scenario to configure once your domain name is set up. Creating an apex domain is achieved by configuring an `ALIAS` or `ANAME` record or through `CNAME` flattening. Some domain registrars like GoDaddy and Google don't support these DNS records. If your domain registrar doesn't support all the DNS records you need, consider using [Azure DNS to configure your domain](custom-domain-azure-dns.md).

 The following are terms you'll encounter as you set up a custom domain.

articles/storage/blobs/data-lake-storage-best-practices.md

Lines changed: 1 addition & 1 deletion
@@ -161,7 +161,7 @@ Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lak

 There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.

-For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.
+For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server logs, which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.

 Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.
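
To make the "custom scripts" suggestion in the changed paragraph concrete, here's a minimal sketch that uploads a local web server log with the Az.Storage PowerShell module; the account, container, and file names are illustrative placeholders, and an authenticated Az session is assumed:

```powershell
# Upload one day's web server log into a Data Lake Storage Gen2 enabled account.
# Assumes Connect-AzAccount has already been run; all names are placeholders.
$context = New-AzStorageContext -StorageAccountName "<storage account>" -UseConnectedAccount

Set-AzStorageBlobContent -Context $context `
    -Container "raw-logs" `
    -File ".\access-2022-08-24.log" `
    -Blob "webserver/2022/08/24/access.log"
```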
