articles/active-directory/saas-apps/infor-cloudsuite-provisioning-tutorial.md (8 additions, 0 deletions)
@@ -68,6 +68,14 @@ Before configuring and enabling automatic user provisioning, you should decide w
5. To generate the bearer token, copy the **User Identifier** and **SCIM Password**. Paste them into Notepad++ separated by a colon. Encode the string value by navigating to **Plugins > MIME Tools > Base64 Encode**.
:::image type="content" source="media/infor-cloudsuite-provisioning-tutorial/token.png" alt-text="Screenshot of a Notepad++ document. In the Plugins menu, MIME tools is highlighted. In the MIME tools menu, Base64 encode is highlighted." border="false":::
+ To generate the bearer token using PowerShell instead of Notepad++, use the following commands:
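
A minimal PowerShell sketch of those commands (the placeholder values are assumptions; substitute your own **User Identifier** and **SCIM Password**):

```powershell
# Placeholder values (assumptions) - replace with the User Identifier and SCIM Password from Infor CloudSuite.
$userIdentifier = "<user-identifier>"
$scimPassword   = "<scim-password>"

# Join the two values with a colon and Base64-encode the UTF-8 bytes, mirroring the Notepad++ steps above.
$pair  = "{0}:{1}" -f $userIdentifier, $scimPassword
$bytes = [System.Text.Encoding]::UTF8.GetBytes($pair)
$token = [System.Convert]::ToBase64String($bytes)

# The encoded string is the value used as the bearer token.
Write-Output $token
```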
3. Copy the bearer token. This value will be entered in the Secret Token field in the Provisioning tab of your Infor CloudSuite application in the Azure portal.

- This article outlines how to use the Copy activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to Azure Blob storage. It also describes how to use the Data Flow activity to transform data in Azure Blob storage. To learn more read the [Azure Data Factory](introduction.md) and the [Azure Synapse Analytics](..\synapse-analytics\overview-what-is.md) introduction articles.
+ This article outlines how to use the Copy activity in Azure Data Factory and Azure Synapse pipelines to copy data from and to Azure Blob Storage. It also describes how to use the Data Flow activity to transform data in Azure Blob Storage. To learn more read the [Azure Data Factory](introduction.md) and the [Azure Synapse Analytics](..\synapse-analytics\overview-what-is.md) introduction articles.
>[!TIP]
>To learn about a migration scenario for a data lake or a data warehouse, see the article [Migrate data from your data lake or data warehouse to Azure](data-migration-guidance-overview.md).
## Supported capabilities
- This Azure Blob storage connector is supported for the following capabilities:
+ This Azure Blob Storage connector is supported for the following capabilities:
@@ -91,7 +91,7 @@ This Blob storage connector supports the following authentication types. See the
>- When you use PolyBase or COPY statement to load data into Azure Synapse Analytics, if your source or staging Blob storage is configured with an Azure Virtual Network endpoint, you must use managed identity authentication as required by Azure Synapse. See the [Managed identity authentication](#managed-identity) section for more configuration prerequisites.
>[!NOTE]
- >Azure HDInsight and Azure Machine Learning activities only support authentication that uses Azure Blob storage account keys.
+ >Azure HDInsight and Azure Machine Learning activities only support authentication that uses Azure Blob Storage account keys.
### Account key authentication
@@ -244,17 +244,17 @@ To use service principal authentication, follow these steps:
- Application key
- Tenant ID
- 2. Grant the service principal proper permission in Azure Blob storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
+ 2. Grant the service principal proper permission in Azure Blob Storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
- **As source**, in **Access control (IAM)**, grant at least the **Storage Blob Data Reader** role.
- **As sink**, in **Access control (IAM)**, grant at least the **Storage Blob Data Contributor** role.
- These properties are supported for an Azure Blob storage linked service:
+ These properties are supported for an Azure Blob Storage linked service:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The **type** property must be set to **AzureBlobStorage**. | Yes |
- | serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
+ | serviceEndpoint | Specify the Azure Blob Storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
| accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when account kind as empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
| servicePrincipalCredentialType | The credential type to use for service principal authentication. Allowed values are **ServicePrincipalKey** and **ServicePrincipalCert**. | Yes |
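
As a hedged illustration of how the properties in this table fit together, the following PowerShell sketch registers such a linked service with the Az.DataFactory module. The resource group, factory, linked service, and file names and all placeholder IDs and secrets are assumptions, and the service principal fields (`servicePrincipalId`, `servicePrincipalKey`, `tenant`) are the names this connector conventionally uses alongside the properties listed above.

```powershell
# Sketch of an Azure Blob Storage linked service definition that uses service principal authentication.
# Every <placeholder> below is an assumption to be replaced with your own values.
$definition = @'
{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<accountName>.blob.core.windows.net/",
            "accountKind": "StorageV2",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalId": "<application (client) ID>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<application key>"
            },
            "tenant": "<tenant ID>"
        }
    }
}
'@

# Write the definition to a local file, then register it with the factory.
$definition | Set-Content -Path .\AzureBlobStorageLinkedService.json
Set-AzDataFactoryV2LinkedService -ResourceGroupName "<resource group>" `
    -DataFactoryName "<data factory name>" `
    -Name "AzureBlobStorageLinkedService" `
    -DefinitionFile ".\AzureBlobStorageLinkedService.json"
```

Keeping the definition in a separate JSON file keeps each table property visible one per line; the cmdlet call is only one deployment option, and the same JSON can be pasted into the portal's JSON editor instead.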
@@ -304,17 +304,17 @@ For general information about Azure Storage authentication, see [Authenticate ac
1. [Retrieve system-assigned managed identity information](data-factory-service-identity.md#retrieve-managed-identity) by copying the value of the system-assigned managed identity object ID generated along with your factory or Synapse workspace.
- 2. Grant the managed identity permission in Azure Blob storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
+ 2. Grant the managed identity permission in Azure Blob Storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
- **As source**, in **Access control (IAM)**, grant at least the **Storage Blob Data Reader** role.
- **As sink**, in **Access control (IAM)**, grant at least the **Storage Blob Data Contributor** role.
- These properties are supported for an Azure Blob storage linked service:
+ These properties are supported for an Azure Blob Storage linked service:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The **type** property must be set to **AzureBlobStorage**. | Yes |
- | serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
+ | serviceEndpoint | Specify the Azure Blob Storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
| accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when account kind as empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
@@ -342,20 +342,20 @@ A data factory can be assigned with one or multiple [user-assigned managed ident
For general information about Azure storage authentication, see [Authenticate access to Azure Storage using Azure Active Directory](../storage/blobs/authorize-access-azure-active-directory.md). To use user-assigned managed identity authentication, follow these steps:
- 1. [Create one or multiple user-assigned managed identities](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md) and grant permission in Azure Blob storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
+ 1. [Create one or multiple user-assigned managed identities](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md) and grant permission in Azure Blob Storage. For more information on the roles, see [Use the Azure portal to assign an Azure role for access to blob and queue data](../storage/blobs/assign-azure-role-data-access.md).
- **As source**, in **Access control (IAM)**, grant at least the **Storage Blob Data Reader** role.
- **As sink**, in **Access control (IAM)**, grant at least the **Storage Blob Data Contributor** role.
2. Assign one or multiple user-assigned managed identities to your data factory and [create credentials](credentials.md) for each user-assigned managed identity.
- These properties are supported for an Azure Blob storage linked service:
+ These properties are supported for an Azure Blob Storage linked service:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The **type** property must be set to **AzureBlobStorage**. | Yes |
- | serviceEndpoint | Specify the Azure Blob storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
+ | serviceEndpoint | Specify the Azure Blob Storage service endpoint with the pattern of `https://<accountName>.blob.core.windows.net/`. | Yes |
| accountKind | Specify the kind of your storage account. Allowed values are: **Storage** (general purpose v1), **StorageV2** (general purpose v2), **BlobStorage**, or **BlockBlobStorage**. <br/><br/>When using Azure Blob linked service in data flow, managed identity or service principal authentication is not supported when account kind as empty or "Storage". Specify the proper account kind, choose a different authentication, or upgrade your storage account to general purpose v2. | No |
| credentials | Specify the user-assigned managed identity as the credential object. | Yes |
| connectVia | The [integration runtime](concepts-integration-runtime.md) to be used to connect to the data store. You can use the Azure integration runtime or the self-hosted integration runtime (if your data store is in a private network). If this property isn't specified, the service uses the default Azure integration runtime. | No |
@@ -400,7 +400,7 @@ For a full list of sections and properties available for defining datasets, see
@@ -610,11 +610,11 @@ This section describes the resulting behavior of the Copy operation for differen
## Preserving metadata during copy
- When you copy files from Amazon S3, Azure Blob storage, or Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2 or Azure Blob storage, you can choose to preserve the file metadata along with data. Learn more from [Preserve metadata](copy-activity-preserve-metadata.md#preserve-metadata).
+ When you copy files from Amazon S3, Azure Blob Storage, or Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2 or Azure Blob Storage, you can choose to preserve the file metadata along with data. Learn more from [Preserve metadata](copy-activity-preserve-metadata.md#preserve-metadata).
## Mapping data flow properties
- When you're transforming data in mapping data flows, you can read and write files from Azure Blob storage in the following formats:
+ When you're transforming data in mapping data flows, you can read and write files from Azure Blob Storage in the following formats:
@@ -627,7 +627,7 @@ Format specific settings are located in the documentation for that format. For m
### Source transformation
- In source transformation, you can read from a container, folder, or individual file in Azure Blob storage. Use the **Source options** tab to manage how the files are read.
+ In source transformation, you can read from a container, folder, or individual file in Azure Blob Storage. Use the **Source options** tab to manage how the files are read.
@@ -690,7 +690,7 @@ In this case, all files that were sourced under `/data/sales` are moved to `/bac
### Sink properties
- In the sink transformation, you can write to either a container or a folder in Azure Blob storage. Use the **Settings** tab to manage how the files get written.
+ In the sink transformation, you can write to either a container or a folder in Azure Blob Storage. Use the **Settings** tab to manage how the files get written.
articles/static-web-apps/custom-domain.md (1 addition, 1 deletion)
@@ -26,7 +26,7 @@ The following table includes links to articles that demonstrate how to configure
## About domains
- Setting up an apex domain is a common scenario to configure once your domain name is set up. Creating an apex domain is achieved by configuring an `ALIAS` or `ANAME` record or through `CNAME` flattening. Some domain registrars like GoDaddy and Google don't support these DNS records. If your domain registrar doesn't support the all the DNS records you need, consider using [Azure DNS to configure your domain](custom-domain-azure-dns.md).
+ Setting up an apex domain is a common scenario to configure once your domain name is set up. Creating an apex domain is achieved by configuring an `ALIAS` or `ANAME` record or through `CNAME` flattening. Some domain registrars like GoDaddy and Google don't support these DNS records. If your domain registrar doesn't support all the DNS records you need, consider using [Azure DNS to configure your domain](custom-domain-azure-dns.md).
The following are terms you'll encounter as you set up a custom domain.
articles/storage/blobs/data-lake-storage-best-practices.md (1 addition, 1 deletion)
@@ -161,7 +161,7 @@ Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lak
There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.
- For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.
+ For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server logs, which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.
Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.