
Commit 9b6b619

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into rolyon-aadroles-custom-roles-assign-powershell-fix

2 parents: 8f91bcc + 91e1b60

6 files changed: +36 −88 lines

articles/app-service/configure-authentication-provider-aad.md

Lines changed: 2 additions & 2 deletions

@@ -21,7 +21,7 @@ The App Service Authentication feature can automatically create an app registrat
 > [!NOTE]
 > The option to create a new registration is not available for government clouds. Instead, [define a registration separately](#advanced).

-## <a name="express"> </a>Create a new app registration automatically
+## <a name="express"> </a> Option 1: Create a new app registration automatically

 This option is designed to make enabling authentication simple and requires just a few clicks.

@@ -42,7 +42,7 @@ You're now ready to use the Microsoft Identity Platform for authentication in yo

 For an example of configuring Azure AD login for a web app that accesses Azure Storage and Microsoft Graph, see [this tutorial](scenario-secure-app-authentication-app-service.md).

-## <a name="advanced"> </a>Use an existing registration created separately
+## <a name="advanced"> </a>Option 2: Use an existing registration created separately

 You can also manually register your application for the Microsoft Identity Platform, customizing the registration and configuring App Service Authentication with the registration details. This is useful, for example, if you want to use an app registration from a different Azure AD tenant than the one your application is in.
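For Option 2, a minimal Azure CLI sketch of pointing App Service Authentication at an existing registration (the resource group, app name, client ID, secret, and tenant ID are placeholders; the flags belong to the classic `az webapp auth update` command and may differ in newer CLI versions):

```bash
# Wire App Service Authentication to an existing Azure AD app registration.
# Placeholder values: myResourceGroup, myWebApp, <client-id>, <client-secret>, <tenant-id>.
az webapp auth update \
  --resource-group myResourceGroup \
  --name myWebApp \
  --enabled true \
  --action LoginWithAzureActiveDirectory \
  --aad-client-id <client-id> \
  --aad-client-secret <client-secret> \
  --aad-token-issuer-url "https://sts.windows.net/<tenant-id>/"
```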

articles/azure-sql/database/recovery-using-backups.md

Lines changed: 3 additions & 0 deletions

@@ -64,6 +64,9 @@ When complete, the restore creates a new database on the same server as the orig

 You generally restore a database to an earlier point for recovery purposes. You can treat the restored database as a replacement for the original database or use it as a data source to update the original database.

+> [!IMPORTANT]
+> You can only run restore on the same server; cross-server restore is not supported by point-in-time restore.
+
 - **Database replacement**

   If you intend the restored database to be a replacement for the original database, you should specify the original database's compute size and service tier. You can then rename the original database and give the restored database the original name by using the [ALTER DATABASE](/sql/t-sql/statements/alter-database-azure-sql-database) command in T-SQL.
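As an illustration of the same-server constraint in the note above, a minimal Azure CLI sketch of a point-in-time restore (the server, database, and timestamp values are placeholders):

```bash
# Restore mydb to a new database on the *same* logical server,
# as of a point in time within the backup retention window.
az sql db restore \
  --resource-group myResourceGroup \
  --server myserver \
  --name mydb \
  --dest-name mydb-restored \
  --time "2021-10-06T13:00:00Z" \
  --service-objective S0   # match the original database's compute size
```

After validating the restored copy, the rename-and-swap described above is done in T-SQL with `ALTER DATABASE ... MODIFY NAME`.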

articles/storage/blobs/data-lake-storage-best-practices.md

Lines changed: 20 additions & 80 deletions

@@ -5,7 +5,7 @@ author: normesta
 ms.subservice: data-lake-storage-gen2
 ms.service: storage
 ms.topic: conceptual
-ms.date: 09/28/2021
+ms.date: 10/06/2021
 ms.author: normesta
 ms.reviewer: sachins
 ---

@@ -146,92 +146,32 @@ Start by reviewing the recommendations in the [Security recommendations for Blob

 Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lake-storage-access-control-model.md) article for guidance that is specific to Data Lake Storage Gen2 enabled accounts. This article helps you understand how to use Azure role-based access control (Azure RBAC) roles together with access control lists (ACLs) to enforce security permissions on directories and files in your hierarchical file system.

-## Ingest data
+## Ingest, process, and analyze

-This section highlights the different sources of data and the different ways in which that data can be ingested into a Data Lake Storage Gen2 account.
+There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.

-#### Ad hoc data
+For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server logs, which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.

-Smaller data sets that are used for prototyping a big data application. Consider using any of these tools to ingest data:
+Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.

-- Azure portal
-- [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md)
-- [Azure CLI](data-lake-storage-directory-file-acl-cli.md)
-- [REST](/rest/api/storageservices/data-lake-storage-gen2)
-- [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/)
-- [Apache DistCp](data-lake-storage-use-distcp.md)
-- [AzCopy](../common/storage-use-azcopy-v10.md)
+The following table recommends tools that you can use to ingest, analyze, visualize, and download data. Use the links in this table to find guidance about how to configure and use each tool.

-#### Streamed data
+| Purpose | Tools & Tool guidance |
+|---|---|
+| Ingest ad hoc data | Azure portal, [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/), [Apache DistCp](data-lake-storage-use-distcp.md), [AzCopy](../common/storage-use-azcopy-v10.md) |
+| Ingest streaming data | [HDInsight Storm](../../hdinsight/storm/apache-storm-write-data-lake-store.md), [Azure Stream Analytics](../../stream-analytics/stream-analytics-quick-create-portal.md) |
+| Ingest relational data | [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md) |
+| Ingest web server logs | [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md)), [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md) |
+| Ingest from HDInsight clusters | [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md), [Apache DistCp](data-lake-storage-use-distcp.md), [AzCopy](../common/storage-use-azcopy-v10.md) |
+| Ingest from Hadoop clusters | [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md), [Apache DistCp](data-lake-storage-use-distcp.md), [WANdisco LiveData Migrator for Azure](migrate-gen2-wandisco-live-data-platform.md), [Azure Data Box](data-lake-storage-migrate-on-premises-hdfs-cluster.md) |
+| Ingest large data sets (several terabytes) | [Azure ExpressRoute](../../expressroute/expressroute-introduction.md) |
+| Process & analyze data | [Azure Synapse Analytics](../../synapse-analytics/get-started-analyze-storage.md), [Azure HDInsight](../../hdinsight/hdinsight-hadoop-use-data-lake-storage-gen2.md), [Databricks](/azure/databricks/scenarios/databricks-extract-load-sql-data-warehouse) |
+| Visualize data | [Power BI](/power-query/connectors/datalakestorage), [Azure Data Lake Storage query acceleration](data-lake-storage-query-acceleration.md) |
+| Download data | Azure portal, [PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md)), [Azure Storage Explorer](data-lake-storage-explorer.md), [AzCopy](../common/storage-use-azcopy-v10.md#transfer-data), [Azure Data Factory](../../data-factory/copy-activity-overview.md), [Apache DistCp](./data-lake-storage-use-distcp.md) |

-Generated by various sources such as applications, devices, and sensors. Tools used to ingest this type of data usually capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. Consider using any of these tools to ingest data:
-
-- [HDInsight Storm](../../hdinsight/storm/apache-storm-write-data-lake-store.md)
-- [Azure Stream Analytics](../../stream-analytics/stream-analytics-quick-create-portal.md)
-
-#### Relational data
-
-Relational databases collect a huge number of records, which can provide key insights if processed through a big data pipeline. We recommend that you use [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md) to ingest relational data.
-
-#### Web server log data
-
-Log files that contain information such as the history of page requests. Consider writing custom scripts or applications to upload this data so you'll have the flexibility to include your data uploading component as part of your larger big data application. Consider using these tools and SDKs:
-
-- [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md)
-- [Azure CLI](data-lake-storage-directory-file-acl-cli.md)
-- [REST](/rest/api/storageservices/data-lake-storage-gen2)
-- Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md))
-- [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md)
-
-#### HDInsight clusters
-
-Most HDInsight cluster types (Hadoop, HBase, Storm) support Data Lake Storage Gen2 as a data storage repository. Consider using any of these tools to ingest data:
-
-- [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md)
-- [Apache DistCp](data-lake-storage-use-distcp.md)
-- [AzCopy](../common/storage-use-azcopy-v10.md)
-
-#### Hadoop clusters
-
-These clusters might be running on-premises or in the cloud. Consider using any of these tools to ingest data:
-
-- [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md)
-- [Apache DistCp](data-lake-storage-use-distcp.md)
-- [WANdisco LiveData Migrator for Azure](migrate-gen2-wandisco-live-data-platform.md)
-- [Azure Data Box](data-lake-storage-migrate-on-premises-hdfs-cluster.md)
-
-#### Large data sets
-
-For uploading datasets that range in several terabytes, using the methods described above can sometimes be slow and costly. In such cases, you can use Azure ExpressRoute.
-
-Azure ExpressRoute lets you create private connections between Azure data centers and infrastructure on your premises. This provides a reliable option for transferring large amounts of data. To learn more, see [Azure ExpressRoute documentation](../../expressroute/expressroute-introduction.md).
-
-## Process and analyze data
-
-Once the data is available in Data Lake Storage Gen2 you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance. The following sections recommend tools that you can use to analyze, visualize, and download data.
-
-#### Tools for Analyzing data
-
-- [Azure Synapse Analytics](../../synapse-analytics/get-started-analyze-storage.md)
-- [Azure HDInsight](../../hdinsight/hdinsight-hadoop-use-data-lake-storage-gen2.md)
-- [Databricks](/azure/databricks/scenarios/databricks-extract-load-sql-data-warehouse)
-
-#### Tools for visualizing data
-
-- [Power BI](/power-query/connectors/datalakestorage)
-- [Azure Data Lake Storage query acceleration](data-lake-storage-query-acceleration.md)
-
-#### Tools for downloading data
+> [!NOTE]
+> This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services and their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md).

-- Azure portal
-- [PowerShell](data-lake-storage-directory-file-acl-powershell.md)
-- [Azure CLI](data-lake-storage-directory-file-acl-cli.md)
-- [REST](/rest/api/storageservices/data-lake-storage-gen2)
-- Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md))
-- [Azure Storage Explorer](data-lake-storage-explorer.md)
-- [AzCopy](../common/storage-use-azcopy-v10.md#transfer-data)
-- [Azure Data Factory](../../data-factory/copy-activity-overview.md)
-- [Apache DistCp](./data-lake-storage-use-distcp.md)

 ## Monitor telemetry
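To make the table's "Ingest ad hoc data" and "Download data" rows concrete, a minimal sketch using the Azure CLI and AzCopy (the account, file system, and path names are placeholders, and the commands assume the signed-in identity has appropriate RBAC or ACL permissions on the account):

```bash
# Upload a local log file into a Data Lake Storage Gen2 file system (ad hoc ingest).
az storage fs file upload \
  --account-name mydatalake \
  --file-system raw \
  --path logs/access.log \
  --source ./access.log \
  --auth-mode login

# Bulk-ingest an entire directory with AzCopy over the DFS endpoint.
azcopy copy "./logs" "https://mydatalake.dfs.core.windows.net/raw/logs" --recursive

# Download a file back to the local machine.
az storage fs file download \
  --account-name mydatalake \
  --file-system raw \
  --path logs/access.log \
  --destination ./access-copy.log \
  --auth-mode login
```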

articles/storage/blobs/data-lake-storage-integrate-with-services-tutorials.md

Lines changed: 4 additions & 4 deletions

@@ -4,7 +4,7 @@ description: Find tutorials that help you learn how to use Azure services with A
 author: normesta
 ms.topic: conceptual
 ms.author: normesta
-ms.date: 02/17/2021
+ms.date: 10/06/2021
 ms.service: storage
 ms.subservice: data-lake-storage-gen2
 ---

@@ -32,8 +32,8 @@ This article contains links to tutorials that show you how to use various Azure
 | Azure Cognitive Search | [Index and search Azure Data Lake Storage Gen2 documents (preview)](../../search/search-howto-index-azure-data-lake-storage.md) |

 > [!NOTE]
-> This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services, their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md)
+> This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services and their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md). To see how services are organized into categories such as ingest, download, process, and visualize, see [Ingest, process, and analyze](data-lake-storage-data-scenarios.md#ingest-process-and-analyze).

-## Next steps
+## See also

-- Learn how these services can be used together to build workloads that ingest, download, process, and visualize data. See [Using Azure Data Lake Storage Gen2 for big data requirements](data-lake-storage-data-scenarios.md).
+[Best practices for using Azure Data Lake Storage Gen2](data-lake-storage-best-practices.md)

articles/storage/blobs/data-lake-storage-introduction.md

Lines changed: 1 addition & 0 deletions

@@ -79,5 +79,6 @@ Several open source platforms support Data Lake Storage Gen2. For a complete lis

 ## See also

+- [Best practices for using Azure Data Lake Storage Gen2](data-lake-storage-best-practices.md)
 - [Known issues with Azure Data Lake Storage Gen2](data-lake-storage-known-issues.md)
 - [Multi-protocol access on Azure Data Lake Storage](data-lake-storage-multi-protocol-access.md)

articles/storage/blobs/data-lake-storage-supported-azure-services.md

Lines changed: 6 additions & 2 deletions

@@ -5,7 +5,7 @@ author: normesta
 ms.subservice: data-lake-storage-gen2
 ms.service: storage
 ms.topic: conceptual
-ms.date: 02/17/2021
+ms.date: 10/06/2021
 ms.author: normesta
 ms.reviewer: stewu
 ---

@@ -41,9 +41,13 @@ This table lists the Azure services that you can use with Azure Data Lake Storag
 |Azure Content Delivery Network|Not yet supported|Not applicable|Not applicable|[Index and search Azure Data Lake Storage Gen2 documents (preview)](../../cdn/cdn-overview.md)|
 |Azure SQL Database|Not yet supported|Not applicable|Not applicable|[What is Azure SQL Database?](../../azure-sql/database/sql-database-paas-overview.md)|

+> [!TIP]
+> To see how services are organized into categories such as ingest, download, process, and visualize, see [Ingest, process, and analyze](data-lake-storage-data-scenarios.md#ingest-process-and-analyze).
+
 ## See also

 - [Known issues with Azure Data Lake Storage Gen2](data-lake-storage-known-issues.md)
 - [Blob Storage feature support in Azure Storage accounts](storage-feature-support-in-storage-accounts.md)
 - [Open source platforms that support Azure Data Lake Storage Gen2](data-lake-storage-supported-open-source-platforms.md)
-- [Multi-protocol access on Azure Data Lake Storage](data-lake-storage-multi-protocol-access.md)
+- [Multi-protocol access on Azure Data Lake Storage](data-lake-storage-multi-protocol-access.md)
+- [Best practices for using Azure Data Lake Storage Gen2](data-lake-storage-best-practices.md)
