articles/app-service/configure-authentication-provider-aad.md (+2 -2)
@@ -21,7 +21,7 @@ The App Service Authentication feature can automatically create an app registrat
> [!NOTE]
> The option to create a new registration is not available for government clouds. Instead, [define a registration separately](#advanced).

- ## <a name="express"></a>Create a new app registration automatically
+ ## <a name="express"></a>Option 1: Create a new app registration automatically

This option is designed to make enabling authentication simple and requires just a few clicks.
@@ -42,7 +42,7 @@ You're now ready to use the Microsoft Identity Platform for authentication in yo
For an example of configuring Azure AD login for a web app that accesses Azure Storage and Microsoft Graph, see [this tutorial](scenario-secure-app-authentication-app-service.md).

- ## <a name="advanced"></a>Use an existing registration created separately
+ ## <a name="advanced"></a>Option 2: Use an existing registration created separately

You can also manually register your application for the Microsoft Identity Platform, customizing the registration and configuring App Service Authentication with the registration details. This is useful, for example, if you want to use an app registration from a different Azure AD tenant than the one your application is in.
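As an illustration only, here's a minimal sketch of applying an existing registration's details from code instead of the portal, using the Azure SDK for Python (`azure-mgmt-web`). Every `<...>` value is a placeholder, and the sketch targets the classic (V1) authentication settings rather than the newer auth configuration:

```python
# Sketch only: point App Service Authentication at an existing app registration.
# Assumes azure-identity and azure-mgmt-web are installed; all <...> values
# (subscription, resource group, app, client ID, tenant ID) are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient
from azure.mgmt.web.models import SiteAuthSettings

client = WebSiteManagementClient(DefaultAzureCredential(), "<subscription-id>")

settings = SiteAuthSettings(
    enabled=True,
    default_provider="AzureActiveDirectory",
    unauthenticated_client_action="RedirectToLoginPage",
    client_id="<client-id-of-the-existing-registration>",
    issuer="https://sts.windows.net/<tenant-id>/",  # tenant of the registration
)

# Applies the V1 auth settings to the web app.
client.web_apps.update_auth_settings("<resource-group>", "<app-name>", settings)
```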
articles/azure-sql/database/recovery-using-backups.md (+3 -0)
@@ -64,6 +64,9 @@ When complete, the restore creates a new database on the same server as the orig
You generally restore a database to an earlier point for recovery purposes. You can treat the restored database as a replacement for the original database or use it as a data source to update the original database.

+ > [!IMPORTANT]
+ > You can only run a restore on the same server; cross-server restoration isn't supported by point-in-time restore.
+
**Database replacement**

If you intend the restored database to be a replacement for the original database, you should specify the original database's compute size and service tier. You can then rename the original database and give the restored database the original name by using the [ALTER DATABASE](/sql/t-sql/statements/alter-database-azure-sql-database) command in T-SQL.
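For a concrete picture of that rename swap, a minimal T-SQL sketch driven from Python with `pyodbc`; the server, database names, and credentials below are hypothetical:

```python
# Sketch only: swap the restored database into the original database's name.
# Assumes pyodbc and the ODBC Driver 18 for SQL Server are installed;
# the server, database names, and credentials are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=master;"  # rename runs in master
    "UID=<admin-user>;PWD=<password>",
    autocommit=True,  # ALTER DATABASE can't run inside a transaction
)
cursor = conn.cursor()

# Move the original database out of the way, then give the restored database
# the original name. Neither database can have active connections.
cursor.execute("ALTER DATABASE [MyDb] MODIFY NAME = [MyDb_old];")
cursor.execute("ALTER DATABASE [MyDb_restored] MODIFY NAME = [MyDb];")
conn.close()
```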
articles/storage/blobs/data-lake-storage-best-practices.md (+20 -80)
@@ -5,7 +5,7 @@ author: normesta
ms.subservice: data-lake-storage-gen2
ms.service: storage
ms.topic: conceptual
- ms.date: 09/28/2021
+ ms.date: 10/06/2021
ms.author: normesta
ms.reviewer: sachins
---
@@ -146,92 +146,32 @@ Start by reviewing the recommendations in the [Security recommendations for Blob
Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lake-storage-access-control-model.md) article for guidance that is specific to Data Lake Storage Gen2 enabled accounts. This article helps you understand how to use Azure role-based access control (Azure RBAC) roles together with access control lists (ACLs) to enforce security permissions on directories and files in your hierarchical file system.

- ## Ingest data
+ ## Ingest, process, and analyze

- This section highlights the different sources of data and the different ways in which that data can be ingested into a Data Lake Storage Gen2 account.
+ There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.

- #### Ad hoc data
+ For example, you can ingest large sets of data from HDInsight and Hadoop clusters, or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server logs, which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.

- Smaller data sets that are used for prototyping a big data application. Consider using any of these tools to ingest data:
+ Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.
+ The following table recommends tools that you can use to ingest, analyze, visualize, and download data. Use the links in this table to find guidance about how to configure and use each tool.

- #### Streamed data
+ | Purpose | Tools & Tool guidance |
+ |---|---|
+ | Ingest ad hoc data | Azure portal, [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/), [Apache DistCp](data-lake-storage-use-distcp.md), [AzCopy](../common/storage-use-azcopy-v10.md) |
+ | Ingest streaming data | [HDInsight Storm](../../hdinsight/storm/apache-storm-write-data-lake-store.md), [Azure Stream Analytics](../../stream-analytics/stream-analytics-quick-create-portal.md) |
+ | Ingest relational data | [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md) |
+ | Ingest web server logs | [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md)), [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md) |
+ | Ingest from HDInsight clusters | [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md), [Apache DistCp](data-lake-storage-use-distcp.md), [AzCopy](../common/storage-use-azcopy-v10.md) |
+ | Ingest from Hadoop clusters | [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md), [Apache DistCp](data-lake-storage-use-distcp.md), [WANdisco LiveData Migrator for Azure](migrate-gen2-wandisco-live-data-platform.md), [Azure Data Box](data-lake-storage-migrate-on-premises-hdfs-cluster.md) |
+ | Ingest large data sets (several terabytes) | [Azure ExpressRoute](../../expressroute/expressroute-introduction.md) |
+ | Process & analyze data | [Azure Synapse Analytics](../../synapse-analytics/get-started-analyze-storage.md), [Azure HDInsight](../../hdinsight/hdinsight-hadoop-use-data-lake-storage-gen2.md), [Databricks](/azure/databricks/scenarios/databricks-extract-load-sql-data-warehouse) |
+ | Visualize data | [Power BI](/power-query/connectors/datalakestorage), [Azure Data Lake Storage query acceleration](data-lake-storage-query-acceleration.md) |
+ | Download data | Azure portal, [PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md)), [Azure Storage Explorer](data-lake-storage-explorer.md), [AzCopy](../common/storage-use-azcopy-v10.md#transfer-data), [Azure Data Factory](../../data-factory/copy-activity-overview.md), [Apache DistCp](./data-lake-storage-use-distcp.md) |
- Generated by various sources such as applications, devices, and sensors. Tools used to ingest this type of data usually capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. Consider using any of these tools to ingest data:
-
- #### Relational data
-
- Relational databases collect a huge number of records, which can provide key insights if processed through a big data pipeline. We recommend that you use [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md) to ingest relational data.
-
- #### Web server log data
-
- Log files that contain information such as the history of page requests. Consider writing custom scripts or applications to upload this data so you'll have the flexibility to include your data uploading component as part of your larger big data application. Consider using these tools and SDKs:
-
- Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md))
- [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md)
-
- #### HDInsight clusters
-
- Most HDInsight cluster types (Hadoop, HBase, Storm) support Data Lake Storage Gen2 as a data storage repository. Consider using any of these tools to ingest data:
-
- [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md)
- [Apache DistCp](data-lake-storage-use-distcp.md)
- [AzCopy](../common/storage-use-azcopy-v10.md)
-
- #### Hadoop clusters
-
- These clusters might running on-premise or in the cloud. Consider using any of these tools to ingest data:
-
- [Azure Data Factory](../../data-factory/connector-azure-data-lake-store.md)
- [Apache DistCp](data-lake-storage-use-distcp.md)
- [WANdisco LiveData Migrator for Azure](migrate-gen2-wandisco-live-data-platform.md)
- [Azure Data Box](data-lake-storage-migrate-on-premises-hdfs-cluster.md)
-
- #### Large data sets
-
- For uploading datasets that range in several terabytes, using the methods described above can sometimes be slow and costly. In such cases, you can use Azure ExpressRoute.
-
- Azure ExpressRoute lets you create private connections between Azure data centers and infrastructure on your premises. This provides a reliable option for transferring large amounts of data. To learn more, see [Azure ExpressRoute documentation](../../expressroute/expressroute-introduction.md).
-
- ## Process and analyze data
-
- Once the data is available in Data Lake Storage Gen2 you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance. The following sections recommend tools that you can use to analyze, visualize, and download data.
- [Azure Data Lake Storage query acceleration](data-lake-storage-query-acceleration.md)
-
- #### Tools for downloading data
+ > [!NOTE]
+ > This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services and their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md).
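As a minimal illustration of the table's first and last rows, here's a sketch that ingests a local file into a Data Lake Storage Gen2 enabled account and then downloads it again, using the Python SDK linked above (`azure-storage-file-datalake`); the account, file system, and paths are placeholders:

```python
# Sketch only: ingest a local file into a Data Lake Storage Gen2 enabled
# account, then download it again. Assumes azure-identity and
# azure-storage-file-datalake are installed; account, file system, and
# paths below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account-name>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_system = service.get_file_system_client("raw")

# Ingest: upload a local log file into the hierarchical namespace.
file_client = file_system.get_file_client("logs/2021/10/06/web.log")
with open("web.log", "rb") as data:
    file_client.upload_data(data, overwrite=True)

# Download: read the file back to the local machine.
with open("web-copy.log", "wb") as out:
    out.write(file_client.download_file().readall())
```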
articles/storage/blobs/data-lake-storage-integrate-with-services-tutorials.md (+4 -4)
@@ -4,7 +4,7 @@ description: Find tutorials that help you learn how to use Azure services with A
author: normesta
ms.topic: conceptual
ms.author: normesta
- ms.date: 02/17/2021
+ ms.date: 10/06/2021
ms.service: storage
ms.subservice: data-lake-storage-gen2
---
@@ -32,8 +32,8 @@ This article contains links to tutorials that show you how to use various Azure
| Azure Cognitive Search |[Index and search Azure Data Lake Storage Gen2 documents (preview)](../../search/search-howto-index-azure-data-lake-storage.md)|

> [!NOTE]
- > This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services, their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md)
+ > This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services and their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md). To see how services are organized into categories such as ingest, download, process, and visualize, see [Ingest, process, and analyze](data-lake-storage-data-scenarios.md#ingest-process-and-analyze).

- ## Next steps
+ ## See also

- - Learn how these services can be used together to build workloads that ingest, download, process, and visualize data. See [Using Azure Data Lake Storage Gen2 for big data requirements](data-lake-storage-data-scenarios.md).
+ [Best practices for using Azure Data Lake Storage Gen2](data-lake-storage-best-practices.md)
@@ … @@
+ > To see how services are organized into categories such as ingest, download, process, and visualize, see [Ingest, process, and analyze](data-lake-storage-data-scenarios.md#ingest-process-and-analyze).

## See also

- [Known issues with Azure Data Lake Storage Gen2](data-lake-storage-known-issues.md)
- [Blob Storage feature support in Azure Storage accounts](storage-feature-support-in-storage-accounts.md)
- [Open source platforms that support Azure Data Lake Storage Gen2](data-lake-storage-supported-open-source-platforms.md)
- - [Multi-protocol access on Azure Data Lake Storage](data-lake-storage-multi-protocol-access.md)
+ - [Multi-protocol access on Azure Data Lake Storage](data-lake-storage-multi-protocol-access.md)
+ - [Best practices for using Azure Data Lake Storage Gen2](data-lake-storage-best-practices.md)