articles/storage/blobs/data-lake-storage-best-practices.md (+8 −5)
@@ -148,19 +148,19 @@ Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lak
 ## Ingest, process, and analyze
 
-There are many different sources of data and the different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.
+There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.
 
-You can also ingest large sets of data from HD Insight or Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications.
+For example, you can ingest large sets of data from HDInsight and Hadoop clusters, or you can ingest smaller sets of *ad hoc* data for prototyping applications.
 
 Streamed data is generated by various sources such as applications, devices, and sensors. You can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account.
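The batch-then-write pattern described above can be sketched as follows. This is a minimal illustration, not code from the article: the event shape, the batch size, and the `batch_events` helper are all made up for the example, and the actual write to the storage account is left as a commented placeholder.

```python
import json

def batch_events(events, batch_size=100):
    """Group captured events into fixed-size batches, one batch per write."""
    return [events[i:i + batch_size] for i in range(0, len(events), batch_size)]

# Illustrative sensor events; a real pipeline would receive these in real time.
events = [{"sensor": f"device-{n}", "reading": n * 1.5} for n in range(250)]

payloads = []
for batch in batch_events(events):
    # Serialize each batch as JSON lines; the upload call itself is not shown.
    payloads.append("\n".join(json.dumps(e) for e in batch))
    # append_to_account(payloads[-1])  # hypothetical upload call
```

Writing events in batches like this keeps the number of write operations against the account small relative to the event rate.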
 
 Web server logs contain information such as the history of page requests. Consider writing custom scripts or applications to upload web server logs so you'll have the flexibility to include your data uploading component as part of your larger big data application.
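A custom log-upload script of the kind suggested above might start like this sketch. It assumes Common Log Format access logs; the regex, field names, and `upload_parsed_logs` helper are illustrative inventions, and the upload step is stubbed out as a comment.

```python
import re

# Common Log Format, e.g.:
# 127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_log_line(line):
    """Return a dict of fields from one Common Log Format line, or None."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def upload_parsed_logs(lines):
    """Parse raw log lines and hand the valid records to an uploader (stubbed)."""
    records = [r for r in (parse_log_line(line) for line in lines) if r]
    # In a real script you would serialize `records` and upload them to the
    # account, for example with AzCopy or an Azure SDK client (not shown here).
    return records
```

Owning this parsing step is what gives you the flexibility the paragraph mentions: you can filter, enrich, or reshape log records before they ever reach the account.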
 
-Once the data is available in Data Lake Storage Gen2 you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.
+Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.
 
-The following table recommend tools that you can use to ingest, analyze, visualize, and download data. Use the links in this table to find guidance about how to configure and use each tool.
+The following table recommends tools that you can use to ingest, analyze, visualize, and download data. Use the links in this table to find guidance about how to configure and use each tool.
 
-| Purpose |Recommended tools|
+| Purpose |Tools & Tool guidance|
 |---|---|
 | Ingest ad hoc data| Azure portal, [Azure PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/), [Apache DistCp](data-lake-storage-use-distcp.md), [AzCopy](../common/storage-use-azcopy-v10.md)|
 | Ingest streaming data |[HDInsight Storm](../../hdinsight/storm/apache-storm-write-data-lake-store.md), [Azure Stream Analytics](../../stream-analytics/stream-analytics-quick-create-portal.md)|
@@ -173,6 +173,9 @@ The following table recommend tools that you can use to ingest, analyze, visuali
 | Visualize data |[Power BI](/power-query/connectors/datalakestorage), [Azure Data Lake Storage query acceleration](data-lake-storage-query-acceleration.md)|
 | Download data | Azure portal, [PowerShell](data-lake-storage-directory-file-acl-powershell.md), [Azure CLI](data-lake-storage-directory-file-acl-cli.md), [REST](/rest/api/storageservices/data-lake-storage-gen2), Azure SDKs ([.NET](data-lake-storage-directory-file-acl-dotnet.md), [Java](data-lake-storage-directory-file-acl-java.md), [Python](data-lake-storage-directory-file-acl-python.md), and [Node.js](data-lake-storage-directory-file-acl-javascript.md)), [Azure Storage Explorer](data-lake-storage-explorer.md), [AzCopy](../common/storage-use-azcopy-v10.md#transfer-data), [Azure Data Factory](../../data-factory/copy-activity-overview.md), [Apache DistCp](./data-lake-storage-use-distcp.md)|
 
+
+> [!NOTE]
+> This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services and their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md).
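As one concrete instance of the "Ingest ad hoc data" row, an upload could be driven from a script like the following sketch. The account, container, and path names are placeholders, and the AzCopy v10 `copy` command is only constructed and printed, not executed, since running it requires AzCopy to be installed and valid credentials.

```python
def build_azcopy_upload(local_path, account, filesystem, dest_dir):
    """Build an AzCopy v10 'copy' command targeting a Data Lake Storage Gen2
    (dfs) endpoint; returns the command as an argument list."""
    dest = f"https://{account}.dfs.core.windows.net/{filesystem}/{dest_dir}"
    return ["azcopy", "copy", local_path, dest]

# Placeholder names; substitute your own. Passing the list to subprocess.run
# would execute the upload (authentication flags omitted here).
cmd = build_azcopy_upload("./data/sample.csv", "mystorageaccount", "raw", "adhoc/")
print(" ".join(cmd))
```

Building the command as a list rather than a single string avoids shell-quoting issues if paths contain spaces.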
articles/storage/blobs/data-lake-storage-integrate-with-services-tutorials.md (+3 −3)
@@ -32,8 +32,8 @@ This article contains links to tutorials that show you how to use various Azure
 | Azure Cognitive Search |[Index and search Azure Data Lake Storage Gen2 documents (preview)](../../search/search-howto-index-azure-data-lake-storage.md)|
 
 > [!NOTE]
-> This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services, their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md)
+> This table doesn't reflect the complete list of Azure services that support Data Lake Storage Gen2. To see a list of supported Azure services and their level of support, see [Azure services that support Azure Data Lake Storage Gen2](data-lake-storage-supported-azure-services.md).
 
-## Next steps
+## Next Steps
 
-- Learn how these services can be used together to build workloads that ingest, download, process, and visualize data. See [Using Azure Data Lake Storage Gen2 for big data requirements](data-lake-storage-data-scenarios.md).
+- Learn how these services can be used together to build workloads that ingest, download, process, and visualize data. See [Ingest, process, and analyze](data-lake-storage-data-scenarios.md#ingest-process-and-analyze).
+> To see how these services can be used together to build workloads that ingest, download, process, and visualize data, see [Ingest, process, and analyze](data-lake-storage-data-scenarios.md#ingest-process-and-analyze).
+
 ## See also
 
-[Known issues with Azure Data Lake Storage Gen2](data-lake-storage-known-issues.md)