Commit 135294f

Some edits
1 parent 8c8d71e commit 135294f

1 file changed: +3 -1 lines changed

articles/storage/blobs/data-lake-storage-best-practices.md

Lines changed: 3 additions & 1 deletion
@@ -148,7 +148,9 @@ Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lak
 ## Ingest, process, and analyze

-There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account. For example, you can ingest large sets of data from HDInsight and Hadoop clusters or you can ingest smaller sets of *ad hoc* data for prototyping applications. Streamed data is generated by various sources such as applications, devices, and sensors. You can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. Web server logs contain information such as the history of page requests. Consider writing custom scripts or applications to upload web server logs so you'll have the flexibility to include your data uploading component as part of your larger big data application.
+There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.
+
+For example, you can ingest large sets of data from HDInsight and Hadoop clusters, or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server logs, which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload your logs so that you'll have the flexibility to include your data uploading component as part of your larger big data application.

 Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.