Commit 3afd960

Merge pull request #98310 from matthewtorr-msft/patch-1

Update data-lake-storage-best-practices.md

2 parents: 76d5884 + 5974830

1 file changed (+1, -1)

articles/storage/blobs/data-lake-storage-best-practices.md

Lines changed: 1 addition & 1 deletion
@@ -161,7 +161,7 @@ Then, review the [Access control model in Azure Data Lake Storage Gen2](data-lak
 
 There are many different sources of data and different ways in which that data can be ingested into a Data Lake Storage Gen2 enabled account.
 
-For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.
+For example, you can ingest large sets of data from HDInsight and Hadoop clusters or smaller sets of *ad hoc* data for prototyping applications. You can ingest streamed data that is generated by various sources such as applications, devices, and sensors. For this type of data, you can use tools to capture and process the data on an event-by-event basis in real time, and then write the events in batches into your account. You can also ingest web server logs, which contain information such as the history of page requests. For log data, consider writing custom scripts or applications to upload them so that you'll have the flexibility to include your data uploading component as part of your larger big data application.
 
 Once the data is available in your account, you can run analysis on that data, create visualizations, and even download data to your local machine or to other repositories such as an Azure SQL database or SQL Server instance.
 
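The changed paragraph suggests writing custom scripts or applications to upload log data. For reference, a minimal sketch of such an upload using the Python azure-storage-file-datalake SDK might look like the following; the account URL, filesystem name, and log paths are hypothetical placeholders, not values taken from the commit.

```python
# Minimal sketch: upload a web server log file to a Data Lake Storage
# Gen2 enabled account. Assumes the azure-storage-file-datalake and
# azure-identity packages are installed; names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Authenticate against the account's Data Lake Storage Gen2 endpoint.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

# A filesystem in Data Lake Storage Gen2 corresponds to a blob container.
filesystem = service.get_file_system_client(file_system="raw-logs")

# Upload one log file, overwriting any earlier upload of the same path.
file_client = filesystem.get_file_client("webserver/2024/01/01/access.log")
with open("access.log", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```

A script like this can be scheduled or embedded in a larger ingestion pipeline, which is the flexibility the paragraph refers to.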
