# Process and migrate captured Event Hubs data to a SQL Data Warehouse using Event Grid and Azure Functions
Event Hubs [Capture](https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-capture-overview) is the easiest way to automatically deliver streamed data in Event Hubs to an Azure Blob storage or Azure Data Lake store. You can subsequently process and deliver the data to any other storage destination of your choice, such as SQL Data Warehouse or Cosmos DB. In this tutorial, you learn how to capture data from your event hub into a SQL data warehouse by using an Azure function triggered by [Event Grid](https://docs.microsoft.com/azure/event-grid/overview).
* First, you create an event hub with the **Capture** feature enabled and set an Azure blob storage as the destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically captured into Azure Storage as Avro files.
* Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination.
* Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event Grid notifies the Azure Function with the blob URI. The Function then migrates data from the blob to a SQL data warehouse.
In this tutorial, you do the following actions:
> [!div class="checklist"]
> * Deploy the infrastructure
> * Publish code to a Functions App
> * Create an Event Grid subscription from the Functions app
> * Stream sample data into your event hub
> * Verify captured data in SQL Data Warehouse
- *FunctionDWDumper* – An Azure Function that receives an Event Grid notification when an Avro file is captured to the Azure Storage blob. It receives the blob's URI path, reads its contents, and pushes this data to a SQL Data Warehouse.
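The notification that drives *FunctionDWDumper* is a small JSON event carrying the captured blob's URI. As a rough, standalone sketch of that first step (this is not the sample's actual C# function; the event shape and the `fileUrl` field name are my assumptions about Event Grid's `Microsoft.EventHub.CaptureFileCreated` schema), extracting the URI might look like:

```python
import json

def capture_file_url(event_json: str) -> str:
    """Pull the captured Avro file's URL out of an Event Grid notification.

    Field names follow the Microsoft.EventHub.CaptureFileCreated event schema
    as I recall it; verify them against the Event Grid documentation.
    """
    events = json.loads(event_json)
    # Event Grid can deliver events in batches, so normalize to a list.
    if isinstance(events, dict):
        events = [events]
    for event in events:
        if event.get("eventType") == "Microsoft.EventHub.CaptureFileCreated":
            return event["data"]["fileUrl"]
    raise ValueError("no CaptureFileCreated event in payload")

# A hypothetical notification, shaped like what Event Grid would deliver.
sample = json.dumps([{
    "eventType": "Microsoft.EventHub.CaptureFileCreated",
    "subject": "eventhubs/hubdatamigration",
    "data": {
        "fileUrl": "https://examplestorage.blob.core.windows.net/windturbinecapture/demo.avro",
        "eventCount": 100,
    },
}])

print(capture_file_url(sample))
```

The real function would then download that blob, decode the Avro records, and bulk-insert them into the warehouse table.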
### Deploy the infrastructure
Deploy the infrastructure needed for this tutorial with Azure PowerShell or the Azure CLI by using this [Azure Resource Manager template](https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/event-grid/EventHubsDataMigration.json). This template creates the following resources:
- Event Hub with the Capture feature enabled
- Storage account for the captured event data
The following sections provide Azure CLI and Azure PowerShell commands for deploying the infrastructure required for the tutorial. Update names of the following objects before running the commands:
- Azure resource group
- Region for the resource group
- Event Hubs namespace
- Event hub
- Azure SQL server
- Azure Storage
- Azure Functions App
These scripts take some time to create all the Azure artifacts. Wait until the script completes before proceeding further. If the deployment fails for some reason, delete the resource group, fix the reported issue, and rerun the command.
#### Azure CLI
To deploy the template using Azure CLI, use the following commands:
```azurecli-interactive
az group create -l westus -n rgDataMigrationSample
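# The template-deployment step is truncated in this excerpt. A hedged sketch of
# what it likely looks like (the command shape is standard Azure CLI; the
# template may expect parameters not shown here):
az deployment group create \
  --resource-group rgDataMigrationSample \
  --template-uri https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/event-grid/EventHubsDataMigration.json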
```
Create a table in your SQL data warehouse by running the [CreateDataWarehouseTable.sql](https://github.com/Azure/azure-event-hubs/blob/master/samples/e2e/EventHubsCaptureEventGridDemo/scripts/CreateDataWarehouseTable.sql) script using [Visual Studio](../sql-data-warehouse/sql-data-warehouse-query-visual-studio.md), [SQL Server Management Studio](../sql-data-warehouse/sql-data-warehouse-query-ssms.md), or the Query Editor in the portal.
```sql
CREATE TABLE [dbo].[Fact_WindTurbineMetrics] (
    -- (column definitions omitted in this excerpt)
)
WITH (CLUSTERED COLUMNSTORE INDEX, DISTRIBUTION = ROUND_ROBIN);
```
You have now set up your Event Hub, SQL data warehouse, Azure Function App, and Event Grid subscription. After updating the connection string and the name of your event hub in the source code, you can run WindTurbineDataGenerator.exe to generate data streams to the Event Hub.
1. In the portal, select your event hub namespace. Select **Connection Strings**.
4. Go back to your Visual Studio project. In the *WindTurbineDataGenerator* project, open *program.cs*.
5. Update the values for **EventHubConnectionString** and **EventHubName** with the connection string and the name of your event hub.
6. Build the solution, then run the WindTurbineGenerator.exe application.
## Verify captured data in data warehouse
After a couple of minutes, query the table in your SQL data warehouse. You observe that the data generated by WindTurbineDataGenerator has been streamed to your Event Hub, captured into an Azure Storage container, and then migrated into the SQL Data Warehouse table by the Azure function.
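For example, a quick ad-hoc check (the table name comes from the CreateDataWarehouseTable.sql script shown earlier; the queries themselves are just illustrative):

```sql
-- Count the migrated rows, then sample a few to eyeball the columns.
SELECT COUNT(*) AS MigratedRows FROM [dbo].[Fact_WindTurbineMetrics];

SELECT TOP 10 * FROM [dbo].[Fact_WindTurbineMetrics];
```

If the count stays at zero, check the Azure function's logs for errors before re-running the generator.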
## Next steps
You can use powerful data visualization tools with your data warehouse to achieve actionable insights.
This article shows you how to use [Power BI with SQL Data Warehouse](https://docs.microsoft.com/azure/sql-data-warehouse/sql-data-warehouse-integrate-power-bi).
Now you are all set to plug in the visualizations you need to deliver valuable business insights to your management.