# Tutorial: Create an end-to-end data pipeline to derive sales insights in Azure HDInsight
## Prerequisites
* Azure CLI - at least version 2.2.0. See [Install the Azure CLI](https://docs.microsoft.com/cli/azure/install-azure-cli).
* jq, a command-line JSON processor. See [https://stedolan.github.io/jq/](https://stedolan.github.io/jq/).
* A member of the [Azure built-in role - owner](../role-based-access-control/built-in-roles.md).
* If using PowerShell to trigger the Data Factory pipeline, you'll need the [Az Module](https://docs.microsoft.com/powershell/azure/overview).
* [Power BI Desktop](https://aka.ms/pbiSingleInstaller) to visualize business insights at the end of this tutorial.
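jq is used by the tutorial's scripts to pull values out of the JSON that the Azure CLI returns. As a quick smoke test that it's installed, you can run a one-liner like the following (the sample JSON here is purely illustrative, not output from any real deployment):

```shell
# Extract a nested value from sample JSON with jq's -r (raw output) flag
echo '{"properties":{"outputs":{"clusterName":{"value":"demo-cluster"}}}}' \
  | jq -r '.properties.outputs.clusterName.value'
# Prints: demo-cluster
```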
## Create resources
### Clone the repository with scripts and data
1. Log in to your Azure subscription. If you plan to use Azure Cloud Shell, select **Try it** in the upper-right corner of the code block. Otherwise, enter the command below:
    ```azurecli-interactive
    az login

    # If you have multiple subscriptions, set the one to use
    # az account set --subscription "SUBSCRIPTIONID"
    ```
1. Ensure you're a member of the Azure role [owner](../role-based-access-control/built-in-roles.md). Replace `[email protected]` with your account and then enter the command:
If no record is returned, you aren't a member and won't be able to complete this tutorial.
1. Download the data and scripts for this tutorial from the [HDInsight sales insights ETL repository](https://github.com/Azure-Samples/hdinsight-sales-insights-etl). Enter the following command:
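    Assuming the repository URL above, cloning the repository and switching into its directory might look like this (the directory name is inferred from the repository URL):

    ```shell
    # Clone the tutorial's scripts and sample data, then enter the repo
    git clone https://github.com/Azure-Samples/hdinsight-sales-insights-etl.git
    cd hdinsight-sales-insights-etl
    ```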
If you're not sure which region to specify, you can retrieve a list of supported regions for your subscription with the [az account list-locations](https://docs.microsoft.com/cli/azure/account?view=azure-cli-latest#az-account-list-locations) command.
The command will deploy the following resources:
* An Azure Blob storage account. This account will hold the company sales data.
Cluster creation can take around 20 minutes.
The default password for SSH access to the clusters is `Thisisapassword1`. If you want to change the password, go to the `./templates/resourcesparameters_remainder.json` file and change the password for the `sparksshPassword`, `sparkClusterLoginPassword`, `llapClusterLoginPassword`, and `llapsshPassword` parameters.
### Verify deployment and collect resource information
1. If you want to check the status of your deployment, go to the resource group on the Azure portal. Under **Settings**, select **Deployments**, then your deployment. Here you can see the resources that have successfully deployed and the resources that are still in progress.
1. To view the names of the clusters, enter the following command:
This data factory will have one pipeline with two activities:
* The first activity will copy the data from Azure Blob storage to the Data Lake Storage Gen 2 storage account to mimic data ingestion.
* The second activity will transform the data in the Spark cluster. The script transforms the data by removing unwanted columns. It also appends a new column that calculates the revenue that a single transaction generates.
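The transformation itself runs as a Spark job on the cluster, but the revenue derivation can be illustrated on a tiny CSV with standard tools. The column names below are hypothetical, not the sample data's actual schema:

```shell
# Illustrative only: append a derived revenue column (= unit_price * quantity)
cat > /tmp/sales_sample.csv <<'EOF'
product,unit_price,quantity
widget,2.50,4
gadget,10.00,3
EOF
awk -F, 'NR==1 {print $0 ",revenue"; next} {printf "%s,%.2f\n", $0, $2*$3}' /tmp/sales_sample.csv
# Prints the input rows with a trailing revenue column, e.g. widget,2.50,4,10.00
```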
To set up your Azure Data Factory pipeline, execute the command below. You should still be in the `hdinsight-sales-insights-etl` directory.
This script does the following things:
1. Obtains storage keys for the Data Lake Storage Gen2 and Blob storage accounts.
1. Creates another resource deployment to create an Azure Data Factory pipeline, with its associated linked services and activities. It passes the storage keys as parameters to the template file so that the linked services can access the storage accounts correctly.
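As a sketch of how a storage key can be passed along to a template deployment, a script along these lines might splice the key into a parameters document with jq before invoking the CLI. The file layout and parameter name here are hypothetical, not the tutorial script's actual contents:

```shell
# Hypothetical: inject a storage key into an ARM-style parameters document
storageKey="example-key-for-illustration"
echo '{"parameters":{"storageAccountKey":{"value":""}}}' \
  | jq --arg k "$storageKey" '.parameters.storageAccountKey.value = $k'
```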
## Run the data pipeline
### Trigger the Data Factory activities
The first activity in the Data Factory pipeline that you've created moves the data from Blob storage to Data Lake Storage Gen2. The second activity applies the Spark transformations on the data and saves the transformed .csv files to a new location. The entire pipeline might take a few minutes to finish.
To retrieve the Data Factory name, enter the following command:
* Trigger the Data Factory pipeline in PowerShell. Replace `RESOURCEGROUP` and `DataFactoryName` with the appropriate values, then run the following commands:
Re-execute `Get-AzDataFactoryV2PipelineRun` as needed to monitor progress.
Or
* Open the data factory and select **Author & Monitor**. Trigger the `IngestAndTransform` pipeline from the portal. For information on triggering pipelines through the portal, see [Create on-demand Apache Hadoop clusters in HDInsight using Azure Data Factory](hdinsight-hadoop-create-linux-clusters-adf.md#trigger-a-pipeline).
To verify that the pipeline has run, you can take either of the following steps:
* Go to the **Monitor** section in your data factory through the portal.
* In Azure Storage Explorer, go to your Data Lake Storage Gen 2 storage account. Go to the `files` file system, and then go to the `transformed` folder and check its contents to see if the pipeline succeeded.
### Create a table on the Interactive Query cluster to view data on Power BI
1. Copy the `query.hql` file to the LLAP cluster by using SCP. Enter the command:
Reminder: The default password is `Thisisapassword1`.
1. Use SSH to access the LLAP cluster. Enter the command:
This script will create a managed table on the Interactive Query cluster that you can access from Power BI.
### Create a Power BI dashboard from sales data
1. Open Power BI Desktop.
1. From the menu, navigate to **Get data** > **More...** > **Azure** > **HDInsight Interactive Query**.
1. Select **Connect**.
1. From the **HDInsight Interactive Query** dialog:
    1. In the **Server** text box, enter the name of your LLAP cluster in the format of `https://LLAPCLUSTERNAME.azurehdinsight.net`.
    1. In the **database** text box, enter `default`.
    1. Select **OK**.
1. From the **AzureHive** dialog:
    1. In the **User name** text box, enter `admin`.
    1. In the **Password** text box, enter `Thisisapassword1`.
    1. Select **Connect**.
1. From **Navigator**, select `sales` and/or `sales_raw` to preview the data. After the data is loaded, you can experiment with the dashboard that you want to create. See the following links to get started with Power BI dashboards:
* [Introduction to dashboards for Power BI designers](https://docs.microsoft.com/power-bi/service-dashboards)
* [Tutorial: Get started with the Power BI service](https://docs.microsoft.com/power-bi/service-get-started)
## Clean up resources
If you're not going to continue to use this application, delete all resources by using the following command so that you aren't charged for them.
1. To remove the resource group, enter the command:
    ```azurecli
    az group delete -n $resourceGroup
    ```
1. To remove the service principal, enter the commands: