In this tutorial, you learn how to create an [Apache Hadoop](./hadoop/apache-hadoop-introduction.md) cluster, on demand, in Azure HDInsight using Azure Data Factory. You then use data pipelines in Azure Data Factory to run Hive jobs and delete the cluster. By the end of this tutorial, you learn how to operationalize a big data job run where cluster creation, job run, and cluster deletion are performed on a schedule.
This tutorial covers the following tasks:
## Create preliminary Azure objects
In this section, you create various objects that will be used for the HDInsight cluster you create on-demand. The created storage account will contain the sample [HiveQL](https://cwiki.apache.org/confluence/display/Hive/LanguageManual) script, `partitionweblogs.hql`, that you use to simulate a sample Apache Hive job that runs on the cluster.
This section uses an Azure PowerShell script to create the storage account and copy over the required files within the storage account. The Azure PowerShell sample script in this section performs the following tasks:
1. Signs in to Azure.
2. Creates an Azure resource group.
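These tasks could be sketched in Azure PowerShell roughly as follows. This is a hedged sketch, not the tutorial's actual script: the resource group name, region, and storage account name are placeholders, while the container name `adfgetstarted` and the blob path `hivescripts/partitionweblogs.hql` come from later in this tutorial.

```powershell
# Sign in to Azure (opens an interactive prompt).
Connect-AzAccount

# Create a resource group; name and region here are illustrative.
$resourceGroupName = "hdiadftutorialrg"
New-AzResourceGroup -Name $resourceGroupName -Location "East US"

# Create a storage account to hold the Hive script and cluster storage.
$storageAccount = New-AzStorageAccount -ResourceGroupName $resourceGroupName `
    -Name "hdiadfstorage" -Location "East US" -SkuName Standard_LRS

# Create the blob container and upload the sample HiveQL script into it.
$context = $storageAccount.Context
New-AzStorageContainer -Name "adfgetstarted" -Context $context
Set-AzStorageBlobContent -File ".\partitionweblogs.hql" `
    -Container "adfgetstarted" -Blob "hivescripts/partitionweblogs.hql" `
    -Context $context
```

The sample script the tutorial provides performs the same steps end to end; this sketch only shows the shape of each task.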
1. Select the resource group name you created in your PowerShell script. Use the filter if you have too many resource groups listed.
1. From the **Overview** view, you see one resource listed unless you share the resource group with other projects. That resource is the storage account with the name you specified earlier. Select the storage account name.
1. Select the **Containers** tile.
1. Select the **adfgetstarted** container. You see a folder called **hivescripts**.
1. Open the folder and make sure it contains the sample script file, **partitionweblogs.hql**.
## Understand the Azure Data Factory activity
1. An HDInsight Hadoop cluster is automatically created for you just-in-time to process the slice.
2. The input data is processed by running a HiveQL script on the cluster. In this tutorial, the HiveQL script associated with the Hive activity performs the following actions:
* Uses the existing table (*hivesampletable*) to create another table **HiveSampleOut**.
* Populates the **HiveSampleOut** table with only specific columns from the original *hivesampletable*.
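The two actions above could take roughly this shape in HiveQL. This is a hedged sketch, not the actual `partitionweblogs.hql` script; the column list is assumed from the standard HDInsight sample table and may differ from what the script selects:

```sql
-- Create HiveSampleOut from the existing hivesampletable,
-- keeping only a few of its columns (column names assumed).
CREATE TABLE IF NOT EXISTS HiveSampleOut AS
SELECT clientid, market, devicemodel, state
FROM hivesampletable;
```

A `CREATE TABLE ... AS SELECT` statement like this both creates the output table and populates it in a single step.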
1. Sign in to the [Azure portal](https://portal.azure.com/).
2. From the left menu, navigate to **+ Create a resource** > **Analytics** > **Data Factory**.

| Time to live | Provide the duration for which you want the HDInsight cluster to be available before being automatically deleted.|
| Service principal ID | Provide the application ID of the Azure Active Directory service principal you created as part of the prerequisites. |
| Service principal key | Provide the authentication key for the Azure Active Directory service principal. |
| Cluster name prefix | Provide a value that is prefixed to the names of all clusters created by the data factory. |
|Subscription |Select your subscription from the drop-down list.|
| Select resource group | Select the resource group you created as part of the PowerShell script you used earlier.|
| OS type/Cluster SSH user name | Enter an SSH user name, commonly `sshuser`. |
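Behind the portal form, these settings correspond to an on-demand HDInsight linked service definition. A rough JSON sketch follows; property names track the Data Factory `HDInsightOnDemand` linked service type, and every value is a placeholder rather than a value from this tutorial:

```json
{
    "name": "HDInsightLinkedService",
    "properties": {
        "type": "HDInsightOnDemand",
        "typeProperties": {
            "clusterType": "hadoop",
            "clusterSize": 4,
            "timeToLive": "00:15:00",
            "hostSubscriptionId": "<subscription-id>",
            "servicePrincipalId": "<service-principal-application-id>",
            "servicePrincipalKey": { "type": "SecureString", "value": "<key>" },
            "tenant": "<tenant-id>",
            "clusterResourceGroup": "<resource-group-name>",
            "clusterNamePrefix": "<prefix>",
            "osType": "Linux",
            "clusterSshUserName": "sshuser",
            "linkedServiceName": {
                "referenceName": "<storage-linked-service-name>",
                "type": "LinkedServiceReference"
            }
        }
    }
}
```

Filling in the portal form generates a definition of this shape for you; you don't need to author the JSON by hand.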

1. With the Hive activity selected, select the **HDI Cluster** tab. From the **HDInsight Linked Service** drop-down list, select the linked service you created earlier, **HDInsightLinkedService**.


1. Under **Advanced** > **Parameters**, select **Auto-fill from script**. This option looks for any parameters in the Hive script that require values at runtime.
1. In the **value** text box, add the existing folder in the format `wasbs://adfgetstarted@<StorageAccount>.blob.core.windows.net/outputfolder/`. The path is case-sensitive. This path is where the output of the script is stored. The `wasbs` URI scheme is necessary because storage accounts now have secure transfer required enabled by default.
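The `wasbs` output path has a fixed shape, which a small helper can make explicit. This is an illustrative sketch; `build_wasbs_path` is a hypothetical function, not part of the tutorial:

```python
def build_wasbs_path(container: str, storage_account: str, folder: str) -> str:
    """Build a secure-transfer (wasbs) blob path for a Hive output folder."""
    # wasbs:// rather than wasb:// because secure transfer is required
    # by default on new storage accounts.
    return (
        f"wasbs://{container}@{storage_account}"
        f".blob.core.windows.net/{folder.strip('/')}/"
    )

print(build_wasbs_path("adfgetstarted", "mystorageaccount", "outputfolder"))
# wasbs://adfgetstarted@mystorageaccount.blob.core.windows.net/outputfolder/
```

Note that the container (`adfgetstarted`) and folder segments are case-sensitive, so build them exactly as they appear in your storage account.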

## Clean up resources
With on-demand HDInsight cluster creation, you don't need to explicitly delete the HDInsight cluster. The cluster is deleted based on the configuration you provided when you created the pipeline. Even after the cluster is deleted, the storage accounts associated with the cluster continue to exist. This behavior is by design so that you can keep your data intact. If you don't want to persist the data, you can delete the storage account you created.
Alternatively, you can delete the entire resource group that you created for this tutorial. Deleting the resource group also deletes the storage account and the Azure Data Factory that you created.
### Delete the resource group
1. On the **Resources** tile, you see the default storage account and the data factory listed, unless you share the resource group with other projects.
1. Select **Delete resource group**. Doing so deletes the storage account and the data stored in the storage account.
1. Enter the resource group name to confirm deletion, and then select **Delete**.
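If you prefer to script the cleanup instead of using the portal, an Azure PowerShell equivalent looks roughly like this (the resource group name is a placeholder for the one you created earlier):

```powershell
# Deletes the resource group, along with the storage account and
# data factory inside it. -Force skips the confirmation prompt;
# omit it if you want to be prompted before deletion.
Remove-AzResourceGroup -Name "<your-resource-group-name>" -Force
```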
## Next steps
In this article, you learned how to use Azure Data Factory to create an on-demand HDInsight cluster and run [Apache Hive](https://hive.apache.org/) jobs. Advance to the next article to learn how to create HDInsight clusters with custom configuration.
> [!div class="nextstepaction"]
> [Create Azure HDInsight clusters with custom configuration](hdinsight-hadoop-provision-linux-clusters.md)