Commit 2f25ebb

Add missing link navigation
1 parent a3f25ff commit 2f25ebb

1 file changed, +1 −1 lines changed


articles/hdinsight/hadoop/apache-hadoop-using-apache-hive-as-an-etl-tool.md

Lines changed: 1 addition & 1 deletion
@@ -76,7 +76,7 @@ The ETL model is typically used when you want to:
 
 * Load stream data or large volumes of semi-structured or unstructured data from external sources into an existing database or information system.
 * Clean, transform, and validate the data before loading it, perhaps by using more than one transformation pass through the cluster.
-* Generate reports and visualizations that are regularly updated. For example, if the report takes too long to generate during the day, you can schedule the report to run at night. To automatically run a Hive query, you can use [Azure Logic Apps](../logic-apps/logic-apps-overview.md) and PowerShell.
+* Generate reports and visualizations that are regularly updated. For example, if the report takes too long to generate during the day, you can schedule the report to run at night. To automatically run a Hive query, you can use [Azure Logic Apps](../../logic-apps/logic-apps-overview.md) and PowerShell.
 
 If the target for the data isn't a database, you can generate a file in the appropriate format within the query, for example a CSV. This file can then be imported into Excel or Power BI.
