
Commit 72e4044

Fixed Broken URLs
1 parent c958fdd commit 72e4044

File tree

4 files changed: +12 −12 lines changed


articles/hdinsight/TOC.yml

Lines changed: 3 additions & 3 deletions
@@ -763,11 +763,11 @@
   - name: Use Python with Apache Hive and Apache Pig
     href: ./hadoop/python-udf-hdinsight.md
   - name: HWC integration with Apache Spark and Apache Hive
-    href: ./interactive-query/hive-warehouse-connector.md
+    href: ./interactive-query/apache-hive-warehouse-connector.md
   - name: HWC and Apache Spark operations
-    href: ./interactive-query/hive-warehouse-connector-operations.md
+    href: ./interactive-query/apache-hive-warehouse-connector-operations.md
   - name: HWC integration with Apache Zeppelin
-    href: ./interactive-query/hive-warehouse-connector-zeppelin.md
+    href: ./interactive-query/apache-hive-warehouse-connector-zeppelin.md
   - name: Apache Hive with Hadoop
     href: ./hadoop/hdinsight-use-hive.md
   - name: Use the Apache Hive View

articles/hdinsight/interactive-query/hive-warehouse-connector-operations.md renamed to articles/hdinsight/interactive-query/apache-hive-warehouse-connector-operations.md

Lines changed: 3 additions & 3 deletions
@@ -15,7 +15,7 @@ This article shows spark-based operations supported by Hive Warehouse Connector

 ## Prerequisite

-Complete the [Hive Warehouse Connector setup](./hive-warehouse-connector.md#hive-warehouse-connector-setup) steps.
+Complete the [Hive Warehouse Connector setup](./apache-hive-warehouse-connector.md#hive-warehouse-connector-setup) steps.

 ## Getting started

@@ -137,6 +137,6 @@ Use **Ctrl + C** to stop netcat on the second SSH session. Use `:q` to exit spar

 ## Next steps

-* [HWC integration with Apache Spark and Apache Hive](./hive-warehouse-connector.md)
+* [HWC integration with Apache Spark and Apache Hive](./apache-hive-warehouse-connector.md)
 * [Use Interactive Query with HDInsight](./apache-interactive-query-get-started.md)
-* [HWC integration with Apache Zeppelin](./interactive-query/hive-warehouse-connector-zeppelin.md)
+* [HWC integration with Apache Zeppelin](./apache-hive-warehouse-connector-zeppelin.md)

articles/hdinsight/interactive-query/hive-warehouse-connector-zeppelin.md renamed to articles/hdinsight/interactive-query/apache-hive-warehouse-connector-zeppelin.md

Lines changed: 3 additions & 3 deletions
@@ -15,7 +15,7 @@ HDInsight Spark clusters include Apache Zeppelin notebooks with different interp

 ## Prerequisite

-Complete the [Hive Warehouse Connector setup](hive-warehouse-connector.md#hive-warehouse-connector-setup) steps.
+Complete the [Hive Warehouse Connector setup](apache-hive-warehouse-connector.md#hive-warehouse-connector-setup) steps.

 ## Getting started

@@ -129,6 +129,6 @@ hive.executeQuery("select * from testers").show()

 ## Next steps

-* [HWC and Apache Spark operations](./hive-warehouse-connector-operations.md)
-* [HWC integration with Apache Spark and Apache Hive](./hive-warehouse-connector.md)
+* [HWC and Apache Spark operations](./apache-hive-warehouse-connector-operations.md)
+* [HWC integration with Apache Spark and Apache Hive](./apache-hive-warehouse-connector.md)
 * [Use Interactive Query with HDInsight](./apache-interactive-query-get-started.md)

articles/hdinsight/interactive-query/hive-warehouse-connector.md renamed to articles/hdinsight/interactive-query/apache-hive-warehouse-connector.md

Lines changed: 3 additions & 3 deletions
@@ -101,7 +101,7 @@ You can choose between a few different methods to connect to your Interactive Qu

 * [Spark-shell / PySpark](../spark/apache-spark-shell.md)
 * [Spark-submit](#spark-submit)
-* [Zeppelin](./hive-warehouse-connector-zeppelin.md)
+* [Zeppelin](./apache-hive-warehouse-connector-zeppelin.md)


 Below are some examples to connect to HWC from Spark.
@@ -211,7 +211,7 @@ kinit USERNAME

 ## Next steps

-* [HWC and Apache Spark operations](./hive-warehouse-connector-operations.md)
+* [HWC and Apache Spark operations](./apache-hive-warehouse-connector-operations.md)
 * [Use Interactive Query with HDInsight](./apache-interactive-query-get-started.md)
-* [HWC integration with Apache Zeppelin](./interactive-query/hive-warehouse-connector-zeppelin.md)
+* [HWC integration with Apache Zeppelin](./apache-hive-warehouse-connector-zeppelin.md)
 * [Examples of interacting with Hive Warehouse Connector using Zeppelin, Livy, spark-submit, and pyspark](https://community.hortonworks.com/articles/223626/integrating-apache-hive-with-apache-spark-hive-war.html)
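The commit above renames files and rewrites every relative link that pointed at the old names. As an illustrative aside (not part of the commit), broken relative links like these can be caught mechanically before merge; the sketch below scans a docs tree for Markdown links whose relative targets do not exist on disk. The function name and the regex are assumptions for illustration, not part of any docs toolchain.

```python
import re
from pathlib import Path

# Matches [text](target) and [text](target#anchor); captures the path part only.
# Pure-anchor links like (#spark-submit) are intentionally not matched.
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#]+)(?:#[^)]*)?\)")

def broken_relative_links(docs_root):
    """Return (source_file, target) pairs for relative links that do not resolve."""
    broken = []
    for md in Path(docs_root).rglob("*.md"):
        for target in LINK_RE.findall(md.read_text(encoding="utf-8")):
            if target.startswith(("http://", "https://", "mailto:")):
                continue  # external links are out of scope for this check
            # Relative targets resolve against the linking file's directory.
            if not (md.parent / target).resolve().exists():
                broken.append((str(md), target))
    return broken
```

Running this over `articles/hdinsight` before the rename lands would have flagged each `hive-warehouse-connector*.md` reference that this commit fixes.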
