Commit 7769d8a

Merge pull request #296087 from abhishjain002/patch-2
Added HDI 5.1 support for HWC
2 parents: 4a50d98 + 0bb889f

File tree

1 file changed (+4 −3 lines)

articles/hdinsight/interactive-query/apache-hive-warehouse-connector.md

Lines changed: 4 additions & 3 deletions

```diff
@@ -5,7 +5,7 @@ author: reachnijel
 ms.author: nijelsf
 ms.service: azure-hdinsight
 ms.topic: how-to
-ms.date: 01/02/2025
+ms.date: 03/11/2025
 ---
 
 # Integrate Apache Spark and Apache Hive with Hive Warehouse Connector in Azure HDInsight
@@ -57,6 +57,7 @@ Hive Warehouse Connector needs separate clusters for Spark and Interactive Query
 |:---:|:---:|---|
 | v1 | Spark 2.4 \| HDI 4.0 | Interactive Query 3.1 \| HDI 4.0 |
 | v2 | Spark 3.1 \| HDI 5.0 | Interactive Query 3.1 \| HDI 5.0 |
+| v2.1 | Spark 3.3.0 \| HDI 5.1 | Interactive Query 3.1 \| HDI 5.1 |
 
 ### Create clusters
 
@@ -170,9 +171,9 @@ This is a way to run Spark interactively through a modified version of the Scala
 
 ### Spark-submit
 
-Spark-submit is a utility to submit any Spark program (or job) to Spark clusters.
+`Spark-submit` is a utility to submit any Spark program (or job) to Spark clusters.
 
-The spark-submit job will set up and configure Spark and Hive Warehouse Connector as per our instructions, execute the program we pass to it, then cleanly release the resources that were being used.
+The `spark-submit` job will set up and configure Spark and Hive Warehouse Connector as per our instructions, execute the program we pass to it, then cleanly release the resources that were being used.
 
 Once you build the scala/java code along with the dependencies into an assembly jar, use the below command to launch a Spark application. Replace `<VERSION>`, and `<APP_JAR_PATH>` with the actual values.
```
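The changed doc's spark-submit passage describes launching an assembly jar against a cluster. As a hedged illustration only, the invocation might look like the sketch below; the `--class` name and jar path are hypothetical placeholders (not taken from the doc), while `<VERSION>` and `<APP_JAR_PATH>` are the doc's own placeholders and are left unfilled.

```shell
# Hypothetical sketch, not the doc's exact command.
# com.example.MyHwcApp and the connector jar path are assumed values;
# substitute <VERSION> and <APP_JAR_PATH> as the doc instructs.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyHwcApp \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<VERSION>.jar \
  <APP_JAR_PATH>
```

This is a command-line fragment that requires a live YARN cluster, so it is shown for shape only.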

0 commit comments