articles/hdinsight/interactive-query/apache-hive-warehouse-connector.md
This is a way to run Spark interactively through a modified version of the Scala shell.
### Spark-submit
Spark-submit is a utility to submit any Spark program (or job) to Spark clusters.
The spark-submit job sets up and configures Spark and the Hive Warehouse Connector per our instructions, executes the program we pass to it, and then cleanly releases the resources that were being used.
Once you build the Scala/Java code along with the dependencies into an assembly jar, use the below command to launch a Spark application. Replace `<VERSION>` and `<APP_JAR_PATH>` with the actual values.
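As a rough sketch, the launch command could look like the following. The `--class` value, the `--master` setting, and the connector jar location are illustrative assumptions (the HWC assembly path varies by cluster setup); `<VERSION>` and `/<APP_JAR_PATH>/myHwcAppProject.jar` are the placeholders described above.

```shell
# Hypothetical spark-submit invocation for an assembly jar built with HWC.
# <VERSION> and <APP_JAR_PATH> are placeholders to replace with actual values;
# the main class name and connector jar location below are assumptions.
spark-submit \
  --master yarn \
  --class com.example.MyHwcApp \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<VERSION>.jar \
  /<APP_JAR_PATH>/myHwcAppProject.jar
```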
This utility is also used when we have written the entire application in PySpark and packaged it into .py files (Python), so that we can submit the entire code to the Spark cluster for execution.
For Python applications, simply pass a .py file in the place of `/<APP_JAR_PATH>/myHwcAppProject.jar`, and add the configuration (Python .zip) file to the search path with `--py-files`.
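For example, a hypothetical PySpark submission might look like the following. The .zip file name, the connector jar location, and `/<APP_PATH>/myHwcApp.py` are illustrative assumptions following the placeholder conventions above.

```shell
# Hypothetical PySpark submission; the paths and file names below are
# illustrative placeholders, not verbatim values from this article.
spark-submit \
  --master yarn \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<VERSION>.jar \
  --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<VERSION>.zip \
  /<APP_PATH>/myHwcApp.py
```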
* [Examples of interacting with HiveWarehouseConnector using Zeppelin, Livy, spark-submit, and pyspark](https://community.hortonworks.com/articles/223626/integrating-apache-hive-with-apache-spark-hive-war.html)
* [Submitting Spark applications via the spark-submit utility](https://spark.apache.org/docs/2.4.0/submitting-applications.html)