Replace `<headnode-FQDN>` with the fully qualified domain name of a cluster headnode. To find the fully qualified domain name of a headnode, use the information in the [Manage HDInsight using the Apache Ambari REST API](../hdinsight-hadoop-manage-ambari-rest-api.md#get-the-fqdn-of-cluster-nodes) document.
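As a sketch of what that lookup can yield, the Ambari hosts endpoint returns JSON whose `host_name` values can be filtered for headnodes. The response below is a fabricated sample; the cluster name, password, and host names are placeholders:

```shell
# In practice, fetch the hosts list from Ambari, for example:
#   curl -u admin:$PASSWORD -sS -G "https://$CLUSTERNAME.azurehdinsight.net/api/v1/clusters/$CLUSTERNAME/hosts"
# The response below is a made-up sample for illustration only.
response='{"items":[{"Hosts":{"host_name":"hn0-mycluster.bx.internal.cloudapp.net"}},{"Hosts":{"host_name":"wn0-mycluster.bx.internal.cloudapp.net"}}]}'

# Pull out the host_name values and keep only headnodes (names start with "hn").
echo "$response" | grep -o '"host_name":"[^"]*"' | cut -d'"' -f4 | grep '^hn'
```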
### To an HDInsight Enterprise Security Package (ESP) cluster using Kerberos
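As a hedged sketch, connecting to an ESP cluster typically involves obtaining a Kerberos ticket with `kinit` and passing the Hive `principal` in the JDBC URL. The script below only assembles and prints such a URL; the headnode FQDN, realm, and port are placeholder assumptions to adapt to your environment:

```shell
# Sketch only: build a Kerberos-style Beeline JDBC URL for an ESP cluster.
# HEADNODE and REALM are placeholders; a real session first needs a ticket,
# for example: kinit user@REALM
HEADNODE="hn0-mycluster.bx.internal.cloudapp.net"
REALM="CONTOSO.COM"
JDBC_URL="jdbc:hive2://${HEADNODE}:10001/default;principal=hive/_HOST@${REALM};transportMode=http"
echo "$JDBC_URL"
# Then connect with: beeline -u "$JDBC_URL"
```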
### Use Beeline with Apache Spark
Apache Spark provides its own implementation of HiveServer2, which is sometimes referred to as the Spark Thrift server. This service uses Spark SQL to resolve queries instead of Hive, and may provide better performance depending on your query.
#### Through public or private endpoints
The connection string used is slightly different. Instead of containing `httpPath=/hive2`, it uses `httpPath=/sparkhive2`. Replace `clustername` with the name of your HDInsight cluster. Replace `admin` with the cluster login account for your cluster. For ESP clusters, use the full UPN (for example, [email protected]). Replace `password` with the password for the cluster login account.
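For illustration, the Spark Thrift server URL can be assembled like this. The cluster name and credentials are placeholders, and the `beeline` invocation is shown as a comment:

```shell
# Sketch: assemble the Beeline JDBC URL for the Spark Thrift server.
# Note httpPath=/sparkhive2 rather than /hive2; CLUSTERNAME is a placeholder.
CLUSTERNAME="mycluster"
JDBC_URL="jdbc:hive2://${CLUSTERNAME}.azurehdinsight.net:443/;ssl=true;transportMode=http;httpPath=/sparkhive2"
echo "$JDBC_URL"
# Then connect with: beeline -u "$JDBC_URL" -n admin -p 'password'
```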
* A Hadoop cluster on HDInsight. See [Get Started with HDInsight on Linux](./apache-hadoop-linux-tutorial-get-started.md).
* Notice the [URI scheme](../hdinsight-hadoop-linux-information.md#URI-and-scheme) for your cluster's primary storage. For example, `wasb://` for Azure Storage, `abfs://` for Azure Data Lake Storage Gen2, or `adl://` for Azure Data Lake Storage Gen1. If secure transfer is enabled for Azure Storage, the URI is `wasbs://`. For more information, see [secure transfer](../../storage/common/storage-require-secure-transfer.md).
* Option 1: An SSH client. For more information, see [Connect to HDInsight (Apache Hadoop) using SSH](../hdinsight-hadoop-linux-use-ssh-unix.md). Most of the steps in this document assume that you're using Beeline from an SSH session to the cluster.
This information describes the columns in the table.
5. Enter the following statements to create a table named **log4jLogs** by using sample data provided with the HDInsight cluster: (Revise as needed based on your [URI scheme](../hdinsight-hadoop-linux-information.md#URI-and-scheme).)
    ```hiveql
    DROP TABLE log4jLogs;
    CREATE EXTERNAL TABLE log4jLogs (
        t1 string, t2 string, t3 string, t4 string,
        t5 string, t6 string, t7 string)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
    STORED AS TEXTFILE LOCATION 'wasbs:///example/data/';
    SELECT t4 AS sev, COUNT(*) AS count FROM log4jLogs
        WHERE t4 = '[ERROR]' AND INPUT__FILE__NAME LIKE '%.log'
        GROUP BY t4;
    ```
## Run a HiveQL file
This example continues from the prior one. Use the following steps to create a file, then run it using Beeline.
1. Use the following command to create a file named **query.hql**:
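   One way to create the file is with a heredoc, sketched below. The query body is illustrative only, reusing the **log4jLogs** table from the earlier example:

    ```shell
    # Sketch: write a small HiveQL script to query.hql using a heredoc.
    # The query body is illustrative, not the document's exact script.
    cat > query.hql <<'EOF'
    SELECT t4 AS sev, COUNT(*) AS count FROM log4jLogs
        WHERE t4 = '[ERROR]'
        GROUP BY t4;
    EOF
    # Show what was written.
    cat query.hql
    ```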
## Install Beeline client
Although Beeline is included on the head nodes of your HDInsight cluster, you may want to install it on a local machine. The following steps install Beeline on the [Windows Subsystem for Linux](https://docs.microsoft.com/windows/wsl/install-win10).
1. Update package lists. Enter the following command in your bash shell:
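   On an Ubuntu-based WSL distribution, this is presumably the standard apt refresh (requires sudo and network access):

    ```shell
    # Presumed command: refresh the apt package lists.
    sudo apt update
    ```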
1. Install Java if it isn't already installed:

    ```bash
    sudo apt install openjdk-11-jre-headless
    ```
1. Open the bashrc file (usually found in ~/.bashrc): `nano ~/.bashrc`.
1. Amend the bashrc file. Add the following line at the end of the file:
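   A plausible line to add, assuming the OpenJDK 11 package installed above (the variable and path are assumptions; verify the JVM path on your system):

    ```shell
    # Assumed example: point JAVA_HOME at the OpenJDK 11 runtime.
    # This path is typical for Ubuntu's openjdk-11-jre-headless; verify yours.
    export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
    echo "$JAVA_HOME"
    ```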