Commit 7cec8bb

Commit message: freshness_c3

1 parent 2d5fa71 commit 7cec8bb

File tree

1 file changed (+9 -8 lines changed)

articles/hdinsight/hadoop/apache-hadoop-use-hive-beeline.md

Lines changed: 9 additions & 8 deletions
@@ -6,7 +6,7 @@ ms.author: hrasheed
 ms.reviewer: jasonh
 ms.service: hdinsight
 ms.topic: conceptual
-ms.date: 03/09/2020
+ms.date: 04/17/2020
 ---

 # Use the Apache Beeline client with Apache Hive
@@ -36,6 +36,7 @@ beeline -u 'jdbc:hive2://<headnode-FQDN>:10001/;transportMode=http'
 ```

 Replace `<headnode-FQDN>` with the fully qualified domain name of a cluster headnode. To find the fully qualified domain name of a headnode, use the information in the [Manage HDInsight using the Apache Ambari REST API](../hdinsight-hadoop-manage-ambari-rest-api.md#get-the-fqdn-of-cluster-nodes) document.
+
 ---

 ### To HDInsight Enterprise Security Package (ESP) cluster using Kerberos
@@ -79,11 +80,11 @@ Private endpoints point to a basic load balancer, which can only be accessed fro

 ### Use Beeline with Apache Spark

-Apache Spark provides its own implementation of HiveServer2, which is sometimes referred to as the Spark Thrift server. This service uses Spark SQL to resolve queries instead of Hive, and may provide better performance depending on your query.
+Apache Spark provides its own implementation of HiveServer2, which is sometimes referred to as the Spark Thrift server. This service uses Spark SQL to resolve queries instead of Hive. And may provide better performance depending on your query.

 #### Through public or private endpoints

-The connection string used is slightly different. Instead of containing `httpPath=/hive2` it's `httpPath/sparkhive2`. Replace `clustername` with the name of your HDInsight cluster. Replace `admin` with the cluster login account for your cluster. For ESP clusters, use the full UPN (for example, [email protected]). Replace `password` with the password for the cluster login account.
+The connection string used is slightly different. Instead of containing `httpPath=/hive2` it uses `httpPath/sparkhive2`. Replace `clustername` with the name of your HDInsight cluster. Replace `admin` with the cluster login account for your cluster. For ESP clusters, use the full UPN (for example, [email protected]). Replace `password` with the password for the cluster login account.

 ```bash
 beeline -u 'jdbc:hive2://clustername.azurehdinsight.net:443/;ssl=true;transportMode=http;httpPath=/sparkhive2' -n admin -p 'password'
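The substitutions described in the `+` paragraph of this hunk can be sketched as a small script. The cluster name and credentials below are placeholders, not values from this commit or the article:

```shell
#!/bin/sh
# Sketch only: CLUSTERNAME is an assumption -- replace it (and the
# credentials) with your own values.
CLUSTERNAME="mycluster"

# Spark Thrift server endpoint: note httpPath=/sparkhive2 rather than the
# Hive endpoint's httpPath=/hive2.
JDBC_URL="jdbc:hive2://${CLUSTERNAME}.azurehdinsight.net:443/;ssl=true;transportMode=http;httpPath=/sparkhive2"
echo "$JDBC_URL"

# With network access to the cluster you would then run (commented out here):
# beeline -u "$JDBC_URL" -n admin -p 'password' -e 'SHOW TABLES;'
```

For an ESP cluster, the value passed to `-n` would be the full UPN rather than `admin`, per the paragraph above.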
@@ -113,7 +114,7 @@ When connecting directly from the cluster head node, or from a resource inside t

 * A Hadoop cluster on HDInsight. See [Get Started with HDInsight on Linux](./apache-hadoop-linux-tutorial-get-started.md).

-* Notice the [URI scheme](../hdinsight-hadoop-linux-information.md#URI-and-scheme) for your cluster's primary storage. For example, `wasb://` for Azure Storage, `abfs://` for Azure Data Lake Storage Gen2, or `adl://` for Azure Data Lake Storage Gen1. If secure transfer is enabled for Azure Storage, the URI is `wasbs://`. For more information, see [secure transfer](../../storage/common/storage-require-secure-transfer.md).
+* Notice the URI scheme for your cluster's primary storage. For example, `wasb://` for Azure Storage, `abfs://` for Azure Data Lake Storage Gen2, or `adl://` for Azure Data Lake Storage Gen1. If secure transfer is enabled for Azure Storage, the URI is `wasbs://`. For more information, see [secure transfer](../../storage/common/storage-require-secure-transfer.md).

 * Option 1: An SSH client. For more information, see [Connect to HDInsight (Apache Hadoop) using SSH](../hdinsight-hadoop-linux-use-ssh-unix.md). Most of the steps in this document assume that you're using Beeline from an SSH session to the cluster.

@@ -172,7 +173,7 @@ This example is based on using the Beeline client from an SSH connection.

 This information describes the columns in the table.

-5. Enter the following statements to create a table named **log4jLogs** by using sample data provided with the HDInsight cluster: (Revise as needed based on your [URI scheme](../hdinsight-hadoop-linux-information.md#URI-and-scheme).)
+5. Enter the following statements to create a table named **log4jLogs** by using sample data provided with the HDInsight cluster: (Revise as needed based on your URI scheme.)

 ```hiveql
 DROP TABLE log4jLogs;
@@ -239,7 +240,7 @@ This example is based on using the Beeline client from an SSH connection.

 ## Run a HiveQL file

-This is a continuation from the prior example. Use the following steps to create a file, then run it using Beeline.
+This example is a continuation from the prior example. Use the following steps to create a file, then run it using Beeline.

 1. Use the following command to create a file named **query.hql**:
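The hunk is truncated before the article's actual command and file contents, so those stay elided here. As a hedged sketch of the general step (write a HiveQL file, then hand it to Beeline with `-f`), with illustrative file contents that are NOT the article's `query.hql`:

```shell
# Sketch: write a small HiveQL file. The statement below is an illustrative
# assumption, not the verbatim query.hql from the article.
cat > query.hql <<'EOF'
-- illustrative statement only
SHOW TABLES;
EOF

# From an SSH session on a cluster head node you would then run (commented out):
# beeline -u 'jdbc:hive2://headnodehost:10001/;transportMode=http' -f query.hql
```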
@@ -295,7 +296,7 @@ This is a continuation from the prior example. Use the following steps to create

 ## Install beeline client

-Although Beeline is included on the head nodes of your HDInsight cluster, you may want to install it on a local machine. The steps below to install Beeline on a local machine are based on a [Windows Subsystem for Linux](https://docs.microsoft.com/windows/wsl/install-win10).
+Although Beeline is included on the head nodes, you may want to install it locally. The install steps for a local machine are based on a [Windows Subsystem for Linux](https://docs.microsoft.com/windows/wsl/install-win10).

 1. Update package lists. Enter the following command in your bash shell:
@@ -311,7 +312,7 @@ Although Beeline is included on the head nodes of your HDInsight cluster, you ma
 sudo apt install openjdk-11-jre-headless
 ```

-1. Open the bashrc file (usually found in ~/.bashrc): `nano ~/.bashrc`.
+1. Open the bashrc file (often found in ~/.bashrc): `nano ~/.bashrc`.

 1. Amend the bashrc file. Add the following line at the end of the file:
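The line the article adds to `~/.bashrc` is cut off by this hunk and stays elided. As a hedged sketch of appending a line to a bashrc-style file without duplicating it on repeated runs (the demo filename and the PATH export are placeholder assumptions, not the article's values):

```shell
# Sketch: append a line to a bashrc-style file only if it is not already
# present. BASHRC and LINE are hypothetical placeholders -- for real use,
# substitute "$HOME/.bashrc" and the line given in the article.
BASHRC="./bashrc.demo"
LINE='export PATH=$PATH:/usr/local/hive/bin'   # hypothetical example line

grep -qxF -- "$LINE" "$BASHRC" 2>/dev/null || printf '%s\n' "$LINE" >> "$BASHRC"
grep -qxF -- "$LINE" "$BASHRC" 2>/dev/null || printf '%s\n' "$LINE" >> "$BASHRC"   # second run appends nothing
```

The `grep -qxF` guard matches the whole line as a fixed string, which is why running the snippet twice leaves only one copy.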

0 commit comments
