Commit 14ad08d

Improved Correctness score

1 parent 470228d commit 14ad08d

File tree

1 file changed: +11 −11 lines changed

articles/hdinsight/hbase/apache-hbase-tutorial-get-started-linux.md

Lines changed: 11 additions & 11 deletions
@@ -4,7 +4,7 @@ description: Follow this Apache HBase tutorial to start using hadoop on HDInsigh
 ms.service: azure-hdinsight
 ms.topic: tutorial
 ms.custom: hdinsightactive, linux-related-content
-ms.date: 05/10/2024
+ms.date: 12/23/2024
 ---
 
 # Tutorial: Use Apache HBase in Azure HDInsight
@@ -24,13 +24,13 @@ In this tutorial, you learn how to:
 
 * An SSH client. For more information, see [Connect to HDInsight (Apache Hadoop) using SSH](../hdinsight-hadoop-linux-use-ssh-unix.md).
 
-* Bash. The examples in this article use the Bash shell on Windows 10 for the curl commands. See [Windows Subsystem for Linux Installation Guide for Windows 10](/windows/wsl/install-win10) for installation steps. Other [Unix shells](https://www.gnu.org/software/bash/) will work as well. The curl examples, with some slight modifications, can work on a Windows Command prompt. Or you can use the Windows PowerShell cmdlet [Invoke-RestMethod](/powershell/module/microsoft.powershell.utility/invoke-restmethod).
+* Bash. The examples in this article use the Bash shell on Windows 10 for the curl commands. See [Windows Subsystem for Linux Installation Guide for Windows 10](/windows/wsl/install-win10) for installation steps. Other [Unix shells](https://www.gnu.org/software/bash/) work as well. The curl examples, with some slight modifications, can work on a Windows Command prompt. Or you can use the Windows PowerShell cmdlet [Invoke-RestMethod](/powershell/module/microsoft.powershell.utility/invoke-restmethod).
 
 ## Create Apache HBase cluster
 
-The following procedure uses an Azure Resource Manager template to create an HBase cluster. The template also creates the dependent default Azure Storage account. To understand the parameters used in the procedure and other cluster creation methods, see [Create Linux-based Hadoop clusters in HDInsight](../hdinsight-hadoop-provision-linux-clusters.md).
+The following procedure uses an Azure Resource Manager template to create a HBase cluster. The template also creates the dependent default Azure Storage account. To understand the parameters used in the procedure and other cluster creation methods, see [Create Linux-based Hadoop clusters in HDInsight](../hdinsight-hadoop-provision-linux-clusters.md).
 
-1. Select the following image to open the template in the Azure portal. The template is located in [Azure quickstart templates](https://azure.microsoft.com/resources/templates/).
+1. Select the following image to open the template in the Azure portal. The template is located in [Azure Quickstart templates](https://azure.microsoft.com/resources/templates/).
 
    <a href="https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.hdinsight%2Fhdinsight-hbase-linux%2Fazuredeploy.json" target="_blank"><img src="./media/apache-hbase-tutorial-get-started-linux/hdi-deploy-to-azure1.png" alt="Deploy to Azure button for new cluster"></a>
 
@@ -51,7 +51,7 @@ The following procedure uses an Azure Resource Manager template to create an HBa
 
 3. Select **I agree to the terms and conditions stated above**, and then select **Purchase**. It takes about 20 minutes to create a cluster.
 
-   After an HBase cluster is deleted, you can create another HBase cluster by using the same default blob container. The new cluster picks up the HBase tables you created in the original cluster. To avoid inconsistencies, we recommend that you disable the HBase tables before you delete the cluster.
+   After a HBase cluster is deleted, you can create another HBase cluster by using the same default blob container. The new cluster picks up the HBase tables you created in the original cluster. To avoid inconsistencies, we recommend that you disable the HBase tables before you delete the cluster.
 
 ## Create tables and insert data
 
@@ -67,7 +67,7 @@ In HBase (an implementation of [Cloud BigTable](https://cloud.google.com/bigtabl
 
 **To use the HBase shell**
 
-1. Use `ssh` command to connect to your HBase cluster. Edit the command below by replacing `CLUSTERNAME` with the name of your cluster, and then enter the command:
+1. Use `ssh` command to connect to your HBase cluster. Edit the following command by replacing `CLUSTERNAME` with the name of your cluster, and then enter the command:
 
    ```cmd
@@ -79,7 +79,7 @@ In HBase (an implementation of [Cloud BigTable](https://cloud.google.com/bigtabl
    hbase shell
    ```
 
-1. Use `create` command to create an HBase table with two-column families. The table and column names are case-sensitive. Enter the following command:
+1. Use `create` command to create a HBase table with two-column families. The table and column names are case-sensitive. Enter the following command:
 
    ```hbaseshell
    create 'Contacts', 'Personal', 'Office'
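As an aside on the `create` step shown in this hunk: the same HBase shell commands can also be run non-interactively by piping a script into `hbase shell`. A minimal sketch (assumes the `hbase` CLI is on the PATH of a cluster head node; here the script is only printed, not executed against a cluster):

```shell
# Build an HBase shell script matching the tutorial: a 'Contacts' table
# with two column families, 'Personal' and 'Office'.
hbase_script=$(cat <<'EOF'
create 'Contacts', 'Personal', 'Office'
list
EOF
)
# Only print it here; on the cluster you would pipe it into the shell:
#   echo "$hbase_script" | hbase shell
echo "$hbase_script"
```

This is handy for scripting table setup during cluster provisioning, rather than typing commands into an interactive shell session.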
@@ -204,15 +204,15 @@ You can query data in HBase tables by using [Apache Hive](https://hive.apache.or
 The Hive query to access HBase data need not be executed from the HBase cluster. Any cluster that comes with Hive (including Spark, Hadoop, HBase, or Interactive Query) can be used to query HBase data, provided the following steps are completed:
 
 1. Both clusters must be attached to the same Virtual Network and Subnet
-2. Copy `/usr/hdp/$(hdp-select --version)/hbase/conf/hbase-site.xml` from the HBase cluster headnodes to the Hive cluster headnodes and workernodes.
+2. Copy `/usr/hdp/$(hdp-select --version)/hbase/conf/hbase-site.xml` from the HBase cluster headnodes to the Hive cluster headnodes and worker nodes.
 
 ### Secure Clusters
 
 HBase data can also be queried from Hive using ESP-enabled HBase:
 
 1. When following a multi-cluster pattern, both clusters must be ESP-enabled.
 2. To allow Hive to query HBase data, make sure that the `hive` user is granted permissions to access the HBase data via the Hbase Apache Ranger plugin
-3. When using separate, ESP-enabled clusters, the contents of `/etc/hosts` from the HBase cluster headnodes must be appended to `/etc/hosts` of the Hive cluster headnodes and workernodes.
+3. When you use separate, ESP-enabled clusters, the contents of `/etc/hosts` from the HBase cluster headnodes must be appended to `/etc/hosts` of the Hive cluster headnodes and worker nodes.
 > [!NOTE]
 > After scaling either clusters, `/etc/hosts` must be appended again
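A sketch of the copy in step 2 of the hunk above. The host name `hn0-hive` and the destination directory are placeholders, not taken from the article; on a real head node `hdp-select --version` resolves the installed HDP version, so this sketch falls back to a dummy value when that tool is absent:

```shell
# Resolve the hbase-site.xml path; fall back to a placeholder version
# when hdp-select is not installed (e.g. when trying this sketch locally).
HDP_VERSION=$(hdp-select --version 2>/dev/null || echo 'VERSION')
SRC_CONF="/usr/hdp/${HDP_VERSION}/hbase/conf/hbase-site.xml"
DEST_HOST='hn0-hive'   # placeholder name for a Hive cluster head node
# Only print the copy command here; on the cluster you would run it,
# once per Hive head node and worker node:
echo "scp ${SRC_CONF} sshuser@${DEST_HOST}:/tmp/"
```

The copy must reach every Hive head node and worker node, which is why teams often wrap this in a loop over the node list rather than running it by hand.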
@@ -245,7 +245,7 @@ The HBase REST API is secured via [basic authentication](https://en.wikipedia.or
    fi
    ```
 
-1. Set environment variable for ease of use. Edit the commands below by replacing `MYPASSWORD` with the cluster login password. Replace `MYCLUSTERNAME` with the name of your HBase cluster. Then enter the commands.
+1. Set environment variable for ease of use. Edit the following commands by replacing `MYPASSWORD` with the cluster login password. Replace `MYCLUSTERNAME` with the name of your HBase cluster. Then enter the commands.
 
    ```bash
    export PASSWORD='MYPASSWORD'
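As a side note on the variables in this hunk: once `PASSWORD` and `CLUSTERNAME` are set, HBase REST requests on HDInsight go through the cluster gateway. A sketch using the tutorial's placeholder values (the `/hbaserest` path is the gateway route HDInsight uses for HBase REST; the curl call is shown as a comment, not executed):

```shell
export PASSWORD='MYPASSWORD'        # placeholder, as in the tutorial
export CLUSTERNAME='MYCLUSTERNAME'  # placeholder, as in the tutorial
# The HBase REST API is exposed behind the HDInsight gateway:
REST_URL="https://${CLUSTERNAME}.azurehdinsight.net/hbaserest"
echo "$REST_URL"
# A typical basic-auth call (not run here) would look like:
#   curl -u admin:"$PASSWORD" "$REST_URL/version/cluster" -H 'Accept: application/json'
```

Keeping the password in an environment variable avoids retyping it in every curl command, but note it remains visible in shell history and the process environment.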
@@ -307,7 +307,7 @@ For more information about HBase Rest, see [Apache HBase Reference Guide](https:
 > [!NOTE]
 > Thrift is not supported by HBase in HDInsight.
 >
-> When using Curl or any other REST communication with WebHCat, you must authenticate the requests by providing the user name and password for the HDInsight cluster administrator. You must also use the cluster name as part of the Uniform Resource Identifier (URI) used to send the requests to the server:
+> When you use Curl or any other REST communication with WebHCat, you must authenticate the requests by providing the user name and password for the HDInsight cluster administrator. You must also use the cluster name as part of the Uniform Resource Identifier (URI) used to send the requests to the server:
 >
 > `curl -u <UserName>:<Password> \`
 >
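A sketch of the authenticated WebHCat request pattern described in the note above. The `/templeton/v1/status` path is the standard WebHCat base route, assumed here rather than taken from this diff, and the command is printed rather than executed:

```shell
USERNAME='admin'             # placeholder cluster administrator name
CLUSTERNAME='MYCLUSTERNAME'  # placeholder cluster name
# WebHCat is reached through the cluster URI, with basic authentication:
WEBHCAT_URL="https://${CLUSTERNAME}.azurehdinsight.net/templeton/v1/status"
echo "curl -u ${USERNAME}:<Password> ${WEBHCAT_URL}"
```

The cluster name in the URI is what routes the request to the right cluster's gateway, which is why it cannot be omitted even when credentials are supplied.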
