articles/hdinsight/hdinsight-hadoop-emulator-visual-studio.md
5 additions & 6 deletions
@@ -2,13 +2,12 @@
 title: Data Lake tools for Visual Studio with Hortonworks Sandbox - Azure HDInsight
 description: Learn how to use the Azure Data Lake tools for Visual Studio with the Hortonworks sandbox running in a local VM. With these tools, you can create and run Hive and Pig jobs on the sandbox, and view job output and history.
 author: hrasheed-msft
+ms.author: hrasheed
 ms.reviewer: jasonh
-
 ms.service: hdinsight
 ms.custom: hdinsightactive
 ms.topic: conceptual
 ms.date: 05/07/2018
-ms.author: hrasheed
 ---
# Use the Azure Data Lake tools for Visual Studio with the Hortonworks Sandbox
@@ -37,7 +36,7 @@ Make sure that the Hortonworks Sandbox is running. Then follow the steps in the

2. From **Server Explorer**, right-click the **HDInsight** entry, and then select **Connect to HDInsight Emulator**.

-
+

3. From the **Connect to HDInsight Emulator** dialog box, enter the password that you configured for Ambari.
@@ -108,7 +107,7 @@ Hive provides a SQL-like query language (HiveQL) for working with structured dat

> [!NOTE]
> The information is the same as that available from the **Job Log** link after a job has finished.

-
+

## Create a Hive project
@@ -118,7 +117,7 @@ You can also create a project that contains multiple Hive scripts. Use a project

2. From the list of projects, expand **Templates**, expand **Azure Data Lake**, and then select **HIVE (HDInsight)**. From the list of templates, select **Hive Sample**. Enter a name and location, and then select **OK**.

-
+

The **Hive Sample** project contains two scripts, **WebLogAnalysis.hql** and **SensorDataAnalysis.hql**. You can submit these scripts by using the same **Submit** button at the top of the window.
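The contents of the sample scripts are not reproduced in this article. As a rough, hypothetical sketch of the kind of HiveQL a script like **WebLogAnalysis.hql** contains (the table name, columns, and storage path below are illustrative, not the actual script):

```sql
-- Hypothetical sketch only; not the actual WebLogAnalysis.hql.
-- A typical HiveQL script first declares an external table over raw
-- files in storage, then queries it.
DROP TABLE IF EXISTS weblogs;
CREATE EXTERNAL TABLE weblogs (
    log_date  STRING,
    client_ip STRING,
    url       STRING,
    status    INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/example/data/weblogs';

-- Summarize requests per HTTP status code.
SELECT status, COUNT(*) AS hits
FROM weblogs
GROUP BY status;
```

Submitting a script like this from the editor runs it as a Hive job on the sandbox, and the job output and history are then viewable through the same tooling described above.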
@@ -175,7 +174,7 @@ Data Lake tools also allow you to easily view information about jobs that have b

2. Expanding a table displays the columns for that table. To quickly view the data, right-click a table, and select **View Top 100 Rows**.

-
+
articles/hdinsight/hdinsight-hadoop-hue-linux.md
21 additions & 24 deletions
@@ -3,20 +3,20 @@ title: Hue with Hadoop on HDInsight Linux-based clusters - Azure
 description: Learn how to install Hue on HDInsight clusters and use tunneling to route the requests to Hue. Use Hue to browse storage and run Hive or Pig.
 keywords: hue hadoop
 author: hrasheed-msft
+ms.date: 12/11/2017
+ms.author: hrasheed
 ms.reviewer: jasonh
-
 ms.service: hdinsight
 ms.custom: hdinsightactive,hdiseo17may2017
 ms.topic: conceptual
-ms.date: 12/11/2017
-ms.author: hrasheed
-
 ---
+

# Install and use Hue on HDInsight Hadoop clusters

Learn how to install Hue on HDInsight clusters and use tunneling to route the requests to Hue.

## What is Hue?

Hue is a set of Web applications used to interact with an Apache Hadoop cluster. You can use Hue to browse the storage associated with a Hadoop cluster (WASB, in the case of HDInsight clusters), run Hive jobs and Pig scripts, and so on. The following components are available with Hue installations on an HDInsight Hadoop cluster.
* Beeswax Hive Editor
@@ -30,8 +30,6 @@ Hue is a set of Web applications used to interact with an Apache Hadoop cluster.
> Components provided with the HDInsight cluster are fully supported, and Microsoft Support will help to isolate and resolve issues related to these components.
>
> Custom components receive commercially reasonable support to help you further troubleshoot the issue. This might result in resolving the issue, or in asking you to engage available channels for the open-source technologies where deep expertise for that technology is found. For example, there are many community sites that can be used, like the [MSDN forum for HDInsight](https://social.msdn.microsoft.com/Forums/azure/en-US/home?forum=hdinsight) and [https://stackoverflow.com](https://stackoverflow.com). Also, Apache projects have project sites on [https://apache.org](https://apache.org), for example: [Hadoop](https://hadoop.apache.org/).
->
->
## Install Hue using Script Actions

@@ -41,16 +39,13 @@ This section provides instructions about how to use the script when provisioning

> [!NOTE]
> Azure PowerShell, the Azure Classic CLI, the HDInsight .NET SDK, or Azure Resource Manager templates can also be used to apply script actions. You can also apply script actions to already running clusters. For more information, see [Customize HDInsight clusters with Script Actions](hdinsight-hadoop-customize-cluster-linux.md).
->
->

1. Start provisioning a cluster by using the steps in [Provision HDInsight clusters on Linux](hdinsight-hadoop-provision-linux-clusters.md), but do not complete provisioning.

   > [!NOTE]
   > To install Hue on HDInsight clusters, the recommended headnode size is at least A4 (8 cores, 14 GB memory).
->
->
-2. On the **Optional Configuration** blade, select **Script Actions**, and provide the information as shown below:
+
+1. On the **Optional Configuration** blade, select **Script Actions**, and provide the information as shown below:

@@ -60,17 +55,17 @@ This section provides instructions about how to use the script when provisioning
 * **WORKER**: Leave this blank.
 * **ZOOKEEPER**: Leave this blank.
 * **PARAMETERS**: Leave this blank.
-3. At the bottom of the **Script Actions**, use the **Select** button to save the configuration. Finally, use the **Select** button at the bottom of the **Optional Configuration** blade to save the optional configuration information.
-4. Continue provisioning the cluster as described in [Provision HDInsight clusters on Linux](hdinsight-hadoop-provision-linux-clusters.md).
+
+1. At the bottom of the **Script Actions**, use the **Select** button to save the configuration. Finally, use the **Select** button at the bottom of the **Optional Configuration** blade to save the optional configuration information.
+
+1. Continue provisioning the cluster as described in [Provision HDInsight clusters on Linux](hdinsight-hadoop-provision-linux-clusters.md).
## Use Hue with HDInsight clusters

SSH tunneling is the only way to access Hue on the cluster once it is running. Tunneling via SSH allows the traffic to go directly to the headnode of the cluster where Hue is running. After the cluster has finished provisioning, use the following steps to use Hue on an HDInsight Linux cluster.

> [!NOTE]
> We recommend using the Firefox web browser to follow the instructions below.
->
->

1. Use the information in [Use SSH Tunneling to access Apache Ambari web UI, ResourceManager, JobHistory, NameNode, Oozie, and other web UI's](hdinsight-linux-ambari-ssh-tunnel.md) to create an SSH tunnel from your client system to the HDInsight cluster, and then configure your Web browser to use the SSH tunnel as a proxy.
@@ -87,38 +82,39 @@ SSH Tunneling is the only way to access Hue on the cluster once it is running. T
   This is the hostname of the primary headnode where the Hue website is located.

4. Use the browser to open the Hue portal at http://HOSTNAME:8888. Replace HOSTNAME with the name you obtained in the previous step.

   > [!NOTE]
   > When you log in for the first time, you will be prompted to create an account to log in to the Hue portal. The credentials you specify here are limited to the portal and are not related to the admin or SSH user credentials you specified while provisioning the cluster.
->
->

-
+
### Run a Hive query

1. From the Hue portal, click **Query Editors**, and then click **Hive** to open the Hive editor.

2. On the **Assist** tab, under **Database**, you should see **hivesampletable**. This is a sample table that is shipped with all Hadoop clusters on HDInsight. Enter a sample query in the right pane and see the output on the **Results** tab in the pane below, as shown in the screen capture. You can also use the **Chart** tab to see a visual representation of the result.
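For example, a simple query you could enter in the editor might look like the following sketch. It assumes **hivesampletable** has its usual schema, which includes a `devicemake` column; adjust the column name if your cluster's sample table differs.

```sql
-- Count rows per device make in the sample table, largest first.
-- Assumes the standard hivesampletable schema with a devicemake column.
SELECT devicemake, COUNT(*) AS total
FROM hivesampletable
GROUP BY devicemake
ORDER BY total DESC
LIMIT 10;
```

The **Results** tab shows the grouped counts, and the **Chart** tab can render the same result set as, for example, a bar chart.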
### Browse the cluster storage

1. From the Hue portal, click **File Browser** in the top-right corner of the menu bar.

2. By default, the file browser opens at the **/user/myuser** directory. Click the forward slash right before the user directory in the path to go to the root of the Azure storage container associated with the cluster.

3. Right-click a file or folder to see the available operations. Use the **Upload** button in the right corner to upload files to the current directory. Use the **New** button to create new files or directories.

   > [!NOTE]
   > The Hue file browser can only show the contents of the default container associated with the HDInsight cluster. Any additional storage accounts/containers that you might have associated with the cluster will not be accessible using the file browser. However, the additional containers associated with the cluster will always be accessible for Hive jobs. For example, if you enter the command `dfs -ls wasb://newcontainer@<storage-account>.blob.core.windows.net` in the Hive editor, you can see the contents of additional containers as well. In this command, **newcontainer** is not the default container associated with a cluster.
## Important considerations

1. The script used to install Hue installs it only on the primary headnode of the cluster.

2. During installation, multiple Hadoop services (HDFS, YARN, MR2, Oozie) are restarted to update the configuration. After the script finishes installing Hue, it might take some time for the other Hadoop services to start up. This might affect Hue's performance initially. Once all services start up, Hue is fully functional.

@@ -128,12 +124,13 @@ SSH Tunneling is the only way to access Hue on the cluster once it is running. T

4. With Linux clusters, you can have a scenario where your services are running on the primary headnode while the Resource Manager runs on the secondary headnode. Such a scenario might result in errors (shown below) when you use Hue to view details of RUNNING jobs on the cluster. However, you can view the job details once the job has completed. This is due to a known issue. As a workaround, modify Ambari so that the active Resource Manager also runs on the primary headnode.

5. Hue understands WebHDFS, while HDInsight clusters use Azure Storage via `wasb://`. So the custom script used with the script action installs WebWasb, a WebHDFS-compatible service for talking to WASB. As a result, even though the Hue portal says HDFS in places (for example, when you move your mouse over the **File Browser**), it should be interpreted as WASB.
## Next steps

* [Install Apache Giraph on HDInsight clusters](hdinsight-hadoop-giraph-install-linux.md). Use cluster customization to install Giraph on HDInsight Hadoop clusters. Giraph allows you to perform graph processing using Hadoop, and it can be used with Azure HDInsight.
* [Install R on HDInsight clusters](hdinsight-hadoop-r-scripts-linux.md). Use cluster customization to install R on HDInsight Hadoop clusters. R is an open-source language and environment for statistical computing. It provides hundreds of built-in statistical functions and its own programming language that combines aspects of functional and object-oriented programming. It also provides extensive graphical capabilities.