
Commit fe7439d

Merge pull request #89620 from dagiro/cats161

cats161

2 parents b018ba4 + 4497fb0

File tree: 1 file changed (+17, -17 lines)

articles/hdinsight/spark/apache-spark-eclipse-tool-plugin.md

Lines changed: 17 additions & 17 deletions
@@ -70,23 +70,23 @@ You can link a normal cluster by using the Ambari managed username. Similarly, f
1. Select **Link a cluster** from **Azure Explorer**.

73- ![link cluster context menu](./media/apache-spark-eclipse-tool-plugin/link-a-cluster-context-menu.png)
73+ ![Azure Explorer link cluster menu](./media/apache-spark-eclipse-tool-plugin/link-a-cluster-context-menu.png)

1. Enter the **Cluster Name**, **User Name**, and **Password**, and then select **OK** to link the cluster. Optionally, enter the Storage Account and Storage Key, and then select the Storage Container so that the storage explorer works in the left tree view.

77- ![link cluster dialog](./media/apache-spark-eclipse-tool-plugin/link-cluster-dialog1.png)
77+ ![Link New HDInsight cluster dialog](./media/apache-spark-eclipse-tool-plugin/link-cluster-dialog1.png)

> [!NOTE]
> We use the linked storage key, username, and password if the cluster is both signed in through the Azure subscription and linked as a cluster.
81- > ![storage explorer in Eclipse](./media/apache-spark-eclipse-tool-plugin/storage-explorer-in-Eclipse.png)
81+ > ![Azure Explorer storage accounts](./media/apache-spark-eclipse-tool-plugin/storage-explorer-in-Eclipse.png)

1. If the input information is correct, the linked cluster appears under the **HDInsight** node after you select **OK**. You can now submit an application to this linked cluster.

85- ![linked cluster](./media/apache-spark-eclipse-tool-plugin/hdinsight-linked-cluster.png)
85+ ![Azure Explorer hdi linked cluster](./media/apache-spark-eclipse-tool-plugin/hdinsight-linked-cluster.png)

1. You can also unlink a cluster from **Azure Explorer**.

89- ![unlinked cluster](./media/apache-spark-eclipse-tool-plugin/hdi-unlinked-cluster.png)
89+ ![Azure Explorer unlinked cluster](./media/apache-spark-eclipse-tool-plugin/hdi-unlinked-cluster.png)

## Set up a Spark Scala project for an HDInsight Spark cluster

@@ -98,7 +98,7 @@ You can link a normal cluster by using the Ambari managed username. Similarly, f
1. The Scala project creation wizard automatically detects whether you installed the Scala plug-in. Select **OK** to continue downloading the Scala plug-in, and then follow the instructions to restart Eclipse.

101- ![Scala check](./media/apache-spark-eclipse-tool-plugin/auto-installation-scala2.png)
101+ ![Install missing plugin Scala check](./media/apache-spark-eclipse-tool-plugin/auto-installation-scala2.png)

1. In the **New HDInsight Scala Project** dialog box, provide the following values, and then select **Next**:
* Enter a name for the project.
@@ -115,10 +115,10 @@ You can link a normal cluster by using the Ambari managed username. Similarly, f
1. In the **Select a wizard** dialog box, expand **Scala Wizards**, select **Scala Object**, and then select **Next**.

118- ![Select a wizard dialog box](./media/apache-spark-eclipse-tool-plugin/create-scala-project1.png)
118+ ![Select a wizard Create a Scala Object](./media/apache-spark-eclipse-tool-plugin/create-scala-project1.png)

1. In the **Create New File** dialog box, enter a name for the object, and then select **Finish**.

121- ![Create New File dialog box](./media/apache-spark-eclipse-tool-plugin/create-scala-project2.png)
121+ ![New File Wizard Create New File](./media/apache-spark-eclipse-tool-plugin/create-scala-project2.png)

1. Paste the following code in the text editor:

```scala
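// The article's actual sample is cut off by this diff hunk. What follows is only an
// illustrative sketch in the spirit of the HDInsight HVAC example; the HdiSamples
// path and the object name are assumptions, not content confirmed by this commit.
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object MyClusterApp {
  def main(arg: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MyClusterApp")
    val sc = new SparkContext(conf)

    // Read the sample sensor data from the cluster's default storage (assumed path).
    val rdd = sc.textFile("wasbs:///HdiSamples/HdiSamples/SensorSampleData/hvac/HVAC.csv")

    // Keep rows whose seventh column holds a single-digit value, and write them out.
    val rdd1 = rdd.filter(s => s.split(",")(6).length() == 1)
    rdd1.saveAsTextFile("wasbs:///HVACOut")
  }
}
```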
@@ -151,11 +151,11 @@ You can link a normal cluster by using the Ambari managed username. Similarly, f
* In the **Main class name** drop-down list, the submission wizard displays all object names from your project. Select or enter one that you want to run. If you selected an artifact from a hard drive, you must enter the main class name manually.
* Because the application code in this example does not require any command-line arguments or reference JARs or files, you can leave the remaining text boxes empty.

154- ![Spark Submission dialog box](./media/apache-spark-eclipse-tool-plugin/create-scala-project3.png)
154+ ![Apache Spark Submission dialog box](./media/apache-spark-eclipse-tool-plugin/create-scala-project3.png)

1. The **Spark Submission** tab should start displaying the progress. You can stop the application by selecting the red button in the **Spark Submission** window. You can also view the logs for this specific application run by selecting the globe icon (denoted by the blue box in the image).

158- ![Spark Submission window](./media/apache-spark-eclipse-tool-plugin/create-scala-project4.png)
158+ ![Apache Spark Submission window](./media/apache-spark-eclipse-tool-plugin/create-scala-project4.png)

## Access and manage HDInsight Spark clusters by using HDInsight Tools in Azure Toolkit for Eclipse

@@ -165,15 +165,15 @@ You can perform various operations by using HDInsight Tools, including accessing
1. In Azure Explorer, expand **HDInsight**, expand the Spark cluster name, and then select **Jobs**.

168- ![Job view node](./media/apache-spark-eclipse-tool-plugin/eclipse-job-view-node.png)
168+ ![Azure Explorer Eclipse job view node](./media/apache-spark-eclipse-tool-plugin/eclipse-job-view-node.png)

170- 1. Select the **Jobs** node. If the Java version is lower than **1.8**, HDInsight Tools automatically reminds you to install the **E(fx)clipse** plug-in. Select **OK** to continue, and then follow the wizard to install it from the Eclipse Marketplace and restart Eclipse.
170+ 1. Select the **Jobs** node. If the Java version is lower than **1.8**, HDInsight Tools automatically reminds you to install the **E(fx)clipse** plug-in. Select **OK** to continue, and then follow the wizard to install it from the Eclipse Marketplace and restart Eclipse.

172- ![Install E(fx)clipse](./media/apache-spark-eclipse-tool-plugin/auto-install-efxclipse.png)
172+ ![Install missing plugin E(fx)clipse](./media/apache-spark-eclipse-tool-plugin/auto-install-efxclipse.png)

1. Open the Job View from the **Jobs** node. In the right pane, the **Spark Job View** tab displays all the applications that were run on the cluster. Select the name of the application for which you want to see more details.

176- ![Application details](./media/apache-spark-eclipse-tool-plugin/eclipse-view-job-logs.png)
176+ ![Apache Eclipse view job logs details](./media/apache-spark-eclipse-tool-plugin/eclipse-view-job-logs.png)

You can then take any of these actions:

@@ -183,7 +183,7 @@ You can perform various operations by using HDInsight Tools, including accessing
* Select the **Log** tab to view frequently used logs, including **Driver Stderr**, **Driver Stdout**, and **Directory Info**. (A short sketch of where the driver's output lands follows this list.)

186- ![Log details](./media/apache-spark-eclipse-tool-plugin/eclipse-job-log-info.png)
186+ ![Apache Spark Eclipse job log info](./media/apache-spark-eclipse-tool-plugin/eclipse-job-log-info.png)

* Open the Spark history UI and the Apache Hadoop YARN UI (at the application level) by selecting the hyperlinks at the top of the window.
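For orientation, a hedged sketch (not content from the article): the **Driver Stdout** and **Driver Stderr** entries in the **Log** tab map onto the driver program's standard output and standard error streams.

```scala
// Hypothetical illustration only. Anything the driver prints with println surfaces
// under "Driver Stdout"; anything written to standard error, including stack traces
// from driver-side exceptions, surfaces under "Driver Stderr".
object DriverLogSketch {
  def main(args: Array[String]): Unit = {
    println("This line would show up under Driver Stdout.")
    System.err.println("This line would show up under Driver Stderr.")
  }
}
```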

@@ -240,7 +240,7 @@ To resolve this error, you need [download the executable](https://public-repo-1.
1. The template adds sample code (**LogQuery**) under the **src** folder that you can run locally on your computer. (A minimal local-run sketch follows these steps.)

243- ![Location of LogQuery](./media/apache-spark-eclipse-tool-plugin/local-scala-application.png)
243+ ![Location of LogQuery local scala application](./media/apache-spark-eclipse-tool-plugin/local-scala-application.png)

1. Right-click the **LogQuery** application, point to **Run As**, and then select **1 Scala Application**. Output like this appears on the **Console** tab:
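Neither the LogQuery source nor its console output is shown in this diff hunk. As a hedged sketch of what a locally runnable Spark Scala application looks like (all names and data below are placeholders), a `local[*]` master runs the job entirely in the development machine's JVM, which is what **Run As** > **1 Scala Application** exercises:

```scala
// Hypothetical local-run sketch; this is not the LogQuery sample itself.
import org.apache.spark.{SparkConf, SparkContext}

object LocalRunSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LocalRunSketch")
      .setMaster("local[*]")   // run in-process on the local machine, no cluster needed
    val sc = new SparkContext(conf)

    // A tiny computation whose result prints to the Eclipse Console tab.
    val counts = sc.parallelize(Seq("GET", "POST", "GET")).countByValue()
    println(counts)

    sc.stop()
  }
}
```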

@@ -303,7 +303,7 @@ When users submit job to a cluster with reader-only role permission, Ambari cred
When you link a cluster, I would suggest that you provide the storage credential.

306- ![Interactive sign-in](./media/apache-spark-eclipse-tool-plugin/link-cluster-with-storage-credential-eclipse.png)
306+ ![link cluster with storage credential eclipse](./media/apache-spark-eclipse-tool-plugin/link-cluster-with-storage-credential-eclipse.png)

There are two modes for submitting jobs. If a storage credential is provided, batch mode is used to submit the job. Otherwise, interactive mode is used. If the cluster is busy, you might get the error below.
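As I understand it, batch mode appears to correspond to posting the job to the cluster's Apache Livy batch endpoint. A rough, hedged sketch of what such a submission amounts to; the cluster name, credentials, JAR path, and class name below are placeholders, not values from this article:

```scala
// Hedged sketch, not the toolkit's implementation: submit a prebuilt JAR as a
// Livy batch job over HTTPS with basic authentication. All names are placeholders.
import java.io.OutputStreamWriter
import java.net.{HttpURLConnection, URL}
import java.util.Base64
import scala.io.Source

object LivyBatchSketch {
  def main(args: Array[String]): Unit = {
    val cluster  = "mycluster"                              // placeholder cluster name
    val user     = "admin"                                  // cluster login user
    val password = sys.env.getOrElse("CLUSTER_PASSWORD", "")
    val auth     = Base64.getEncoder.encodeToString(s"$user:$password".getBytes("UTF-8"))

    // Assumes the application JAR was already uploaded to the cluster's default storage.
    val payload = """{"file": "wasbs:///example/jars/myApp.jar", "className": "MyClusterApp"}"""

    val conn = new URL(s"https://$cluster.azurehdinsight.net/livy/batches")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setDoOutput(true)
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setRequestProperty("X-Requested-By", user)
    conn.setRequestProperty("Authorization", s"Basic $auth")

    val writer = new OutputStreamWriter(conn.getOutputStream, "UTF-8")
    writer.write(payload)
    writer.close()

    println(s"HTTP ${conn.getResponseCode}")                      // 201 expected on success
    println(Source.fromInputStream(conn.getInputStream).mkString) // Livy batch descriptor JSON
  }
}
```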