3. Select **Install** for the Scala plugin that is featured in the new window.
|Project SDK| This might be blank on your first use of IDEA. Select **New...** and navigate to your JDK.|
|Spark Version|The creation wizard integrates the proper version for the Spark SDK and Scala SDK. If the Spark cluster version is earlier than 2.0, select **Spark 1.x**. Otherwise, select **Spark 2.x**. This example uses **Spark 2.3.0 (Scala 2.11.8)**.|
7. Select **Finish**. It may take a few minutes before the project becomes available.
c. Select **Cancel** after viewing the artifact.
9. Add your application source code by doing the following:
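The application source added in this step is not shown in this excerpt. As a minimal sketch of the kind of Spark Scala application this walkthrough builds (the `wasbs:///` sample path and the column filter below are assumptions modeled on the HVAC sample data that ships with HDInsight clusters, not values from this article):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object myApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("myApp")
    val sc = new SparkContext(conf)

    // Read the HVAC sample CSV that ships with HDInsight clusters (assumed path).
    val rdd = sc.textFile("wasbs:///HdiSamples/HdiSamples/SensorSampleData/hvac/HVAC.csv")

    // Keep only the rows whose seventh column holds a single-digit value.
    val rdd1 = rdd.filter(s => s.split(",")(6).length() == 1)

    rdd1.saveAsTextFile("wasbs:///HVACOut")
  }
}
```

The sketch deliberately sets no master URL: it's intended to run on the cluster, where the toolkit's submission dialog (or `spark-submit`) supplies one.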
### Sign in to your Azure subscription
1. From the menu bar, navigate to **View** > **Tool Windows** > **Azure Explorer**.
3. In the **Azure Sign In** dialog box, choose **Device Login**, and then select **Sign in**.
4. In the **Azure Device Login** dialog box, click **Copy&Open**.
5. In the browser interface, paste the code, and then click **Next**.
6. Enter your Azure credentials, and then close the browser.
7. After you're signed in, the **Select Subscriptions** dialog box lists all the Azure subscriptions that are associated with the credentials. Select your subscription and then select the **Select** button.
You can link an HDInsight cluster by using the Apache Ambari managed username. Similarly, for a domain-joined HDInsight cluster, you can link by using the domain and username, such as `[email protected]`. You can also link a Livy Service cluster.
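When a Livy Service cluster is linked, job submission ultimately goes through Livy's batch REST API. As a rough illustration (the jar path and class name below are placeholders, not values from this article), a batch submission payload resembles:

```json
{
  "file": "wasbs:///example/jars/myApp.jar",
  "className": "myApp",
  "args": []
}
```

This is posted to the cluster's Livy batches endpoint (on HDInsight, typically `https://<clustername>.azurehdinsight.net/livy/batches`) with Basic authentication; the exact URL depends on how the Livy service is exposed.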
1. From the menu bar, navigate to **View** > **Tool Windows** > **Azure Explorer**.
1. From Azure Explorer, right-click the **HDInsight** node, and then select **Link A Cluster**.
1. The available options in the **Link A Cluster** window will vary depending on which value you select from the **Link Resource Type** drop-down list. Enter your values and then select **OK**.
**HDInsight Cluster**
|Authentication Type| Leave as **Basic Authentication**|
|User Name| Enter the cluster user name; the default is admin.|
After creating a Scala application, you can submit it to the cluster.
1. From Project, navigate to **myApp** > **src** > **main** > **scala** > **myApp**. Right-click **myApp**, and select **Submit Spark Application** (it will likely be located at the bottom of the list).
2. In the **Submit Spark Application** dialog window, select **1. Spark on HDInsight**.
4. Select **SparkJobRun** to submit your project to the selected cluster. The **Remote Spark Job in Cluster** tab displays the job execution progress at the bottom. You can stop the application by clicking the red button. To learn how to access the job output, see the "Access and manage HDInsight Spark clusters by using Azure Toolkit for IntelliJ" section later in this article.
2. In the right pane, the **Spark Job View** tab displays all the applications that were run on the cluster. Select the name of the application for which you want to see more details.
3. To display basic running job information, hover over the job graph. To view the stages graph and information that every job generates, select a node on the job graph.
5. You can also view the Spark history UI and the YARN UI (at the application level) by selecting a link at the top of the window.
7. Two dialog boxes may then be displayed, asking whether you want to auto-fix dependencies. If so, select **Auto Fix**.
8. The console should look similar to the picture below. In the console window, type `sc.appName`, and then press Ctrl+Enter. The result is displayed. You can terminate the local console by clicking the red button.
### Link cluster from context menu
1. Sign in with the reader-only role account.
2. From **Azure Explorer**, expand **HDInsight** to view the HDInsight clusters that are in your subscription. The clusters marked **"Role:Reader"** have only reader-only role permission.
3. Right-click the cluster that has reader-only role permission. Select **Link this cluster** from the context menu to link the cluster. Enter the Ambari user name and password.
4. If the cluster is linked successfully, HDInsight will be refreshed.
The state of the cluster becomes linked.
### Link cluster from Run/Debug Configurations window
1. Create an HDInsight Configuration. Then select **Remotely Run in Cluster**.
2. Select a cluster that has reader-only role permission for **Spark clusters (Linux only)**. A warning message appears. You can click **Link this cluster** to link the cluster.
* For clusters with reader-only role permission, click the **Storage Accounts** node. The **Storage Access Denied** window pops up. You can click **Open Azure Storage Explorer** to open Storage Explorer.
* For linked clusters, click the **Storage Accounts** node. The **Storage Access Denied** window pops up. You can click **Open Azure Storage** to open Storage Explorer.
## Convert existing IntelliJ IDEA applications to use Azure Toolkit for IntelliJ
1. For an existing Spark Scala application that was created through IntelliJ IDEA, open the associated .iml file.
2. At the root level is a **module** element like the following:
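The element itself is not shown in this excerpt. As a sketch, the root **module** element of a typical IDEA-generated .iml file looks roughly like this (attribute values vary by project, and the `UniqueKey="HDInsightTool"` attribute shown is an assumed marker the toolkit checks for, added as the conversion change):

```xml
<module org.jetbrains.idea.maven.project.MavenProjectsManager.isMavenModule="true"
        type="JAVA_MODULE" version="4" UniqueKey="HDInsightTool">
  <!-- existing component elements stay unchanged -->
</module>
```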
3. Save the changes. Your application should now be compatible with Azure Toolkit for IntelliJ. You can test it by right-clicking the project name in Project. The pop-up menu now has the option **Submit Spark Application to HDInsight**.