articles/active-directory/conditional-access/howto-baseline-protect-azure.md (2 additions, 2 deletions)
@@ -47,7 +47,7 @@ If the CLI can open your default browser, it will do so and load a sign-in page.
 ## Deployment considerations
 
-Because the **Require MFA for service management** policy applies to all Azure Resource Manager users, several considerations need to be made to ensure a smooth deployment. These considerations include identifying users and service principals in Azure AD that cannot or should not perform MFA, as well as applications and clients used by your organization that do not support modern authentication.
+The **Require MFA for service management** policy applies to all Azure Resource Manager users.
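The removed line above mentions identifying users and service principals in Azure AD that cannot or should not perform MFA. Purely as a hedged illustration of how that inventory could be gathered, the sketch below shells out to the Azure CLI; the `az` commands are standard, but the helper and the JMESPath projections are illustrative and not part of the article.

```python
# Illustrative helper for the deployment considerations above: list Azure AD
# service principals and users so you can decide which ones to exclude from the
# "Require MFA for service management" policy. Requires an installed and
# signed-in Azure CLI ("az"); the query projections are examples only.
import json
import subprocess

def az(args):
    """Run an Azure CLI command and return its parsed JSON output."""
    out = subprocess.run(["az", *args, "--output", "json"],
                         check=True, capture_output=True, text=True)
    return json.loads(out.stdout)

# Service principals cannot satisfy an MFA prompt, so they are candidates for
# exclusion (or for migration to managed identities).
service_principals = az(["ad", "sp", "list", "--all",
                         "--query", "[].{name:displayName, appId:appId}"])

# Users who will be covered by the policy.
users = az(["ad", "user", "list",
            "--query", "[].{name:displayName, upn:userPrincipalName}"])

print(f"{len(service_principals)} service principals, {len(users)} users to review")
```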
+summary: Learn how to use Chef Software products to deploy and manage Azure resources.
 metadata:
-  title: Chef Software on Azure documentation - Tutorials, samples, reference, and resources
-  description: Chef Software provides a DevOps automation platform for Linux and Windows that enables the management of both physical and virtual server configurations. Learn how to use Chef Software products to automate infrastructure as code (IaC) recipes and manage and deploy virtual machines on Azure with quickstarts and tutorials.
+  title: Chef Software on Azure documentation
+  description: Learn how to use Chef Software products to deploy and manage Azure resources.
-aside:
-  image:
-    alt: Chef logo
-    height: 133
-    width: 48
-    src: media/chef.png
-  title:
-sections:
-- title: 5-Minute Quickstarts
-  items:
-  - type: paragraph
-    text: Install and configure Chef, and use it to create a Linux virtual machine in Azure.
-  - type: list
-    style: icon48
-    items:
-    - image:
-        src: /azure/media/index/azure_dev-9.svg
-      text: Install and configure Chef
-      href: ./chef-extension-portal.md
-- title: Step-by-Step Tutorials
-  items:
-  - type: paragraph
-    text: Learn how to use Chef Software tools to create and manage Azure compute and networking infrastructure.
-  - type: list
-    style: unordered
-    items:
-    - html: <a href="/azure/virtual-machines/windows/chef-automation">Automate Azure Virtual Machine deployment with Chef</a>.
articles/hdinsight/hdinsight-for-vscode.md (28 additions, 34 deletions)
@@ -15,7 +15,6 @@ Learn how to use Spark & Hive Tools for Visual Studio Code to create and submit
 Spark & Hive Tools can be installed on platforms that are supported by Visual Studio Code, which include Windows, Linux, and macOS. Note the following prerequisites for different platforms.
 
-
 ## Prerequisites
 
 The following items are required for completing the steps in this article:
@@ -42,7 +41,6 @@ After you meet the prerequisites, you can install Spark & Hive Tools for Visual
 5. Select **Reload** when necessary.
 
-
 ## Open a work folder
 
 To open a work folder and to create a file in Visual Studio Code, follow these steps:
@@ -51,14 +49,14 @@ To open a work folder and to create a file in Visual Studio Code, follow these s
 2. In **Explorer** view, select the **HDexample** folder, and then select the **New File** icon next to the work folder:
 
    
 
 3. Name the new file by using either the `.hql` (Hive queries) or the `.py` (Spark script) file extension. This example uses **HelloWorld.hql**.
 
 ## Set the Azure environment
 
 For a national cloud user, follow these steps to set the Azure environment first, and then use the **Azure: Sign In** command to sign in to Azure:
-
+
 1. Select **File\Preferences\Settings**.
 2. Search on the following string: **Azure: Cloud**
 3. Select the national cloud from the list:
@@ -74,7 +72,6 @@ Before you can submit scripts to your clusters from Visual Studio Code, you must
    
 
 2. Follow the sign-in instructions to sign in to Azure. After you're connected, your Azure account name is shown on the status bar at the bottom of the Visual Studio Code window.
-
 
 ## Link a cluster
@@ -84,7 +81,7 @@ You can link a normal cluster by using an [Apache Ambari](https://ambari.apache.
 1. From the menu bar, go to **View** > **Command Palette**, and enter **Spark / Hive: Link a Cluster**.

 ## Submit interactive Hive queries and Hive batch scripts
 
 With Spark & Hive Tools for Visual Studio Code, you can submit interactive Hive queries and Hive batch scripts to your clusters.
@@ -151,7 +146,6 @@ With Spark & Hive Tools for Visual Studio Code, you can submit interactive Hive
 2. Select the **HelloWorld.hql** file that was created [earlier](#open-a-work-folder). It opens in the script editor.
 
-
 3. Copy and paste the following code into your Hive file, and then save it:
 
    ```hiveql
@@ -164,7 +158,7 @@ With Spark & Hive Tools for Visual Studio Code, you can submit interactive Hive
 6. If you haven't specified a default cluster, select a cluster. The tools also let you submit a block of code instead of the whole script file by using the context menu. After a few moments, the query results appear in a new tab:
 
    The submission status appears on the left of the lower status bar when you're running queries. Don't submit other queries when the status is **PySpark Kernel (busy)**.
 
-   > [!NOTE]
+   > [!NOTE]
    >
    > When **Python Extension Enabled** is cleared in the settings (it's selected by default), the submitted pyspark interaction results will use the old window:
 
    
 
 After you submit a Python job, submission logs appear in the **OUTPUT** window in Visual Studio Code. The Spark UI URL and Yarn UI URL are also shown. You can open the URL in a web browser to track the job status.
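For reference, the Python job described above is an ordinary PySpark script. The sketch below is a minimal assumed example (the storage path is a placeholder, not taken from the article); when submitted through Spark & Hive Tools, its output shows up in the **OUTPUT** window alongside the Spark UI and Yarn UI URLs.

```python
# Minimal PySpark script of the kind described above. When submitted from
# VS Code, its logs appear in the OUTPUT window and the Spark UI / Yarn UI URLs
# are printed for tracking. The storage path below is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("HelloWorldSample").getOrCreate()

# Read a sample CSV from cluster storage (placeholder path) and show row counts.
df = spark.read.csv("/example/data/sample.csv", header=True, inferSchema=True)
print(f"Rows read: {df.count()}")
df.groupBy(df.columns[0]).count().show(10)

spark.stop()
```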
@@ -266,27 +260,27 @@ After you submit a Python job, submission logs appear in the **OUTPUT** window i
 <a id="triggerlivyconf"></a>**How to trigger Livy configuration**
 
 Method 1
-1. From the menu bar, go to **File**>**Preferences**>**Settings**.
+1. From the menu bar, go to **File** > **Preferences** > **Settings**.
 2. In the **Search settings** box, enter **HDInsight Job Submission: Livy Conf**.
 3. Select **Edit in settings.json** for the relevant search result.
 
-Method 2
+Method 2
 Submit a file, and notice that the .vscode folder is automatically added to the work folder. You can see the Livy configuration by selecting **.vscode\settings.json**.
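The Livy configuration surfaced by Method 1 and Method 2 corresponds to the parameters of Livy's batch REST API. Purely as background, and not as the mechanism the extension itself uses, the sketch below posts a batch directly to a Livy endpoint; the cluster URL, credentials, and artifact path are placeholders.

```python
# Background sketch: the kinds of fields a Livy configuration can carry
# (driverMemory, executorMemory, numExecutors, conf, ...), shown as a direct
# POST to the Livy batches endpoint. Cluster URL and credentials are
# placeholders; the Spark & Hive Tools extension normally handles this for you.
import requests

livy_url = "https://<your-cluster>.azurehdinsight.net/livy/batches"  # placeholder

payload = {
    "file": "wasbs:///example/jars/spark-examples.jar",  # placeholder artifact
    "className": "org.apache.spark.examples.SparkPi",
    "driverMemory": "4g",
    "executorMemory": "4g",
    "numExecutors": 2,
    "conf": {"spark.dynamicAllocation.enabled": "false"},
}

resp = requests.post(
    livy_url,
    json=payload,
    headers={"X-Requested-By": "admin"},       # required by Livy for POSTs
    auth=("admin", "<cluster-password>"),      # cluster credentials, placeholder
)
resp.raise_for_status()
print("Livy batch id:", resp.json().get("id"))
```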
You can preview Hive Table in your clusters directly through the **Azure HDInsight** explorer:
@@ -352,7 +345,7 @@ You can preview Hive Table in your clusters directly through the **Azure HDInsig
 6. The **Preview Results** window opens:
 
    
 
-
+
    - RESULTS panel
 
 You can save the whole result as a CSV, JSON, or Excel file to a local path, or just select multiple lines.
@@ -373,7 +366,8 @@ Spark & Hive for Visual Studio Code also supports the following features:
 - **IntelliSense autocomplete**. Suggestions pop up for keywords, methods, variables, and other programming elements. Different icons represent different types of objects:
 
-  
+  
+
 - **IntelliSense error marker**. The language service underlines editing errors in the Hive script.
 - **Syntax highlights**. The language service uses different colors to differentiate variables, keywords, data types, functions, and other programming elements:
@@ -385,16 +379,16 @@ Users who are assigned the reader-only role for the cluster can no longer submit
 ### Browse the HDInsight cluster
 
-When you select the Azure HDInsight explorer to expand an HDInsight cluster, you're prompted to link the cluster if you have the reader-only role for the cluster. Use the following method to link to the cluster by using your Ambari credentials.
+When you select the Azure HDInsight explorer to expand an HDInsight cluster, you're prompted to link the cluster if you have the reader-only role for the cluster. Use the following method to link to the cluster by using your Ambari credentials.
 
 ### Submit the job to the HDInsight cluster
 
 When submitting a job to an HDInsight cluster, you're prompted to link the cluster if you're in the reader-only role for the cluster. Use the following steps to link to the cluster by using Ambari credentials.
 
 ### Link to the cluster
 
-1.Enter a valid Ambari username.
-2.Enter a valid password.
+1. Enter a valid Ambari username.
+2. Enter a valid password.
 
 
@@ -416,11 +410,11 @@ When you select the Azure HDInsight explorer to expand a Data Lake Storage Gen2
 When you submit a job to an HDInsight cluster by using Data Lake Storage Gen2, you're prompted to enter the storage access key if your Azure account has no write access to Gen2 storage. After the access key is validated, the job will be successfully submitted.
 
-
+
 
 > [!NOTE]
->
->You can get the access key for the storage account from the Azure portal. For more information, see [View and copy access keys](https://docs.microsoft.com/azure/storage/common/storage-account-manage#access-keys).
+>
+>You can get the access key for the storage account from the Azure portal. For more information, see [View and copy access keys](https://docs.microsoft.com/azure/storage/common/storage-account-manage#access-keys).
 
 ## Unlink cluster
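If you prefer not to copy the access key from the portal as the note above describes, the Azure CLI exposes the same keys. The sketch below is an assumed helper (storage account and resource group names are placeholders) that retrieves a key for pasting into the VS Code prompt.

```python
# Illustrative helper: fetch a storage account access key with the Azure CLI
# instead of copying it from the portal. Account and resource group names are
# placeholders; requires an authenticated "az" session with sufficient rights.
import json
import subprocess

result = subprocess.run(
    ["az", "storage", "account", "keys", "list",
     "--account-name", "<storage-account>",
     "--resource-group", "<resource-group>",
     "--output", "json"],
    check=True, capture_output=True, text=True,
)
keys = json.loads(result.stdout)
print("key1:", keys[0]["value"])  # paste this when VS Code prompts for the access key
```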
@@ -434,6 +428,6 @@ When you submit a job to an HDInsight cluster by using Data Lake Storage Gen2, y
 From the menu bar, go to **View** > **Command Palette**, and then enter **Azure: Sign Out**.
 
-
 ## Next steps
+
 For a video that demonstrates using Spark & Hive for Visual Studio Code, see [Spark & Hive for Visual Studio Code](https://go.microsoft.com/fwlink/?linkid=858706).