Commit 9a9b443

Author: dksimpson
Merge branch 'release-functions-ux-update' of https://github.com/MicrosoftDocs/azure-docs-pr into release-functions-ux-update-dks-14
2 parents: 0d8cca9 + c792fb4

11 files changed: +58 -97 lines

articles/governance/policy/how-to/guest-configuration-create.md

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ To learn about creating Guest Configuration policies for Linux, see the page

  When auditing Windows, Guest Configuration uses a
  [Desired State Configuration](/powershell/scripting/dsc/overview/overview) (DSC) resource module to
- and configuration file. The DSC configuration defines the condition that the machine should be in.
+ create the configuration file. The DSC configuration defines the condition that the machine should be in.
  If the evaluation of the configuration fails, the policy effect **auditIfNotExists** is triggered
  and the machine is considered **non-compliant**.
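For reference, the configuration file mentioned in this change is an ordinary PowerShell DSC document that gets compiled into a MOF for the audit. A minimal sketch of what such a configuration might look like; the configuration name, resource, and audited service below are illustrative assumptions, not taken from the commit:

```powershell
# Illustrative DSC configuration that audits whether a Windows service is running.
# The names below (AuditExampleService, Spooler) are placeholders for this sketch.
Configuration AuditExampleService
{
    Import-DscResource -ModuleName 'PSDscResources'

    Node 'localhost'
    {
        Service 'ExampleServiceRunning'
        {
            Name  = 'Spooler'
            State = 'Running'
        }
    }
}

# Compiling the configuration produces the .mof file that Guest Configuration evaluates;
# if the machine doesn't match the declared state, auditIfNotExists marks it non-compliant.
AuditExampleService
```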

articles/hdinsight/hadoop/apache-hadoop-using-apache-hive-as-an-etl-tool.md

Lines changed: 15 additions & 17 deletions
@@ -1,35 +1,35 @@
  ---
  title: Using Apache Hive as an ETL Tool - Azure HDInsight
  description: Use Apache Hive to extract, transform, and load (ETL) data in Azure HDInsight.
- ms.service: hdinsight
  author: ashishthaps
  ms.author: ashishth
  ms.reviewer: jasonh
- ms.custom: hdinsightactive
+ ms.service: hdinsight
  ms.topic: conceptual
- ms.date: 11/22/2019
+ ms.custom: hdinsightactive,seoapr2020
+ ms.date: 04/28/2020
  ---

  # Use Apache Hive as an Extract, Transform, and Load (ETL) tool

- You typically need to clean and transform incoming data before loading it into a destination suitable for analytics. Extract, Transform, and Load (ETL) operations are used to prepare data and load it into a data destination. Apache Hive on HDInsight can read in unstructured data, process the data as needed, and then load the data into a relational data warehouse for decision support systems. In this approach, data is extracted from the source and stored in scalable storage, such as Azure Storage blobs or Azure Data Lake Storage. The data is then transformed using a sequence of Hive queries and is finally staged within Hive in preparation for bulk loading into the destination data store.
+ You typically need to clean and transform incoming data before loading it into a destination suitable for analytics. Extract, Transform, and Load (ETL) operations are used to prepare data and load it into a data destination. Apache Hive on HDInsight can read in unstructured data, process the data as needed, and then load the data into a relational data warehouse for decision support systems. In this approach, data is extracted from the source. Then stored in adaptable storage, such as Azure Storage blobs or Azure Data Lake Storage. The data is then transformed using a sequence of Hive queries. Then staged within Hive in preparation for bulk loading into the destination data store.

  ## Use case and model overview

- The following figure shows an overview of the use case and model for ETL automation. Input data is transformed to generate the appropriate output. During that transformation, the data can change shape, data type, and even language. ETL processes can convert Imperial to metric, change time zones, and improve precision to properly align with existing data in the destination. ETL processes can also combine new data with existing data to keep reporting up to date, or to provide further insight into existing data. Applications such as reporting tools and services can then consume this data in the desired format.
+ The following figure shows an overview of the use case and model for ETL automation. Input data is transformed to generate the appropriate output. During that transformation, the data changes shape, data type, and even language. ETL processes can convert Imperial to metric, change time zones, and improve precision to properly align with existing data in the destination. ETL processes can also combine new data with existing data to keep reporting up to date, or to provide further insight into existing data. Applications such as reporting tools and services can then consume this data in the wanted format.

  ![Apache Hive as ETL architecture](./media/apache-hadoop-using-apache-hive-as-an-etl-tool/hdinsight-etl-architecture.png)

- Hadoop is typically used in ETL processes that import either a massive number of text files (like CSVs) or a smaller but frequently changing number of text files, or both. Hive is a great tool to use to prepare the data before loading it into the data destination. Hive allows you to create a schema over the CSV and use a SQL-like language to generate MapReduce programs that interact with the data.
+ Hadoop is typically used in ETL processes that import either a massive number of text files (like CSVs). Or a smaller but frequently changing number of text files, or both. Hive is a great tool to use to prepare the data before loading it into the data destination. Hive allows you to create a schema over the CSV and use a SQL-like language to generate MapReduce programs that interact with the data.

- The typical steps to using Hive to perform ETL are as follows:
+ The typical steps to using Hive to do ETL are as follows:

  1. Load data into Azure Data Lake Storage or Azure Blob Storage.
  2. Create a Metadata Store database (using Azure SQL Database) for use by Hive in storing your schemas.
  3. Create an HDInsight cluster and connect the data store.
  4. Define the schema to apply at read-time over data in the data store:

-    ```
+    ```hql
     DROP TABLE IF EXISTS hvac;

     --create the hvac table on comma-separated sensor data stored in Azure Storage blobs

@@ -61,30 +61,28 @@ Data sources are typically external data that can be matched to existing data in

  ## Output targets

- You can use Hive to output data to a variety of targets including:
+ You can use Hive to output data to different kinds of targets including:

  * A relational database, such as SQL Server or Azure SQL Database.
  * A data warehouse, such as Azure SQL Data Warehouse.
  * Excel.
  * Azure table and blob storage.
  * Applications or services that require data to be processed into specific formats, or as files that contain specific types of information structure.
- * A JSON Document Store like [Azure Cosmos DB](https://azure.microsoft.com/services/cosmos-db/).
+ * A JSON Document Store like Azure Cosmos DB.

  ## Considerations

  The ETL model is typically used when you want to:

- * Load stream data or large volumes of semi-structured or unstructured data from external sources into an existing database or information system.
- * Clean, transform, and validate the data before loading it, perhaps by using more than one transformation pass through the cluster.
- * Generate reports and visualizations that are regularly updated. For example, if the report takes too long to generate during the day, you can schedule the report to run at night. To automatically run a Hive query, you can use [Azure Logic Apps](../../logic-apps/logic-apps-overview.md) and PowerShell.
+ * Load stream data or large volumes of semi-structured or unstructured data from external sources into an existing database or information system.
+ * Clean, transform, and validate the data before loading it, perhaps by using more than one transformation pass through the cluster.
+ * Generate reports and visualizations that are regularly updated. For example, if the report takes too long to generate during the day, you can schedule the report to run at night. To automatically run a Hive query, you can use [Azure Logic Apps](../../logic-apps/logic-apps-overview.md) and PowerShell.

  If the target for the data isn't a database, you can generate a file in the appropriate format within the query, for example a CSV. This file can then be imported into Excel or Power BI.

- If you need to execute several operations on the data as part of the ETL process, consider how you manage them. If the operations are controlled by an external program, rather than as a workflow within the solution, you need to decide whether some operations can be executed in parallel, and to detect when each job completes. Using a workflow mechanism such as Oozie within Hadoop may be easier than trying to orchestrate a sequence of operations using external scripts or custom programs. For more information about Oozie, see [Workflow and job orchestration](https://msdn.microsoft.com/library/dn749829.aspx).
+ If you need to execute several operations on the data as part of the ETL process, consider how you manage them. With operations controlled by an external program, rather than as a workflow within the solution, decide whether some operations can be executed in parallel. And to detect when each job completes. Using a workflow mechanism such as Oozie within Hadoop may be easier than trying to orchestrate a sequence of operations using external scripts or custom programs.

  ## Next steps

  * [ETL at scale](apache-hadoop-etl-at-scale.md)
- * [Operationalize a data pipeline](../hdinsight-operationalize-data-pipeline.md)
-
- <!-- * [ETL Deep Dive](../hdinsight-etl-deep-dive.md) -->
+ * [Operationalize a data pipeline](../hdinsight-operationalize-data-pipeline.md)
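As an aside on step 1 in the first hunk above (load data into Azure Blob Storage), a single Azure PowerShell upload is often enough for sample data like the hvac CSV. A minimal sketch, where the storage account name, key, container, and file paths are placeholder assumptions:

```powershell
# Illustrative upload of a local CSV of sensor data to a blob container that an
# HDInsight cluster can read over wasbs://. All names and paths are placeholders.
$context = New-AzStorageContext `
    -StorageAccountName "mystorageaccount" `
    -StorageAccountKey "<storage-account-key>"

Set-AzStorageBlobContent `
    -File ".\hvac.csv" `
    -Container "hivedata" `
    -Blob "hvac/hvac.csv" `
    -Context $context
```

The external Hive table defined in step 4 can then point its LOCATION at that container.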

articles/mysql/howto-alert-on-metric.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
  ---
  title: Configure metric alerts - Azure portal - Azure Database for MySQL
  description: This article describes how to configure and access metric alerts for Azure Database for MySQL from the Azure portal.
- author: rachel-msft
- ms.author: raagyema
+ author: ajlam
+ ms.author: andrela
  ms.service: mysql
  ms.topic: conceptual
  ms.date: 3/18/2020

articles/mysql/howto-database-threat-protection-portal.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
  ---
  title: Advanced Threat Protection - Azure portal - Azure Database for MySQL
  description: Learn how to configure Advanced Threat Protection to detect anomalous database activities indicating potential security threats to the database.
- author: bolzmj
- ms.author: mbolz
+ author: ajlam
+ ms.author: andrela
  ms.service: mysql
  ms.topic: conceptual
  ms.date: 3/18/2020

articles/mysql/howto-manage-vnet-using-cli.md

Lines changed: 2 additions & 3 deletions
@@ -1,9 +1,8 @@
  ---
  title: Manage VNet endpoints - Azure CLI - Azure Database for MySQL
  description: This article describes how to create and manage Azure Database for MySQL VNet service endpoints and rules using Azure CLI command line.
- author: bolzmj
- ms.author: mbolz
- manager: jhubbard
+ author: kummanish
+ ms.author: manishku
  ms.service: mysql
  ms.devlang: azurecli
  ms.topic: conceptual

articles/mysql/howto-manage-vnet-using-portal.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
  ---
  title: Manage VNet endpoints - Azure portal - Azure Database for MySQL
  description: Create and manage Azure Database for MySQL VNet service endpoints and rules using the Azure portal
- author: bolzmj
- ms.author: mbolz
+ author: kummanish
+ ms.author: manishku
  ms.service: mysql
  ms.topic: conceptual
  ms.date: 3/18/2020

articles/postgresql/howto-manage-vnet-using-cli.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
  ---
  title: Use virtual network rules - Azure CLI - Azure Database for PostgreSQL - Single Server
  description: This article describes how to create and manage VNet service endpoints and rules for Azure Database for PostgreSQL using Azure CLI command line.
- author: bolzmj
- ms.author: mbolz
+ author: rachel-msft
+ ms.author: raagyema
  ms.service: postgresql
  ms.devlang: azurecli
  ms.topic: conceptual

articles/postgresql/howto-manage-vnet-using-portal.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
  ---
  title: Use virtual network rules - Azure portal - Azure Database for PostgreSQL - Single Server
  description: Create and manage VNet service endpoints and rules Azure Database for PostgreSQL - Single Server using the Azure portal
- author: bolzmj
- ms.author: mbolz
+ author: rachel-msft
+ ms.author: raagyema
  ms.service: postgresql
  ms.topic: conceptual
  ms.date: 5/6/2019
Lines changed: 26 additions & 64 deletions
@@ -1,17 +1,16 @@
  ---
- title: Set up Azure multi-factor authentication for Windows Virtual Desktop - Azure
- description: How to set up Azure multi-factor authentication for increased security in Windows Virtual Desktop.
+ title: Set up Azure Multi-Factor Authentication for Windows Virtual Desktop - Azure
+ description: How to set up Azure Multi-Factor Authentication for increased security in Windows Virtual Desktop.
  services: virtual-desktop
  author: Heidilohr

  ms.service: virtual-desktop
  ms.topic: conceptual
- ms.date: 04/01/2020
+ ms.date: 04/22/2020
  ms.author: helohr
  manager: lizross
  ---
-
- # Set up Azure Multi-Factor Authentication
+ # Enable Azure Multi-Factor Authentication for Windows Virtual Desktop

  The Windows client for Windows Virtual Desktop is an excellent option for integrating Windows Virtual Desktop with your local machine. However, when you configure your Windows Virtual Desktop account into the Windows Client, there are certain measures you'll need to take to keep yourself and your users safe.

@@ -23,71 +22,34 @@ While remembering credentials is convenient, it can also make deployments on Ent

  Here's what you'll need to get started:

- - Assign all your users one of the following licenses:
-   - Microsoft 365 E3 or E5
-   - Azure Active Directory Premium P1 or P2
-   - Enterprise Mobility + Security E3 or E5
+ - Assign users a license that includes Azure Active Directory Premium P1 or P2.
  - An Azure Active Directory group with your users assigned as group members.
  - Enable Azure MFA for all your users. For more information about how to do that, see [How to require two-step verification for a user](../active-directory/authentication/howto-mfa-userstates.md#view-the-status-for-a-user).

- >[!NOTE]
- >The following setting also applies to the [Windows Virtual Desktop web client](https://rdweb.wvd.microsoft.com/webclient/index.html).
-
- ## Opt in to the Conditional Access policy
-
- 1. Open **Azure Active Directory**.
-
- 2. Go to the **All applications** tab. In the "Application type" drop-down menu, select **Enterprise Applications**, then search for **Windows Virtual Desktop Client**.
-
-    ![A screenshot of the All applications tab. The user entered "windows virtual desktop client" into the search bar, and the app has shown up in the search results.](media/all-applications-search.png)
-
- 3. Select **Conditional Access**.
-
-    ![A screenshot showing the user hovering their mouse cursor over the Conditional Access tab.](media/conditional-access-location.png)
-
- 4. Select **+ New policy**.
-
-    ![A screenshot of the Conditional Access page. The user is hovering their mouse cursor over the new policy button.](media/new-policy-button.png)
-
- 5. Enter a **name** for the **rule**, then **select** the *name of the **group** you created in the prerequisites.
-
- 6. Select **Select**, then select **Done**.
-
- 7. Next, open **Cloud Apps or actions**.
-
- 8. On the **Select** panel, select the **Windows Virtual Desktop** Enterprise app.
-
-    ![A screenshot of the Cloud apps or actions page. The user has selected the Windows Virtual Desktop app by selecting the check mark next to it. The selected app is highlighted in red.](media/cloud-apps-select.png)
-
-    >[!NOTE]
-    >You should also see the Windows Virtual Desktop Client app selected on the left side of the screen, as shown in the following image. You need both the Windows Virtual Desktop and Windows Virtual Desktop Client Enterprise apps for the policy to work.
-    >
-    > ![A screenshot of the Cloud apps or actions page. The Windows Virtual Desktop and Windows Virtual Desktop Client apps are highlighted in red.](media/cloud-apps-enterprise-selected.png)
-
- 9. Select **Select**
-
- 10. Next, open **Grant**
-
- 11. Select **Require multi-factor authentication**, then select **Require one of the selected controls**.
-
-     ![A screenshot of the Grant page. "Require multi-factor authentication" is selected.](media/grant-page.png)
-
-     >[!NOTE]
-     >If you have MDM-enrolled devices in your organization and don't want them to show the MFA prompt, you can also select **Require device to be marked as compliant**.
+ > [!NOTE]
+ > The following setting also applies to the [Windows Virtual Desktop web client](https://rdweb.wvd.microsoft.com/webclient/index.html).

- 12. Select **Session**.
+ ## Create a Conditional Access policy

- 13. Set the **Sign-in frequency** to **Active**, then change its value to **1 Hours**.
+ This section will show you how to create a Conditional Access policy that requires multi-factor authentication when connecting to Windows Virtual Desktop.

-     ![A screenshot of the Session page. The session menu shows the sign-in frequency drop-down menus have been changed to "1" and "Hours."](media/sign-in-frequency.png)
-
-     >[!NOTE]
-     >Active sessions in your Windows Virtual Desktop environment will continue to work as you change the policy. However, if you disconnect or sign off, you'll need to provide your credentials again after 60 minutes. As you change the settings, you can extend the timeout period as much as you want (as long as it aligns with your organization's security policy).
-     >
-     >The default setting is a rolling window of 90 days, which means the client will ask users to sign in again when they try to access a resource after being inactive on their machine for 90 days or longer.
+ 1. Sign in to the **Azure portal** as a global administrator, security administrator, or Conditional Access administrator.
+ 1. Browse to **Azure Active Directory** > **Security** > **Conditional Access**.
+ 1. Select **New policy**.
+ 1. Give your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.
+ 1. Under **Assignments**, select **Users and groups**.
+ 1. Under **Include**, select **Select users and groups** > **Users and groups** > Choose the group created in the prerequisites stage.
+ 1. Select **Done**.
+ 1. Under **Cloud apps or actions** > **Include**, select **Select apps**.
+ 1. Choose **Windows Virtual Desktop** and **Windows Virtual Desktop Client**, and select **Select** then **Done**.
+    ![A screenshot of the Cloud apps or actions page. The Windows Virtual Desktop and Windows Virtual Desktop Client apps are highlighted in red.](media/cloud-apps-enterprise-selected.png)
+ 1. Under **Access controls** > **Grant**, select **Grant access**, **Require multi-factor authentication**, and then **Select**.
+ 1. Under **Access controls** > **Session**, select **Sign-in frequency**, set the value to **1** and the unit to **Hours**, and then **Select**.
+ 1. Confirm your settings and set **Enable policy** to **On**.
+ 1. Select **Create** to enable your policy.

- 14. Enable the policy.
+ ## Next steps

- 15. Select **Create** to confirm the policy.
+ - [Learn more about Conditional Access policies](../active-directory/conditional-access/concept-conditional-access-policies.md)

- You're all done! Feel free to test the policy to make sure your allow list works as intended.
+ - [Learn more about user sign in frequency](../active-directory/conditional-access/howto-conditional-access-session-lifetime.md#user-sign-in-frequency)

articles/virtual-machines/sizes-gpu.md

Lines changed: 3 additions & 1 deletion
@@ -31,12 +31,14 @@ GPU optimized VM sizes are specialized virtual machines available with single or

  ## Supported operating systems and drivers

- To take advantage of the GPU capabilities of Azure N-series VMs, NVIDIA GPU drivers must be installed.
+ To take advantage of the GPU capabilities of Azure N-series VMs, NVIDIA or AMD GPU drivers must be installed.

  The [NVIDIA GPU Driver Extension](/azure/virtual-machines/extensions/hpccompute-gpu-windows) installs appropriate NVIDIA CUDA or GRID drivers on an N-series VM. Install or manage the extension using the Azure portal or tools such as Azure PowerShell or Azure Resource Manager templates. See the [NVIDIA GPU Driver Extension documentation](/azure/virtual-machines/extensions/hpccompute-gpu-windows) for supported operating systems and deployment steps. For general information about VM extensions, see [Azure virtual machine extensions and features](/azure/virtual-machines/extensions/overview).

  If you choose to install NVIDIA GPU drivers manually, see [N-series GPU driver setup for Windows](/azure/virtual-machines/windows/n-series-driver-setup) or [N-series GPU driver setup for Linux](/azure/virtual-machines/linux/n-series-driver-setup) for supported operating systems, drivers, installation, and verification steps.

+ To manually install the AMD GPU drivers, see [N-series AMD GPU driver setup for Windows](/azure/virtual-machines/windows/n-series-amd-driver-setup) for supported operating systems, drivers, installation, and verification steps.
+
  ## Deployment considerations

  - For availability of N-series VMs, see [Products available by region](https://azure.microsoft.com/regions/services/).
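As a side note on the install paths mentioned above (Azure portal, Azure PowerShell, or Resource Manager templates), deploying the NVIDIA GPU Driver Extension from Azure PowerShell is a single cmdlet call. A minimal sketch; the resource group, VM name, location, and version are placeholders, and the publisher and extension type identifiers are assumptions to verify against the linked extension documentation:

```powershell
# Illustrative Azure PowerShell deployment of the NVIDIA GPU Driver Extension to a
# Windows N-series VM. Resource group, VM name, location, and version are placeholders;
# confirm the Publisher/ExtensionType values against the extension documentation.
Set-AzVMExtension `
    -ResourceGroupName "myResourceGroup" `
    -VMName "myNSeriesVM" `
    -Location "eastus" `
    -Name "NvidiaGpuDriverWindows" `
    -Publisher "Microsoft.HpcCompute" `
    -ExtensionType "NvidiaGpuDriverWindows" `
    -TypeHandlerVersion "1.3"
```

The same extension resource can be declared in an Azure Resource Manager template if you prefer to keep driver installation in your deployment templates.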
