
Commit 4596bc4

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into whatsnew-item
2 parents 5c87f4f + 3eff095

25 files changed: +119 -51 lines

articles/automation/automation-windows-hrw-install.md

Lines changed: 25 additions & 11 deletions
@@ -6,7 +6,7 @@ ms.service: automation
 ms.subservice: process-automation
 author: mgoedtel
 ms.author: magoedte
-ms.date: 11/25/2019
+ms.date: 12/10/2019
 ms.topic: conceptual
 manager: carmonm
 ---
@@ -16,7 +16,13 @@ You can use the Hybrid Runbook Worker feature of Azure Automation to run runbook

 ## Installing the Windows Hybrid Runbook Worker

-To install and configure a Windows Hybrid Runbook Worker, you can use two methods. The recommended method is using an Automation runbook to completely automate the process of configuring a Windows computer. The second method is following a step-by-step procedure to manually install and configure the role.
+To install and configure a Windows Hybrid Runbook Worker, you can use one of the following three methods:
+
+* For Azure VMs, install the Log Analytics agent for Windows using the [virtual machine extension for Windows](../virtual-machines/extensions/oms-windows.md). The extension installs the Log Analytics agent on Azure virtual machines and enrolls them in an existing Log Analytics workspace, using an Azure Resource Manager template or PowerShell (see the sketch after this list). Once the agent is installed, the VM can be added to a Hybrid Runbook Worker group in your Automation account by following **step 4** under the [Manual deployment](#manual-deployment) section below.
+
+* Use an Automation runbook to completely automate the process of configuring a Windows computer. This is the recommended method for machines in your datacenter or another cloud environment.
+
+* Follow a step-by-step procedure to manually install and configure the Hybrid Runbook Worker role on your non-Azure VM.
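A minimal sketch of the extension-based install mentioned in the first bullet, using the AzureRM cmdlets this article relies on elsewhere (the resource group, VM name, location, and workspace ID/key are placeholders):

```powershell-interactive
# Deploy the Log Analytics agent VM extension to an existing Azure VM.
# The publisher and type below are the documented identifiers for the
# Windows Log Analytics (MMA) extension; workspace values are placeholders.
Set-AzureRmVMExtension -ResourceGroupName "<resourceGroup>" `
    -VMName "<vmName>" `
    -Location "<location>" `
    -Name "MicrosoftMonitoringAgent" `
    -Publisher "Microsoft.EnterpriseCloud.Monitoring" `
    -ExtensionType "MicrosoftMonitoringAgent" `
    -TypeHandlerVersion "1.0" `
    -SettingString '{"workspaceId": "<workspaceId>"}' `
    -ProtectedSettingString '{"workspaceKey": "<workspaceKey>"}'
```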

 > [!NOTE]
 > To manage the configuration of your servers that support the Hybrid Runbook Worker role with Desired State Configuration (DSC), you need to add them as DSC nodes.
@@ -33,7 +39,7 @@ The minimum requirements for a Windows Hybrid Runbook Worker are:
 To get more networking requirements for the Hybrid Runbook Worker, see [Configuring your network](automation-hybrid-runbook-worker.md#network-planning).

 For more information about onboarding servers for management with DSC, see [Onboarding machines for management by Azure Automation DSC](automation-dsc-onboarding.md).
-If you enable the [Update Management solution](../operations-management-suite/oms-solution-update-management.md), any Windows computer that's connected to your Azure Log Analytics workspace is automatically configured as a Hybrid Runbook Worker to support runbooks included in this solution. However, it isn't registered with any Hybrid Worker groups already defined in your Automation account.
+If you enable the [Update Management solution](../operations-management-suite/oms-solution-update-management.md), any Windows computer that's connected to your Log Analytics workspace is automatically configured as a Hybrid Runbook Worker to support runbooks included in this solution. However, it isn't registered with any Hybrid Worker groups already defined in your Automation account.

 The computer can be added to a Hybrid Runbook Worker group in your Automation account to support Automation runbooks as long as you're using the same account for both the solution and the Hybrid Runbook Worker group membership. This functionality has been added to version 7.2.12024.0 of the Hybrid Runbook Worker.

@@ -83,29 +89,37 @@ Perform the first two steps once for your Automation environment, and then repea

 #### 1. Create a Log Analytics workspace

-If you don't already have a Log Analytics workspace, create one by using the instructions at [Manage your workspace](../azure-monitor/platform/manage-access.md). You can use an existing workspace if you already have one.
+If you don't already have a Log Analytics workspace, review the [Azure Monitor Log design guidance](../azure-monitor/platform/design-logs-deployment.md) before you create a workspace.
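After reviewing the design guidance, a workspace can be created with a single cmdlet; a hedged sketch with the AzureRM module used elsewhere in this article (names, location, and pricing tier are placeholders):

```powershell-interactive
# Create a Log Analytics workspace on the per-GB pricing tier.
New-AzureRmOperationalInsightsWorkspace -ResourceGroupName "<resourceGroup>" `
    -Name "<workspaceName>" `
    -Location "<location>" `
    -Sku "PerGB2018"
```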

 #### 2. Add the Automation solution to the Log Analytics workspace

-The Automation Azure Monitor logs solution adds functionality for Azure Automation, including support for Hybrid Runbook Worker. When you add the solution to your workspace, it automatically pushes worker components to the agent computer that you will install in the next step.
+The Automation solution adds functionality for Azure Automation, including support for Hybrid Runbook Worker. When you add the solution to your Log Analytics workspace, it automatically pushes worker components to the computer where you'll install the agent in the next step.

-To add the **Automation** Azure Monitor logs solution to your workspace, run the following PowerShell.
+To add the **Automation** solution to your workspace, run the following PowerShell:

 ```powershell-interactive
 Set-AzureRmOperationalInsightsIntelligencePack -ResourceGroupName <logAnalyticsResourceGroup> -WorkspaceName <LogAnalyticsWorkspaceName> -IntelligencePackName "AzureAutomation" -Enabled $true
 ```

-#### 3. Install the Microsoft Monitoring Agent
+#### 3. Install the Log Analytics agent for Windows

-The Microsoft Monitoring Agent connects computers to Azure Monitor logs. When you install the agent on your on-premises computer and connect it to your workspace, it automatically downloads the components that are required for Hybrid Runbook Worker.
+The Log Analytics agent for Windows connects computers to an Azure Monitor Log Analytics workspace. When you install the agent on your computer and connect it to your workspace, it automatically downloads the components that are required for Hybrid Runbook Worker.

-To install the agent on the on-premises computer, follow the instructions at [Connect Windows computers to Azure Monitor logs](../log-analytics/log-analytics-windows-agent.md). You can repeat this process for multiple computers to add multiple workers to your environment.
+To install the agent on the computer, follow the instructions at [Connect Windows computers to Azure Monitor logs](../log-analytics/log-analytics-windows-agent.md). Repeat this process for each additional computer you want to add as a worker in your environment.
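The linked instructions cover both the setup wizard and command-line installation; as a sketch, a silent command-line install (assuming the documented MMA installer switches; the workspace ID and key are placeholders) looks roughly like this:

```powershell-interactive
# Extract and run the agent setup silently, connecting it to the workspace.
# MMASetup-AMD64.exe is the agent installer downloaded from the workspace.
.\MMASetup-AMD64.exe /C:"setup.exe /qn NOAPM=1 ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=0 OPINSIGHTS_WORKSPACE_ID=<workspaceId> OPINSIGHTS_WORKSPACE_KEY=<workspaceKey> AcceptEndUserLicenseAgreement=1"
```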
+A few minutes after the agent has successfully connected to your Log Analytics workspace, you can run the following query to verify that it's sending heartbeat data to the workspace:
+
+```kusto
+Heartbeat
+| where Category == "Direct Agent"
+| where TimeGenerated > ago(30m)
+```

-When the agent has successfully connected to Azure Monitor logs, it's listed on the **Connected Sources** tab of the log analytics **Settings** page. You can verify that the agent has correctly downloaded the Automation solution when it has a folder called **AzureAutomationFiles** in C:\Program Files\Microsoft Monitoring Agent\Agent. To confirm the version of the Hybrid Runbook Worker, you can browse to C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\ and note the \\*version* subfolder.
+In the search results, you should see heartbeat records for the computer, indicating that it's connected and reporting to the service. By default, every agent forwards a heartbeat record to its assigned workspace. You can verify that the agent has correctly downloaded the Automation solution by checking for a folder called **AzureAutomationFiles** in C:\Program Files\Microsoft Monitoring Agent\Agent. To confirm the version of the Hybrid Runbook Worker, browse to C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\ and note the \\*version* subfolder.
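A quick way to note that version subfolder from a PowerShell session (a trivial sketch using the default agent install path):

```powershell-interactive
# List the version-named subfolder(s) identifying the Hybrid Runbook Worker build.
Get-ChildItem "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation" -Directory |
    Select-Object -ExpandProperty Name
```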

 #### 4. Install the runbook environment and connect to Azure Automation

-When you add an agent to Azure Monitor logs, the Automation solution pushes down the **HybridRegistration** PowerShell module, which contains the **Add-HybridRunbookWorker** cmdlet. You use this cmdlet to install the runbook environment on the computer and register it with Azure Automation.
+When you configure an agent to report to a Log Analytics workspace, the Automation solution pushes down the **HybridRegistration** PowerShell module, which contains the **Add-HybridRunbookWorker** cmdlet. You use this cmdlet to install the runbook environment on the computer and register it with Azure Automation.

 Open a PowerShell session in Administrator mode and run the following commands to import the module:
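The hunk is truncated here. For orientation, a hedged sketch of the import and registration commands that follow (the *version* folder matches the subfolder noted above; the group name, URL, and key are placeholders taken from the Automation account's **Keys** page, and exact parameter names vary by worker version):

```powershell-interactive
# Import the HybridRegistration module pushed down by the Automation solution.
Set-Location "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\<version>\HybridRegistration"
Import-Module .\HybridRegistration.psd1

# Register the computer with a Hybrid Runbook Worker group. Older worker
# builds name these parameters -EndPoint and -Token instead of -Url and -Key.
Add-HybridRunbookWorker -GroupName "<hybridGroupName>" -Url "<registrationUrl>" -Key "<primaryAccessKey>"
```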

articles/azure-monitor/platform/data-sources-syslog.md

Lines changed: 1 addition & 1 deletion
@@ -44,7 +44,7 @@ The Log Analytics agent for Linux will only collect events with the facilities a
 ### Configure Syslog in the Azure portal
 Configure Syslog from the [Data menu in Advanced Settings](agent-data-sources.md#configuring-data-sources). This configuration is delivered to the configuration file on each Linux agent.

-You can add a new facility by typing in its name and clicking **+**. For each facility, only messages with the selected severities will be collected. Check the severities for the particular facility that you want to collect. You cannot provide any additional criteria to filter messages.
+You can add a new facility by first selecting the option **Apply below configuration to my machines** and then typing in its name and clicking **+**. For each facility, only messages with the selected severities will be collected. Check the severities for the particular facility that you want to collect. You cannot provide any additional criteria to filter messages.

 ![Configure Syslog](media/data-sources-syslog/configure.png)

articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 2 additions & 2 deletions
@@ -112,7 +112,7 @@ To use service principal authentication, follow these steps.
 - **As sink**: In Storage Explorer, grant at least **Execute** permission for ALL upstream folders and the file system, along with **Write** permission for the sink folder. Alternatively, in Access control (IAM), grant at least the **Storage Blob Data Contributor** role.

 >[!NOTE]
->If you use Data Factory UI to author and the service principal is not set with "Storage Blob Data Reader/Contributor" role in IAM, when doing test connection or browsing/navigating folders, choose "Test connection to file path" or "Browse from specified path", and specify a file system or path with Execute permission to continue.
+>If you use the Data Factory UI to author and the service principal isn't granted the "Storage Blob Data Reader/Contributor" role in IAM, then when you test the connection or browse folders, choose "Test connection to file path" or "Browse from specified path", and specify a path with Read + Execute permission to continue. For now, testing the connection to a whole file system fails; specify a subdirectory to test, or skip the operation.

 These properties are supported for the linked service:
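For orientation, a linked service using service principal authentication could be registered from PowerShell like this (a sketch only; the JSON follows the AzureBlobFS connector schema, and the account name, IDs, and key are placeholders):

```powershell-interactive
# Save a service-principal linked service definition, then register it.
@'
{
    "name": "ADLSGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net",
            "servicePrincipalId": "<application (client) ID>",
            "servicePrincipalKey": { "type": "SecureString", "value": "<application key>" },
            "tenant": "<tenant ID>"
        }
    }
}
'@ | Set-Content -Path .\ADLSGen2LinkedService.json

Set-AzureRmDataFactoryV2LinkedService -ResourceGroupName "<resourceGroup>" `
    -DataFactoryName "<dataFactory>" `
    -Name "ADLSGen2LinkedService" `
    -DefinitionFile ".\ADLSGen2LinkedService.json"
```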

@@ -163,7 +163,7 @@ To use managed identities for Azure resource authentication, follow these steps.
 - **As sink**: In Storage Explorer, grant at least **Execute** permission for ALL upstream folders and the file system, along with **Write** permission for the sink folder. Alternatively, in Access control (IAM), grant at least the **Storage Blob Data Contributor** role.

 >[!NOTE]
->If you use Data Factory UI to author and the managed identity is not set with "Storage Blob Data Reader/Contributor" role in IAM, when doing test connection or browsing/navigating folders, choose "Test connection to file path" or "Browse from specified path", and specify a file system or path with Execute permission to continue.
+>If you use the Data Factory UI to author and the managed identity isn't granted the "Storage Blob Data Reader/Contributor" role in IAM, then when you test the connection or browse folders, choose "Test connection to file path" or "Browse from specified path", and specify a path with Read + Execute permission to continue. For now, testing the connection to a whole file system fails; specify a subdirectory to test, or skip the operation.

 >[!IMPORTANT]
 >If you use PolyBase to load data from Data Lake Storage Gen2 into SQL Data Warehouse, when using managed identity authentication for Data Lake Storage Gen2, make sure you also follow steps 1 and 2 in [this guidance](../sql-database/sql-database-vnet-service-endpoint-rule-overview.md#impact-of-using-vnet-service-endpoints-with-azure-storage) to 1) register your SQL Database server with Azure Active Directory (Azure AD) and 2) assign the Storage Blob Data Contributor role to your SQL Database server; the rest are handled by Data Factory. If your Data Lake Storage Gen2 is configured with an Azure Virtual Network endpoint, to use PolyBase to load data from it, you must use managed identity authentication as required by PolyBase.

articles/data-factory/copy-activity-preserve-metadata.md

Lines changed: 2 additions & 2 deletions
@@ -20,12 +20,12 @@ When you use Azure Data Factory copy activity to copy data from source to sink,

 ## <a name="preserve-metadata"></a> Preserve metadata for lake migration

-When you migrate data from one data lake to another, like Amazon S3, Azure Blob, and Azure Data Lake Storage Gen2, you can choose to preserve the file metadata along with data.
+When you migrate data from one data lake to another, including [Amazon S3](connector-amazon-simple-storage-service.md), [Azure Blob](connector-azure-blob-storage.md), and [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md), you can choose to preserve the file metadata along with the data.

 Copy activity supports preserving the following attributes during data copy:

 - **All the customer specified metadata**
-- And the following **five data store built-in system properties**: `contentType`, `contentLanguage`, `contentEncoding`, `contentDisposition`, `cacheControl`.
+- And the following **five data store built-in system properties**: `contentType`, `contentLanguage` (except for Amazon S3), `contentEncoding`, `contentDisposition`, `cacheControl`.

 When you copy files as-is from Amazon S3/Azure Data Lake Storage Gen2/Azure Blob to Azure Data Lake Storage Gen2/Azure Blob with binary format, you can find the **Preserve** option on the **Copy Activity** > **Settings** tab for activity authoring or the **Settings** page in Copy Data Tool.

articles/data-factory/load-sap-bw-data.md

Lines changed: 9 additions & 5 deletions
@@ -151,11 +151,15 @@ On the data factory **Let's get started** page, select **Create pipeline from te

 - **SAPOpenHubDestinationName**: Specify the Open Hub table name to copy data from.

-- **ADLSGen2SinkPath**: Specify the destination Azure Data Lake Storage Gen2 path to copy data to. If the path doesn't exist, the Data Factory copy activity creates a path during execution.
+- **Data_Destination_Container**: Specify the destination Azure Data Lake Storage Gen2 container to copy data to. If the container doesn't exist, the Data Factory copy activity creates one during execution.
+
+- **Data_Destination_Directory**: Specify the folder path under the Azure Data Lake Storage Gen2 container to copy data to. If the path doesn't exist, the Data Factory copy activity creates it during execution.
+
+- **HighWatermarkBlobContainer**: Specify the container to store the high-watermark value.

-- **HighWatermarkBlobPath**: Specify the path to store the high-watermark value, such as `container/path`.
+- **HighWatermarkBlobDirectory**: Specify the folder path under the container to store the high-watermark value.

-- **HighWatermarkBlobName**: Specify the blob name to store the high watermark value, such as `requestIdCache.txt`. In Blob storage, go to the corresponding path of HighWatermarkBlobPath+HighWatermarkBlobName, such as *container/path/requestIdCache.txt*. Create a blob with content 0.
+- **HighWatermarkBlobName**: Specify the blob name to store the high-watermark value, such as `requestIdCache.txt`. In Blob storage, go to the corresponding path of HighWatermarkBlobContainer+HighWatermarkBlobDirectory+HighWatermarkBlobName, such as *container/path/requestIdCache.txt*. Create a blob with content 0 (a sketch follows the image below).

 ![Blob content](media/load-sap-bw-data/blob.png)
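A minimal sketch of creating that high-watermark blob with the AzureRM storage cmdlets (the storage account, key, and container/directory names are placeholders, and the container is assumed to exist):

```powershell-interactive
# Write the initial high-watermark value (0) to a local file, then upload it.
Set-Content -Path .\requestIdCache.txt -Value "0"

$ctx = New-AzureStorageContext -StorageAccountName "<storageAccount>" `
    -StorageAccountKey "<accountKey>"
Set-AzureStorageBlobContent -File .\requestIdCache.txt `
    -Container "<HighWatermarkBlobContainer>" `
    -Blob "<HighWatermarkBlobDirectory>/requestIdCache.txt" `
    -Context $ctx
```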

@@ -180,11 +184,11 @@ On the data factory **Let's get started** page, select **Create pipeline from te
 }
 ```

-3. Add a **Create blob** action. For **Folder path** and **Blob name**, use the same values that you configured previously in **HighWatermarkBlobPath** and **HighWatermarkBlobName**.
+3. Add a **Create blob** action. For **Folder path** and **Blob name**, use the same values that you configured previously in *HighWatermarkBlobContainer+HighWatermarkBlobDirectory* and *HighWatermarkBlobName*.

 4. Select **Save**. Then, copy the value of **HTTP POST URL** to use in the Data Factory pipeline.

-4. After you provide the Data Factory pipeline parameters, select **Debug** > **Finish** to invoke a run to validate the configuration. Or, select **Publish All** to publish the changes, and then select **Trigger** to execute a run.
+4. After you provide the Data Factory pipeline parameters, select **Debug** > **Finish** to invoke a run to validate the configuration. Or, select **Publish** to publish all the changes, and then select **Add trigger** to execute a run.

 ## SAP BW Open Hub Destination configurations

3 binary files changed: 2.1 KB, 55.1 KB, -3.31 KB (image previews not shown)

articles/dev-spaces/about.md

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ title: "Introduction to Azure Dev Spaces"
 services: azure-dev-spaces
 ms.date: 05/07/2019
 ms.topic: "overview"
-description: "Introduction to Azure Dev Spaces"
+description: "Learn how Azure Dev Spaces provides a rapid, iterative Kubernetes development experience for teams in Azure Kubernetes Service clusters"
 keywords: "Docker, Kubernetes, Azure, AKS, Azure Kubernetes Service, containers, kubectl, k8s"
 manager: gwallace
 ---

articles/dev-spaces/get-started-java.md

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ author: stepro
 ms.author: stephpr
 ms.date: 09/26/2018
 ms.topic: tutorial
-description: "Rapid Kubernetes development with containers and microservices on Azure"
+description: "This tutorial shows you how to use Azure Dev Spaces and Visual Studio Code to debug and rapidly iterate a Java application on Azure Kubernetes Service"
 keywords: "Docker, Kubernetes, Azure, AKS, Azure Kubernetes Service, containers, Helm, service mesh, service mesh routing, kubectl, k8s"
 manager: gwallace
 ---
