articles/automation/automation-windows-hrw-install.md (25 additions, 11 deletions)
@@ -6,7 +6,7 @@ ms.service: automation
ms.subservice: process-automation
author: mgoedtel
ms.author: magoedte
-ms.date: 11/25/2019
+ms.date: 12/10/2019
ms.topic: conceptual
manager: carmonm
---
@@ -16,7 +16,13 @@ You can use the Hybrid Runbook Worker feature of Azure Automation to run runbook
## Installing the Windows Hybrid Runbook Worker

-To install and configure a Windows Hybrid Runbook Worker, you can use two methods. The recommended method is using an Automation runbook to completely automate the process of configuring a Windows computer. The second method is following a step-by-step procedure to manually install and configure the role.
+To install and configure a Windows Hybrid Runbook Worker, you can use one of the following three methods:
+
+* For Azure VMs, install the Log Analytics agent for Windows using the [virtual machine extension for Windows](../virtual-machines/extensions/oms-windows.md). The extension installs the Log Analytics agent on Azure virtual machines and enrolls them into an existing Log Analytics workspace, using either an Azure Resource Manager template or PowerShell (a PowerShell sketch follows this list). Once the agent is installed, the VM can be added to a Hybrid Runbook Worker group in your Automation account by following **step 4** under the [Manual deployment](#manual-deployment) section below.
+
+* Use an Automation runbook to completely automate the process of configuring a Windows computer. This is the recommended method for machines in your datacenter or another cloud environment.
+
+* Follow a step-by-step procedure to manually install and configure the Hybrid Runbook Worker role on your non-Azure VM.
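The extension route in the first bullet can be scripted. The following is a minimal PowerShell sketch, not part of this change, assuming the Az module is installed and you are signed in; the resource group, VM name, location, and the workspace ID/key values are placeholders you would replace with your own.

```powershell
# Sketch: install the Log Analytics agent extension on an existing Azure VM.
# All names and the workspace ID/key below are placeholders, not values from this article.
$workspaceId  = "<log-analytics-workspace-id>"
$workspaceKey = "<log-analytics-workspace-primary-key>"

Set-AzVMExtension -ResourceGroupName "myResourceGroup" `
    -VMName "myVM" `
    -Location "eastus" `
    -Name "MicrosoftMonitoringAgent" `
    -Publisher "Microsoft.EnterpriseCloud.Monitoring" `
    -ExtensionType "MicrosoftMonitoringAgent" `
    -TypeHandlerVersion "1.0" `
    -SettingString "{""workspaceId"": ""$workspaceId""}" `
    -ProtectedSettingString "{""workspaceKey"": ""$workspaceKey""}"
```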
> [!NOTE]
> To manage the configuration of your servers that support the Hybrid Runbook Worker role with Desired State Configuration (DSC), you need to add them as DSC nodes.
@@ -33,7 +39,7 @@ The minimum requirements for a Windows Hybrid Runbook Worker are:
To get more networking requirements for the Hybrid Runbook Worker, see [Configuring your network](automation-hybrid-runbook-worker.md#network-planning).

For more information about onboarding servers for management with DSC, see [Onboarding machines for management by Azure Automation DSC](automation-dsc-onboarding.md).
-If you enable the [Update Management solution](../operations-management-suite/oms-solution-update-management.md), any Windows computer that's connected to your Azure Log Analytics workspace is automatically configured as a Hybrid Runbook Worker to support runbooks included in this solution. However, it isn't registered with any Hybrid Worker groups already defined in your Automation account.
+If you enable the [Update Management solution](../operations-management-suite/oms-solution-update-management.md), any Windows computer that's connected to your Log Analytics workspace is automatically configured as a Hybrid Runbook Worker to support runbooks included in this solution. However, it isn't registered with any Hybrid Worker groups already defined in your Automation account.
The computer can be added to a Hybrid Runbook Worker group in your Automation account to support Automation runbooks as long as you're using the same account for both the solution and the Hybrid Runbook Worker group membership. This functionality has been added to version 7.2.12024.0 of the Hybrid Runbook Worker.
@@ -83,29 +89,37 @@ Perform the first two steps once for your Automation environment, and then repea
#### 1. Create a Log Analytics workspace
-If you don't already have a Log Analytics workspace, create one by using the instructions at [Manage your workspace](../azure-monitor/platform/manage-access.md). You can use an existing workspace if you already have one.
+If you don't already have a Log Analytics workspace, first review the [Azure Monitor Log design guidance](../azure-monitor/platform/design-logs-deployment.md) before you create a workspace.
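If you do create a new workspace, it can be created from PowerShell as well as the portal. A sketch, assuming the Az.OperationalInsights module; the resource group, workspace name, location, and SKU below are placeholder choices, not values from this article.

```powershell
# Sketch: create a resource group and a Log Analytics workspace (placeholder names).
New-AzResourceGroup -Name "myResourceGroup" -Location "eastus"

New-AzOperationalInsightsWorkspace -ResourceGroupName "myResourceGroup" `
    -Name "myLogAnalyticsWorkspace" `
    -Location "eastus" `
    -Sku "PerGB2018"
```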
#### 2. Add the Automation solution to the Log Analytics workspace
-The Automation Azure Monitor logs solution adds functionality for Azure Automation, including support for Hybrid Runbook Worker. When you add the solution to your workspace, it automatically pushes worker components to the agent computer that you will install in the next step.
+The Automation solution adds functionality for Azure Automation, including support for Hybrid Runbook Worker. When you add the solution to your Log Analytics workspace, it automatically pushes worker components to the agent computer that you will install in the next step.
-To add the **Automation** Azure Monitor logs solution to your workspace, run the following PowerShell.
+To add the **Automation** solution to your workspace, run the following PowerShell.
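The PowerShell itself is elided from this hunk. A minimal sketch of one way to enable the solution, assuming the Az.OperationalInsights module and the placeholder resource group and workspace names used above:

```powershell
# Sketch: enable the Automation (AzureAutomation) solution on the workspace (placeholder names).
Set-AzOperationalInsightsIntelligencePack -ResourceGroupName "myResourceGroup" `
    -WorkspaceName "myLogAnalyticsWorkspace" `
    -IntelligencePackName "AzureAutomation" `
    -Enabled $true
```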
#### 3. Install the Log Analytics agent for Windows
-The Microsoft Monitoring Agent connects computers to Azure Monitor logs. When you install the agent on your on-premises computer and connect it to your workspace, it automatically downloads the components that are required for Hybrid Runbook Worker.
+The Log Analytics agent for Windows connects computers to an Azure Monitor Log Analytics workspace. When you install the agent on your computer and connect it to your workspace, it automatically downloads the components that are required for Hybrid Runbook Worker.
-To install the agent on the on-premises computer, follow the instructions at [Connect Windows computers to Azure Monitor logs](../log-analytics/log-analytics-windows-agent.md). You can repeat this process for multiple computers to add multiple workers to your environment.
+To install the agent on the computer, follow the instructions at [Connect Windows computers to Azure Monitor logs](../log-analytics/log-analytics-windows-agent.md). You can repeat this process for multiple computers to add multiple workers to your environment.
+
+When the agent has successfully connected to your Log Analytics workspace, after a few minutes you can run the following query to verify it is sending heartbeat data to the workspace:
+
+```kusto
+Heartbeat
+| where Category == "Direct Agent"
+| where TimeGenerated > ago(30m)
+```
-When the agent has successfully connected to Azure Monitor logs, it's listed on the **Connected Sources** tab of the log analytics **Settings** page. You can verify that the agent has correctly downloaded the Automation solution when it has a folder called **AzureAutomationFiles** in C:\Program Files\Microsoft Monitoring Agent\Agent. To confirm the version of the Hybrid Runbook Worker, you can browse to C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\ and note the \\*version* subfolder.
+In the search results returned, you should see heartbeat records for the computer, indicating that it is connected and reporting to the service. By default, every agent forwards a heartbeat record to its assigned workspace. You can verify that the agent has correctly downloaded the Automation solution when it has a folder called **AzureAutomationFiles** in C:\Program Files\Microsoft Monitoring Agent\Agent. To confirm the version of the Hybrid Runbook Worker, you can browse to C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\ and note the \\*version* subfolder.
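The folder and version check described above can also be done from a PowerShell prompt on the worker machine; a trivial sketch using the paths stated in the text:

```powershell
# Sketch: confirm the Automation components were downloaded and note the version subfolder.
Test-Path "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomationFiles"

# The directory name under AzureAutomation corresponds to the Hybrid Runbook Worker version.
Get-ChildItem "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation" -Directory |
    Select-Object -ExpandProperty Name
```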
#### 4. Install the runbook environment and connect to Azure Automation
-When you add an agent to Azure Monitor logs, the Automation solution pushes down the **HybridRegistration** PowerShell module, which contains the **Add-HybridRunbookWorker** cmdlet. You use this cmdlet to install the runbook environment on the computer and register it with Azure Automation.
+When you configure an agent to report to a Log Analytics workspace, the Automation solution pushes down the **HybridRegistration** PowerShell module, which contains the **Add-HybridRunbookWorker** cmdlet. You use this cmdlet to install the runbook environment on the computer and register it with Azure Automation.
Open a PowerShell session in Administrator mode and run the following commands to import the module:
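The commands themselves are elided from this hunk. A sketch of the registration flow based on the module location described above; the version folder, group name, and endpoint/token values are placeholders, and the exact parameter names of `Add-HybridRunbookWorker` depend on the HybridRegistration module version, so check the cmdlet help on the machine before running.

```powershell
# Sketch: import the HybridRegistration module from the version folder noted earlier (placeholder <version>).
Set-Location "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\<version>\HybridRegistration"
Import-Module .\HybridRegistration.psd1

# Confirm the parameter names for your module version before running the registration.
Get-Help Add-HybridRunbookWorker

# Register the machine with a Hybrid Runbook Worker group (placeholder values; newer module
# versions may use different parameter names such as -Url and -Key).
Add-HybridRunbookWorker -GroupName "<hybrid-worker-group-name>" `
    -EndPoint "<automation-account-url>" `
    -Token "<automation-account-primary-access-key>"
```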
articles/azure-monitor/platform/data-sources-syslog.md (1 addition, 1 deletion)
@@ -44,7 +44,7 @@ The Log Analytics agent for Linux will only collect events with the facilities a
### Configure Syslog in the Azure portal

Configure Syslog from the [Data menu in Advanced Settings](agent-data-sources.md#configuring-data-sources). This configuration is delivered to the configuration file on each Linux agent.
-You can add a new facility by typing in its name and clicking **+**. For each facility, only messages with the selected severities will be collected. Check the severities for the particular facility that you want to collect. You cannot provide any additional criteria to filter messages.
+You can add a new facility by first selecting the option **Apply below configuration to my machines** and then typing in its name and clicking **+**. For each facility, only messages with the selected severities will be collected. Check the severities for the particular facility that you want to collect. You cannot provide any additional criteria to filter messages.
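Not part of this change, but the same facility and severity configuration can also be applied programmatically; a sketch assuming the Az.OperationalInsights module, a placeholder workspace, and a single example facility (check the cmdlet help for the exact severity switches in your module version):

```powershell
# Sketch: collect the local4 facility at Warning severity and above (placeholder names).
New-AzOperationalInsightsLinuxSyslogDataSource -ResourceGroupName "myResourceGroup" `
    -WorkspaceName "myLogAnalyticsWorkspace" `
    -Name "syslog-local4" `
    -Facility "local4" `
    -CollectEmergency -CollectAlert -CollectCritical -CollectError -CollectWarning

# Make sure syslog collection is enabled on the workspace.
Enable-AzOperationalInsightsLinuxSyslogCollection -ResourceGroupName "myResourceGroup" `
    -WorkspaceName "myLogAnalyticsWorkspace"
```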
articles/data-factory/connector-azure-data-lake-storage.md (2 additions, 2 deletions)
@@ -112,7 +112,7 @@ To use service principal authentication, follow these steps.
-**As sink**: In Storage Explorer, grant at least **Execute** permission for ALL upstream folders and the file system, along with **Write** permission for the sink folder. Alternatively, in Access control (IAM), grant at least the **Storage Blob Data Contributor** role.
>[!NOTE]
->If you use Data Factory UI to author and the service principal is not set with "Storage Blob Data Reader/Contributor" role in IAM, when doing test connection or browsing/navigating folders, choose "Test connection to file path" or "Browse from specified path", and specify a file system or path with Execute permission to continue.
+>If you use Data Factory UI to author and the service principal is not set with the "Storage Blob Data Reader/Contributor" role in IAM, when doing test connection or browsing/navigating folders, choose "Test connection to file path" or "Browse from specified path", and specify a path with Read + Execute permission to continue. For now, test connection to a file system will fail; specify a subdirectory to test, or skip this operation.
These properties are supported for the linked service:
@@ -163,7 +163,7 @@ To use managed identities for Azure resource authentication, follow these steps.
-**As sink**: In Storage Explorer, grant at least **Execute** permission for ALL upstream folders and the file system, along with **Write** permission for the sink folder. Alternatively, in Access control (IAM), grant at least the **Storage Blob Data Contributor** role.
>[!NOTE]
->If you use Data Factory UI to author and the managed identity is not set with "Storage Blob Data Reader/Contributor" role in IAM, when doing test connection or browsing/navigating folders, choose "Test connection to file path" or "Browse from specified path", and specify a file system or path with Execute permission to continue.
+>If you use Data Factory UI to author and the managed identity is not set with the "Storage Blob Data Reader/Contributor" role in IAM, when doing test connection or browsing/navigating folders, choose "Test connection to file path" or "Browse from specified path", and specify a path with Read + Execute permission to continue. For now, test connection to a file system will fail; specify a subdirectory to test, or skip this operation.
>[!IMPORTANT]
>If you use PolyBase to load data from Data Lake Storage Gen2 into SQL Data Warehouse, when using managed identity authentication for Data Lake Storage Gen2, make sure you also follow steps 1 and 2 in [this guidance](../sql-database/sql-database-vnet-service-endpoint-rule-overview.md#impact-of-using-vnet-service-endpoints-with-azure-storage) to 1) register your SQL Database server with Azure Active Directory (Azure AD) and 2) assign the Storage Blob Data Contributor role to your SQL Database server; the rest are handled by Data Factory. If your Data Lake Storage Gen2 is configured with an Azure Virtual Network endpoint, to use PolyBase to load data from it, you must use managed identity authentication as required by PolyBase.
articles/data-factory/copy-activity-preserve-metadata.md (2 additions, 2 deletions)
@@ -20,12 +20,12 @@ When you use Azure Data Factory copy activity to copy data from source to sink,
## <a name="preserve-metadata"></a> Preserve metadata for lake migration
-When you migrate data from one data lake to another, like Amazon S3, Azure Blob, and Azure Data Lake Storage Gen2, you can choose to preserve the file metadata along with data.
+When you migrate data from one data lake to another, including [Amazon S3](connector-amazon-simple-storage-service.md), [Azure Blob](connector-azure-blob-storage.md), and [Azure Data Lake Storage Gen2](connector-azure-data-lake-storage.md), you can choose to preserve the file metadata along with the data.
Copy activity supports preserving the following attributes during data copy:
-**All the customer specified metadata**
-- And the following **five data store built-in system properties**: `contentType`, `contentLanguage`, `contentEncoding`, `contentDisposition`, `cacheControl`.
+- And the following **five data store built-in system properties**: `contentType`, `contentLanguage` (except for Amazon S3), `contentEncoding`, `contentDisposition`, `cacheControl`.
When you copy files as-is from Amazon S3/Azure Data Lake Storage Gen2/Azure Blob to Azure Data Lake Storage Gen2/Azure Blob with binary format, you can find the **Preserve** option on the **Copy Activity** > **Settings** tab for activity authoring or the **Settings** page in Copy Data Tool.
articles/data-factory/load-sap-bw-data.md (9 additions, 5 deletions)
@@ -151,11 +151,15 @@ On the data factory **Let's get started** page, select **Create pipeline from te
-**SAPOpenHubDestinationName**: Specify the Open Hub table name to copy data from.
--**ADLSGen2SinkPath**: Specify the destination Azure Data Lake Storage Gen2 path to copy data to. If the path doesn't exist, the Data Factory copy activity creates a path during execution.
+-**Data_Destination_Container**: Specify the destination Azure Data Lake Storage Gen2 container to copy data to. If the container doesn't exist, the Data Factory copy activity creates one during execution.
+
+-**Data_Destination_Directory**: Specify the folder path under the Azure Data Lake Storage Gen2 container to copy data to. If the path doesn't exist, the Data Factory copy activity creates a path during execution.
+
+-**HighWatermarkBlobContainer**: Specify the container to store the high-watermark value.
--**HighWatermarkBlobPath**: Specify the path to store the high-watermark value, such as `container/path`.
+-**HighWatermarkBlobDirectory**: Specify the folder path under the container to store the high-watermark value.
--**HighWatermarkBlobName**: Specify the blob name to store the high watermark value, such as `requestIdCache.txt`. In Blob storage, go to the corresponding path of HighWatermarkBlobPath+HighWatermarkBlobName, such as *container/path/requestIdCache.txt*. Create a blob with content 0.
+-**HighWatermarkBlobName**: Specify the blob name to store the high watermark value, such as `requestIdCache.txt`. In Blob storage, go to the corresponding path of HighWatermarkBlobContainer+HighWatermarkBlobDirectory+HighWatermarkBlobName, such as *container/path/requestIdCache.txt*. Create a blob with content 0.
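Not part of the template itself, but one way to create that initial high-watermark blob from PowerShell; the storage account, container, and folder names below are placeholders matching the example path above:

```powershell
# Sketch: create the high-watermark blob with initial content 0 (placeholder names).
Set-Content -Path ".\requestIdCache.txt" -Value "0" -NoNewline

# Authenticate to the storage account with the signed-in Azure account.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount

Set-AzStorageBlobContent -Context $ctx `
    -Container "container" `
    -Blob "path/requestIdCache.txt" `
    -File ".\requestIdCache.txt"
```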
@@ -180,11 +184,11 @@ On the data factory **Let's get started** page, select **Create pipeline from te
}
```
-3. Add a **Create blob** action. For **Folder path** and **Blob name**, use the same values that you configured previously in **HighWatermarkBlobPath** and **HighWatermarkBlobName**.
+3. Add a **Create blob** action. For **Folder path** and **Blob name**, use the same values that you configured previously in *HighWatermarkBlobContainer+HighWatermarkBlobDirectory* and *HighWatermarkBlobName*.
4. Select **Save**. Then, copy the value of **HTTP POST URL** to use in the Data Factory pipeline.
-4. After you provide the Data Factory pipeline parameters, select **Debug** > **Finish** to invoke a run to validate the configuration. Or, select **Publish All** to publish the changes, and then select **Trigger** to execute a run.
+4. After you provide the Data Factory pipeline parameters, select **Debug** > **Finish** to invoke a run to validate the configuration. Or, select **Publish** to publish all the changes, and then select **Add trigger** to execute a run.
articles/dev-spaces/get-started-java.md (1 addition, 1 deletion)
@@ -5,7 +5,7 @@ author: stepro
ms.author: stephpr
ms.date: 09/26/2018
ms.topic: tutorial
-description: "Rapid Kubernetes development with containers and microservices on Azure"
+description: "This tutorial shows you how to use Azure Dev Spaces and Visual Studio Code to debug and rapidly iterate a Java application on Azure Kubernetes Service"
keywords: "Docker, Kubernetes, Azure, AKS, Azure Kubernetes Service, containers, Helm, service mesh, service mesh routing, kubectl, k8s"