:::image type="content" source="media/edge-pipeline/edge-pipeline-configuration.png" lightbox="media/edge-pipeline/edge-pipeline-configuration.png" alt-text="Overview diagram of the dataflow for Azure Monitor edge pipeline." border="false":::
## Supported configurations
### Supported distros

Edge pipeline is supported on the following Kubernetes distributions:

- Canonical
- Rancher Kubernetes Engine
- VMware Tanzu Kubernetes Grid

### Supported locations

Edge pipeline is supported in the following Azure regions:

- East US 2
- West US 2

- [Arc-enabled Kubernetes cluster](../../azure-arc/kubernetes/overview.md) in your own environment with an external IP address. See [Connect an existing Kubernetes cluster to Azure Arc](../../azure-arc/kubernetes/quickstart-connect-cluster.md) for details on enabling Arc for a cluster.
- Log Analytics workspace in Azure Monitor to receive the data from the edge pipeline. See [Create a Log Analytics workspace in the Azure portal](../../azure-monitor/logs/quick-create-workspace.md) for details on creating a workspace.

> [!NOTE]
> Private link is supported by the edge pipeline for the connection to the cloud pipeline.

## Workflow

You don't need a detailed understanding of the different steps performed by the Azure Monitor pipeline to configure it using the Azure portal. You may need a more detailed understanding, though, if you use another installation method or if you need to perform more advanced configuration, such as transforming the data before it's stored in its destination.
:::image type="content" source="media/edge-pipeline/edge-pipeline-data-flows.png" lightbox="media/edge-pipeline/edge-pipeline-data-flows.png" alt-text="Detailed diagram of the steps and components for data collection using Azure Monitor edge pipeline." border="false":::
## Components
The following components are required to enable and configure the Azure Monitor edge pipeline. If you use the Azure portal to configure the edge pipeline, then each of these components is created for you. With other methods, you need to configure each one.

| Component | Description |
|:---|:---|
| Edge pipeline controller extension | Extension added to your Arc-enabled Kubernetes cluster to support pipeline functionality - `microsoft.monitor.pipelinecontroller`. |
| Edge pipeline controller instance | Instance of the edge pipeline running on your Arc-enabled Kubernetes cluster with a set of receivers to accept client data and exporters to deliver that data to Azure Monitor. |
| Pipeline configuration | Configuration file that defines the data flows for the pipeline instance. Each data flow includes a receiver and an exporter. The receiver listens for incoming data, and the exporter sends the data to the destination. |
| Data collection endpoint (DCE) | Endpoint where the data is sent to the Azure Monitor pipeline. The pipeline configuration includes a property for the URL of the DCE so the pipeline instance knows where to send the data. |
| Data collection rule (DCR) | Configuration file that defines how the data is received in the cloud pipeline and where it's sent. The DCR can also include a transformation to filter or modify the data before it's sent to the destination. |

## Create table in Log Analytics workspace
Before you configure the data collection process for the edge pipeline, you need to create a table in the Log Analytics workspace to receive the data. This must be a custom table since built-in tables aren't currently supported. The schema of the table must match the data that it receives, but there are multiple steps in the collection process where you can modify the incoming data, so the table schema doesn't need to match the source data that you're collecting. The only requirement for the table in the Log Analytics workspace is that it has a `TimeGenerated` column.
See [Add or delete tables and columns in Azure Monitor Logs](../logs/create-custom-table.md) for details on different methods for creating a table. For example, use the CLI command below to create a table with the three columns called `Body`, `TimeGenerated`, and `SeverityText`.
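
For instance, here's a sketch of such a command using `az monitor log-analytics workspace table create`; the resource group, workspace, and table names are placeholders, and custom table names must end in `_CL`:

```azurecli
# Create a custom table with the three columns the pipeline will populate.
# Resource group, workspace, and table names are placeholders.
az monitor log-analytics workspace table create \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --name MyTable_CL \
  --columns TimeGenerated=datetime Body=string SeverityText=string
```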
Edge devices in some environments may experience intermittent connectivity due to factors such as network congestion, signal interference, power outages, or mobility. In these environments, you can configure the edge pipeline to cache data by creating a [persistent volume](https://kubernetes.io) in your cluster. The process for this varies based on your particular environment, but the configuration must meet the following requirements:
### [Portal](#tab/Portal)
### Configure pipeline using the Azure portal

When you use the Azure portal to enable and configure the pipeline, all required components are created based on your selections. This saves you from the complexity of creating each component individually, but you may need to use other methods for more advanced configuration.

From the **Monitor** menu in the Azure portal, select **Pipelines** and then click **Create**. The **Basic** tab prompts you for the following information to deploy the extension and pipeline instance on your cluster.
| Cluster name | Select your Arc-enabled Kubernetes cluster that the pipeline will be installed on. |
| Custom Location | Custom location for your Arc-enabled Kubernetes cluster. |
The **Dataflow** tab allows you to create and edit dataflows for the pipeline instance. Each dataflow includes the following details:
| Name | Name for the dataflow. Must be unique for this pipeline. |
| Source type | The type of data being collected. The following source types are currently supported:<br>- Syslog<br>- OTLP |
| Port | Port that the pipeline listens on for incoming data. If two dataflows use the same port, they will both receive and process the data. |
| Destination Type | Destination for the data. Currently, only Log Analytics workspace is supported. |
| Log Analytics Workspace | Log Analytics workspace to send the data to. |
| Table Name | The name of the table in the Log Analytics workspace to send the data to. |
### [CLI](#tab/CLI)

### Configure pipeline using Azure CLI

Following are the steps required to create and configure the components for the Azure Monitor edge pipeline using the Azure CLI. Not all steps can be performed with CLI commands alone, so some components are created with ARM templates.
### Edge pipeline extension
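
A sketch of installing the pipeline controller extension with `az k8s-extension create`; the cluster, resource group, and extension names are placeholders:

```azurecli
# Add the edge pipeline controller extension to the Arc-enabled cluster.
# Cluster name, resource group, and extension name are placeholders.
az k8s-extension create \
  --name my-pipeline-controller \
  --extension-type microsoft.monitor.pipelinecontroller \
  --scope cluster \
  --cluster-name my-arc-cluster \
  --resource-group my-resource-group \
  --cluster-type connectedClusters
```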
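
A sketch of creating the custom location for your Arc-enabled cluster with the Azure CLI (assumes the `customlocation` CLI extension; all names and resource IDs are placeholders):

```azurecli
# Create the custom location that the pipeline instance deploys into.
# All names and resource IDs below are placeholders for your own values.
az customlocation create \
  --name my-custom-location \
  --resource-group my-resource-group \
  --namespace my-namespace \
  --host-resource-id /subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.Kubernetes/connectedClusters/my-arc-cluster \
  --cluster-extension-ids <pipeline-extension-resource-id>
```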
### DCE
The following ARM template creates the [data collection endpoint (DCE)](./data-collection-endpoint-overview.md) required for the edge pipeline to connect to the cloud pipeline. You can use an existing DCE if you already have one in the same region. Replace the properties in the following table before deploying the template.
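
If you're scripting the deployment instead, a DCE can also be created along these lines (assumes the `monitor-control-service` CLI extension; the name, resource group, and region are placeholders):

```azurecli
# Create a data collection endpoint in a supported region.
# Name, resource group, and location are placeholders.
az monitor data-collection endpoint create \
  --name my-dce \
  --resource-group my-resource-group \
  --location eastus2 \
  --public-network-access "Enabled"
```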
---
## Verify configuration
### Verify pipeline components running in the cluster
In the Azure portal, navigate to the **Kubernetes services** menu and select your Arc-enabled Kubernetes cluster. Select **Services and ingresses** and ensure that you see the following services:

- \<pipeline name\>-external-service
- \<pipeline name\>-service

Click on the entry for **\<pipeline name\>-external-service** and note the IP address and port in the **Endpoints** column. This is the external IP address and port that your clients will send data to.
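
You can perform the same check from the command line; a sketch assuming `kubectl` access to the cluster and a hypothetical namespace:

```shell
# List the pipeline services and their external endpoints.
# Replace the namespace with the one your pipeline was deployed into.
kubectl get services -n my-pipeline-namespace
```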
### Verify heartbeat
Each pipeline configured in your pipeline instance will send a heartbeat record to the `Heartbeat` table in your Log Analytics workspace every minute. If there are multiple workspaces in the pipeline instance, then the first one configured will be used.
Retrieve the heartbeat records using a log query as in the following example:
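
For instance, a minimal query along these lines, shown here through the Azure CLI (assumes the `log-analytics` CLI extension; the workspace GUID is a placeholder). You can run the same query directly in the portal instead:

```azurecli
# Return pipeline heartbeat records from the last 10 minutes.
# The workspace GUID is a placeholder for your workspace ID.
az monitor log-analytics query \
  --workspace 00000000-0000-0000-0000-000000000000 \
  --analytics-query "Heartbeat | where TimeGenerated > ago(10m)"
```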
:::image type="content" source="media/edge-pipeline/heartbeat-records.png" lightbox="media/edge-pipeline/heartbeat-records.png" alt-text="Screenshot of log query that returns heartbeat records for Azure Monitor edge pipeline." border="false":::
## Client configuration
| OTLP | The Azure Monitor edge pipeline exposes a gRPC-based OTLP endpoint on port 4317. Configuring your instrumentation to send to this OTLP endpoint will depend on the instrumentation library itself. See [OTLP endpoint or Collector](https://opentelemetry.io/docs/instrumentation/python/exporters/#otlp-endpoint-or-collector) for OpenTelemetry documentation. The environment variable method is documented at [OTLP Exporter Configuration](https://opentelemetry.io/docs/concepts/sdk-configuration/otlp-exporter-configuration/). |
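
For instance, with SDKs that honor the standard OTLP exporter environment variables, pointing a client at the pipeline could look like the following sketch; the IP address is a placeholder for the external service address noted earlier:

```shell
# Point an OpenTelemetry SDK at the edge pipeline's OTLP gRPC endpoint.
# 192.0.2.10 is a placeholder; use your pipeline's external IP address.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://192.0.2.10:4317"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"
```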
## Verify data
The final step is to verify that the data is received in the Log Analytics workspace.
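
As a sketch, you can query the destination table from the CLI (the workspace GUID and table name are placeholders; the query can also be run in the portal):

```azurecli
# Check that recent records arrived in the custom table.
# The workspace GUID and table name are placeholders.
az monitor log-analytics query \
  --workspace 00000000-0000-0000-0000-000000000000 \
  --analytics-query "MyTable_CL | where TimeGenerated > ago(1h) | take 10"
```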
## Next steps
- [Read more about Azure Monitor pipeline](./pipeline-overview.md).
0 commit comments