Commit 1c88585
committed
update
1 parent 1f08aa1 commit 1c88585

3 files changed: +50 −29 lines changed

articles/azure-monitor/essentials/edge-pipeline-configure.md

Lines changed: 50 additions & 29 deletions
:::image type="content" source="media/edge-pipeline/edge-pipeline-configuration.png" lightbox="media/edge-pipeline/edge-pipeline-configuration.png" alt-text="Overview diagram of the dataflow for Azure Monitor edge pipeline." border="false":::

## Supported configurations

### Supported distros

Edge pipeline is supported on the following Kubernetes distributions:
- Canonical
- Rancher Kubernetes Engine
- VMware Tanzu Kubernetes Grid

### Supported locations

Edge pipeline is supported in the following Azure regions:

- East US 2
- West US 2
- [Arc-enabled Kubernetes cluster](../../azure-arc/kubernetes/overview.md) in your own environment with an external IP address. See [Connect an existing Kubernetes cluster to Azure Arc](../../azure-arc/kubernetes/quickstart-connect-cluster.md) for details on enabling Arc for a cluster.
- Log Analytics workspace in Azure Monitor to receive the data from the edge pipeline. See [Create a Log Analytics workspace in the Azure portal](../../azure-monitor/logs/quick-create-workspace.md) for details on creating a workspace.

> [!NOTE]
> Private link is supported by edge pipeline for the connection to the cloud pipeline.

## Workflow
You don't need a detailed understanding of the different steps performed by the Azure Monitor pipeline to configure it using the Azure portal. You may need a more detailed understanding though if you use another method of installation, or if you need to perform more advanced configuration such as transforming the data before it's stored in its destination.

:::image type="content" source="media/edge-pipeline/edge-pipeline-data-flows.png" lightbox="media/edge-pipeline/edge-pipeline-data-flows.png" alt-text="Detailed diagram of the steps and components for data collection using Azure Monitor edge pipeline." border="false":::

## Components

The following components are required to enable and configure the Azure Monitor edge pipeline. If you use the Azure portal to configure the edge pipeline, then each of these components is created for you. With other methods, you need to configure each one.

| Component | Description |
|:---|:---|
| Edge pipeline controller extension | Extension added to your Arc-enabled Kubernetes cluster to support pipeline functionality - `microsoft.monitor.pipelinecontroller`. |
| Edge pipeline controller instance | Instance of the edge pipeline running on your Arc-enabled Kubernetes cluster with a set of receivers to accept client data and exporters to deliver that data to Azure Monitor. |
| Pipeline configuration | Configuration file that defines the data flows for the pipeline instance. Each data flow includes a receiver and an exporter. The receiver listens for incoming data, and the exporter sends the data to the destination. |
| Data collection endpoint (DCE) | Endpoint where the data is sent to the Azure Monitor pipeline. The pipeline configuration includes a property for the URL of the DCE so the pipeline instance knows where to send the data. |
| Data collection rule (DCR) | Configuration file that defines how the data is received in the cloud pipeline and where it's sent. The DCR can also include a transformation to filter or modify the data before it's sent to the destination. |

## Create table in Log Analytics workspace

Before you configure the data collection process for the edge pipeline, you need to create a table in the Log Analytics workspace to receive the data. This must be a custom table since built-in tables aren't currently supported. The schema of the table must match the data that it receives, but there are multiple steps in the collection process where you can modify the incoming data, so the table schema doesn't need to match the source data that you're collecting. The only requirement for the table in the Log Analytics workspace is that it has a `TimeGenerated` column.

See [Add or delete tables and columns in Azure Monitor Logs](../logs/create-custom-table.md) for details on different methods for creating a table. For example, use the following CLI command to create a table with the three columns `TimeGenerated`, `Body`, and `SeverityText`:

```azurecli
az monitor log-analytics workspace table create --workspace-name my-workspace --resource-group my-resource-group --name my-table_CL --columns TimeGenerated=datetime Body=string SeverityText=string
```
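To confirm that the table and its schema were created as expected, you can retrieve the table afterward. This is a sketch using the same sample names as the command above; substitute your own workspace and table names:

```azurecli
# Show the custom table created above (sample resource names)
az monitor log-analytics workspace table show --workspace-name my-workspace --resource-group my-resource-group --name my-table_CL --output table
```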

## Enable cache

Edge devices in some environments may experience intermittent connectivity due to various factors such as network congestion, signal interference, power outage, or mobility. In these environments, you can configure the edge pipeline to cache data by creating a [persistent volume](https://kubernetes.io) in your cluster. The process for this will vary based on your particular environment, but the configuration must meet the following requirements:

The current options for enabling and configuring the pipeline are detailed in the tabs below.

### [Portal](#tab/Portal)

### Configure pipeline using Azure portal

When you use the Azure portal to enable and configure the pipeline, all required components are created based on your selections. This saves you from the complexity of creating each component individually, but you may need to use other methods for more advanced configuration, such as transforming the data before it's sent to its destination.

From the **Monitor** menu in the Azure portal, select **Pipelines** and then click **Create**. The **Basic** tab prompts you for the following information to deploy the extension and pipeline instance on your cluster.

:::image type="content" source="media/edge-pipeline/create-pipeline.png" lightbox="media/edge-pipeline/create-pipeline.png" alt-text="Screenshot of Create Azure Monitor pipeline screen.":::

The settings in this tab are described in the following table.

| Setting | Description |
|:---|:---|
| Instance name | Name for the Azure Monitor pipeline instance. Must be unique for the subscription. |
| Subscription | Azure subscription to create the pipeline instance. |
| Resource group | Resource group to create the pipeline instance. |
| Cluster name | Select the Arc-enabled Kubernetes cluster that the pipeline will be installed on. |
| Custom Location | Custom location for your Arc-enabled Kubernetes cluster. |

The **Dataflow** tab allows you to create and edit dataflows for the pipeline instance. Each dataflow includes the following details:

| Setting | Description |
|:---|:---|
| Name | Name for the dataflow. Must be unique for this pipeline. |
| Source type | The type of data being collected. The following source types are currently supported:<br>- Syslog<br>- OTLP |
| Port | Port that the pipeline listens on for incoming data. If two dataflows use the same port, they will both receive and process the data. |
| Log Analytics Workspace | Log Analytics workspace to send the data to. |
| Table Name | The name of the table in the Log Analytics workspace to send the data to. |


### [CLI](#tab/CLI)

### Configure pipeline using Azure CLI

The following are the steps required to create and configure the components for the Azure Monitor edge pipeline using the Azure CLI. Not all steps can be performed with the CLI alone; ARM templates are used where noted.

### Edge pipeline extension
}
```

The custom location can also be created with the Azure CLI instead of an ARM template. The following is a sketch with placeholder values; it requires the `customlocation` CLI extension:

```azurecli
# Placeholder values: replace with your own resource names and IDs
az customlocation create --name my-custom-location --resource-group my-resource-group --namespace my-namespace --host-resource-id <connected-cluster-resource-id> --cluster-extension-ids <pipeline-extension-id>
```

### DCE

The following ARM template creates the [data collection endpoint (DCE)](./data-collection-endpoint-overview.md) required for the edge pipeline to connect to the cloud pipeline. You can use an existing DCE if you already have one in the same region. Replace the properties in the following table before deploying the template.
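If you're following the CLI path, a DCE can also be created with the `az monitor data-collection endpoint` command group (a sketch; the names are placeholders, the region must be one of the supported regions, and the command may require the `monitor-control-service` CLI extension):

```azurecli
# Create a data collection endpoint in a supported region
az monitor data-collection endpoint create --name my-dce --resource-group my-resource-group --location eastus2 --public-network-access "Enabled"
```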
---

## Verify configuration

### Verify pipeline components running in the cluster

In the Azure portal, navigate to the **Kubernetes services** menu and select your Arc-enabled Kubernetes cluster. Select **Services and ingresses** and ensure that you see the following services:
- \<pipeline name\>-external-service
- \<pipeline name\>-service
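You can perform the same check from the command line with kubectl, assuming you have access to the cluster. The filter below is a sketch; adjust it to match your pipeline name:

```bash
# List all services and filter for the pipeline's service and external service
kubectl get services --all-namespaces | grep -i pipeline
```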
Click on the entry for **\<pipeline name\>-external-service** and note the IP address and port in the **Endpoints** column. This is the external IP address and port that your clients will send data to.
## Verify heartbeat
Each pipeline configured in your pipeline instance will send a heartbeat record to the `Heartbeat` table in your Log Analytics workspace every minute. If there are multiple workspaces in the pipeline instance, then the first one configured will be used.
Retrieve the heartbeat records using a log query as in the following example:
:::image type="content" source="media/edge-pipeline/heartbeat-records.png" lightbox="media/edge-pipeline/heartbeat-records.png" alt-text="Screenshot of log query that returns heartbeat records for Azure Monitor edge pipeline." border="false":::
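If you prefer the command line, the same records can be retrieved with the Azure CLI (a sketch; the workspace GUID is a placeholder):

```azurecli
# Return heartbeat records from the last 10 minutes
az monitor log-analytics query --workspace <workspace-guid> --analytics-query "Heartbeat | where TimeGenerated > ago(10m)"
```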
## Client configuration
| OTLP | The Azure Monitor edge pipeline exposes a gRPC-based OTLP endpoint on port 4317. Configuring your instrumentation to send to this OTLP endpoint will depend on the instrumentation library itself. See [OTLP endpoint or Collector](https://opentelemetry.io/docs/instrumentation/python/exporters/#otlp-endpoint-or-collector) for OpenTelemetry documentation. The environment variable method is documented at [OTLP Exporter Configuration](https://opentelemetry.io/docs/concepts/sdk-configuration/otlp-exporter-configuration/). |
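For OpenTelemetry SDKs that support the environment variable method, a minimal sketch might look like the following. The IP address is a placeholder for the external service endpoint noted earlier:

```bash
# Point the OTLP exporter at the edge pipeline's gRPC endpoint on port 4317
export OTEL_EXPORTER_OTLP_ENDPOINT="http://<external-ip>:4317"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"
```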
## Verify data
The final step is to verify that the data is received in the Log Analytics workspace.
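For example, you can check for recently ingested records with a log query against the custom table. This is a sketch using the sample table name from earlier; note that KQL requires bracket quoting for table names containing hyphens, and the workspace GUID is a placeholder:

```azurecli
# Count records received in the custom table over the last hour
az monitor log-analytics query --workspace <workspace-guid> --analytics-query "['my-table_CL'] | where TimeGenerated > ago(1h) | count"
```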
## Next steps
- [Read more about Azure Monitor pipeline](./pipeline-overview.md).