The following [fields](https://help.sumologic.com/docs/manage/fields/) will always be created automatically as part of the app installation process:
<TabItem value="k8s">
If you're using ActiveMQ in a Kubernetes environment, these fields will be created:
* `pod_labels_component`
* `pod_labels_environment`
* `pod_labels_messaging_system`
* `pod_labels_messaging_cluster`
</TabItem>
<TabItem value="non-k8s">
If you're using ActiveMQ in a non-Kubernetes environment, these additional fields will be created automatically as part of the app installation process:
* `component`
* `environment`
* `messaging_system`
* `messaging_cluster`
* `pod`
</TabItem>
</Tabs>
For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields).
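Once collection is configured (Step 2 below), you can spot-check that these fields are being applied with a log search along these lines. The `messaging_system` value shown is an assumption for ActiveMQ; use the value you set when configuring collection:

```sql
component="messaging" and messaging_system="activemq"
```

If this query returns results, logs are being tagged with the expected metadata.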
### Step 2: Configure ActiveMQ Logs and Metrics Collection
5. Sumo Logic Kubernetes collection will automatically start collecting logs from the pods that have the annotations defined above.
3. **FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments are automatically prefixed with `pod_labels`. To normalize these fields so that the app works, a Field Extraction Rule named **AppObservabilityMessagingActiveMQFER** is created automatically.
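The rule maps the prefixed labels onto the field names the app expects. As a sketch of what that normalization looks like (the auto-created rule's exact parse expression may differ):

```sql
if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment
| pod_labels_component as component
| pod_labels_messaging_system as messaging_system
| pod_labels_messaging_cluster as messaging_cluster
```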
</TabItem>
<TabItem value="non-k8s">
At this point, ActiveMQ logs should start flowing into Sumo Logic.
This section provides instructions for configuring log and metric collection for the Sumo Logic App for Kafka.
### Fields in Sumo Logic
The following [fields](https://help.sumologic.com/docs/manage/fields/) will always be created automatically as part of the app installation process:
If you're using Kafka in a Kubernetes environment, these fields will be created:
* `pod_labels_component`
* `pod_labels_environment`
* `pod_labels_messaging_system`
* `pod_labels_messaging_cluster`
</TabItem>
<TabItem value="non-k8s">
If you're using Kafka in a non-Kubernetes environment, these additional fields will be created automatically as part of the app installation process:
* `component`
* `environment`
* `messaging_system`
* `messaging_cluster`
* `pod`
</TabItem>
</Tabs>
For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields).
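Once collection is configured (per the steps below), you can verify that Kafka logs are tagged with these fields by running the following log query:

```sql
component="messaging" and messaging_system="kafka"
```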
### Configure Collection for Kafka
```bash
kubectl describe pod <Kafka_pod_name>
```
5. Sumo Logic Kubernetes collection will automatically start collecting logs from the pods that have the annotations defined above.
3. **FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments are automatically prefixed with `pod_labels`. To normalize these fields so that the app works, a Field Extraction Rule named **AppObservabilityMessagingKafkaFER** is created automatically.
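As a sketch of what this normalization does (the auto-created rule's exact parse expression may differ), the FER assigns the prefixed pod labels to the normalized field names:

```sql
if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment
| pod_labels_component as component
| pod_labels_messaging_system as messaging_system
| pod_labels_messaging_cluster as messaging_cluster
```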
</TabItem>
<TabItem value="non-k8s">
At this point, Kafka metrics and logs should start flowing into Sumo Logic.
</TabItem>
</Tabs>
## Installing the Kafka App
## Kafka Alerts
#### Pre-Packaged Alerts
Sumo Logic provides out-of-the-box alerts, available through [Sumo Logic monitors](/docs/alerts/monitors), to help you quickly determine whether the Kafka cluster is available and performing as expected. These alerts are built on metrics datasets and have preset thresholds based on industry best practices and recommendations.
* There are limits to how many alerts can be enabled. See the [Alerts FAQ](/docs/alerts/monitors/monitor-faq) for details.
:::note permissions required
To install these alerts, you need to have the [Manage Monitors role capability](/docs/manage/users-roles/roles/role-capabilities/#alerting).
:::
| Alert Name | Alert Description and Conditions | Alert Condition | Recover Condition |