`.clabot` — 2 additions, 1 deletion

```diff
@@ -182,7 +182,8 @@
     "Deklin",
     "justrelax19",
     "dlindelof-sumologic",
-    "snyk-bot"
+    "snyk-bot",
+    "stephenthedev"
   ],
   "message": "Thank you for your contribution! As this is an open source project, we require contributors to sign our Contributor License Agreement and do not have yours on file. To proceed with your PR, please [sign your name here](https://forms.gle/YgLddrckeJaCdZYA6) and we will add you to our approved list of contributors.",
```
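The `.clabot` change above adds a new approved contributor. As an illustrative sketch only (not the CLA bot's actual implementation — the top-level `contributors` key is an assumption, since the diff shows only the tail of the list and the `message` key), the check this file enables looks like:

```python
import json

# Sketch of the approval check a CLA bot could perform against .clabot.
# The "contributors" key name is an assumption for illustration.
clabot_text = json.dumps({
    "contributors": [
        "Deklin",
        "justrelax19",
        "dlindelof-sumologic",
        "snyk-bot",
        "stephenthedev",
    ],
    "message": "Thank you for your contribution!",
})

def is_approved(config_text, username):
    """Return True if the PR author appears in the approved-contributors list."""
    return username in json.loads(config_text).get("contributors", [])

print(is_approved(clabot_text, "stephenthedev"))
```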
`docs/api/metrics-query.md` — 1 addition, 1 deletion

```diff
@@ -13,7 +13,7 @@ import ApiRoles from '../reuse/api-roles.md';
 
 The Metrics Query API allows you to execute queries on various metrics and retrieve multiple time-series (data-points) over time from HTTP endpoints. For information about running a metrics query using the API, see [Executing a query](/docs/api/metrics/#executing-a-query) in *Metrics APIs*.
 
-Here is example content for a `v1/metricQueries` API call:
+Here is example content for a `v1/metricsQueries` API call:
```
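The example request body itself is not reproduced in this excerpt. As a hedged sketch of the general shape of a `v1/metricsQueries` payload (the key names `queries`, `rowId`, `quantization`, `rollup`, and `timeRange` are assumptions for illustration, not taken from this diff), one might assemble it like this:

```python
import json

def build_metrics_query(query, start_ms, end_ms, quantization_ms=60000):
    """Assemble a single-row metrics query payload over a bounded time range.

    The key names below are illustrative assumptions about the request
    schema, not copied from this PR.
    """
    return {
        "queries": [
            {
                "rowId": "A",                     # row identifier for this query
                "query": query,                   # the metrics query string
                "quantization": quantization_ms,  # bucket size in milliseconds
                "rollup": "Avg",                  # how to aggregate within a bucket
            }
        ],
        "timeRange": {
            "type": "BeginBoundedTimeRange",
            "from": {"type": "EpochTimeRangeBoundary", "epochMillis": start_ms},
            "to": {"type": "EpochTimeRangeBoundary", "epochMillis": end_ms},
        },
    }

payload = build_metrics_query("metric=CPU_Total | avg", 1700000000000, 1700003600000)
print(json.dumps(payload, indent=2))
```

The assembled payload would then be POSTed to the `v1/metricsQueries` endpoint with an authenticated HTTP client.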
To track Admin activity in your AWS account, and to provide data for all Administrator Activity panels in the User Monitoring Dashboard, you'll need to inform Sumo Logic of the Admin AWS account. You can do this by uploading a CSV file via an HTTP Source.

```diff
@@ -309,4 +308,4 @@ See information about S3 public objects and buckets, including counts of new pub
 ## Additional resources
 
 * Blog: [What is AWS CloudTrail?](https://www.sumologic.com/blog/what-is-aws-cloudtrail/)
-* App description: [Logs for Security app for AWS CloudTrail](https://www.sumologic.com/application/aws-cloudtrail/)
+* App description: [Logs for Security app for AWS CloudTrail](https://www.sumologic.com/application/aws-cloudtrail/)
```
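The CSV upload mentioned above goes to the unique URL of an HTTP Source. As a minimal sketch (the endpoint URL, token, and CSV columns below are hypothetical placeholders, not values from this PR), building such a request might look like:

```python
import urllib.request

# Hypothetical placeholder -- use the unique upload URL Sumo Logic generates
# for your HTTP Source; the token path below is made up for illustration.
HTTP_SOURCE_URL = "https://endpoint.collection.sumologic.com/receiver/v1/http/EXAMPLE_TOKEN"

def build_csv_upload(csv_text):
    """Build (but do not send) a POST request carrying the admin-account CSV."""
    return urllib.request.Request(
        HTTP_SOURCE_URL,
        data=csv_text.encode("utf-8"),
        headers={"Content-Type": "text/csv"},
        method="POST",
    )

# Hypothetical CSV columns -- check the app's documentation for the real schema.
req = build_csv_upload("account_id,account_name\n123456789012,prod-admin\n")
print(req.get_method(), req.full_url)
```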
```diff
 This section provides instructions for configuring log and metric collection for the Sumo Logic App for Kafka.
 
-### Configure Fields in Sumo Logic
-
-Create the following Fields in Sumo Logic prior to configuring collection. This ensures that your logs and metrics are tagged with relevant metadata, which is required by the app dashboards. For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields).
-If you're using Kafka in a Kubernetes environment, create the fields:
-* `pod_labels_component`
-* `pod_labels_environment`
-* `pod_labels_messaging_system`
-* `pod_labels_messaging_cluster`
-
-</TabItem>
-<TabItem value="non-k8s">
-
-If you're using Kafka in a non-Kubernetes environment, create the fields:
-* `component`
-* `environment`
-* `messaging_system`
-* `messaging_cluster`
-* `pod`
-
-</TabItem>
-</Tabs>
-
-### Configure Collection for Kafka
+### Configure collection for Kafka
 
 Sumo Logic supports collection of logs and metrics data from Kafka in both Kubernetes and non-Kubernetes environments.
```
```diff
@@ -230,30 +197,7 @@ This section explains the steps to collect Kafka logs from a Kubernetes environm
    kubectl describe pod <Kafka_pod_name>
    ```
 5. Sumo Logic Kubernetes collection will automatically start collecting logs from the pods having the annotations defined above.
-3. **Add an FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments are automatically prefixed with `pod_labels`. To normalize these for our app to work, we need to create a Field Extraction Rule if one has not already been created for Messaging Application Components. To do so:
-   1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Logs > Field Extraction Rules**. <br/>[**New UI**](/docs/get-started/sumo-logic-ui). In the top menu select **Configuration**, and then under **Logs** select **Field Extraction Rules**. You can also click the **Go To...** menu at the top of the screen and select **Field Extraction Rules**.
-   2. Click the **+ Add** button on the top right of the table.
-   3. The **Add Field Extraction Rule** form will appear. Enter the following options:
-      * **Rule Name**. Enter the name as **App Component Observability - Messaging**.
-      * **Applied At**. Choose **Ingest Time**.
-      * **Scope**. Select **Specific Data** and enter the following keyword search expression:
-      * **Parse Expression**. Enter the following parse expression:
-        ```sql
-        if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment
-        | pod_labels_component as component
-        | pod_labels_messaging_system as messaging_system
-        | pod_labels_messaging_cluster as messaging_cluster
-        ```
-   4. Click **Save** to create the rule.
-   5. Verify logs are flowing into Sumo Logic by running the following logs query:
-      ```sql
-      component="messaging" and messaging_system="kafka"
-      ```
+3. **FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments are automatically prefixed with `pod_labels`. To normalize these for our app to work, a Field Extraction Rule named **AppObservabilityMessagingKafkaFER** is automatically created.
 
 </TabItem>
 <TabItem value="non-k8s">
```
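Conceptually, the auto-created FER strips the `pod_labels_` prefix so dashboard filters can use normalized field names. A toy sketch of that normalization (illustration only — the real rule runs inside Sumo Logic at ingest time, not in your code):

```python
PREFIX = "pod_labels_"

def normalize_fields(record):
    """Strip the Kubernetes pod_labels_ prefix, leaving other keys intact."""
    return {
        (key[len(PREFIX):] if key.startswith(PREFIX) else key): value
        for key, value in record.items()
    }

raw = {
    "pod_labels_component": "messaging",
    "pod_labels_messaging_system": "kafka",
    "pod_labels_messaging_cluster": "kafka_prod",
    "pod": "kafka-0",  # already unprefixed; passes through unchanged
}
print(normalize_fields(raw))
```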
```diff
@@ -390,123 +334,31 @@ At this point, Kafka metrics and logs should start flowing into Sumo Logic.
 </TabItem>
 </Tabs>
```
```diff
-## Installing Kafka Alerts
-
-This section and below provide instructions for installing the Sumo Logic app and alerts for Kafka, along with descriptions of each of the app dashboards. These instructions assume you have already set up collection as described in [Collect Logs and Metrics for Kafka](#collecting-logs-and-metrics-for-kafka).
-
-#### Pre-Packaged Alerts
+## Installing the Kafka app
 
-Sumo Logic provides out-of-the-box alerts available through [Sumo Logic monitors](/docs/alerts/monitors) to help you quickly determine if the Kafka cluster is available and performing as expected. These alerts are built based on metrics datasets and have preset thresholds based on industry best practices and recommendations. See [Kafka Alerts](#kafka-alerts) for more details.
+import AppInstall2 from '../../reuse/apps/app-install-sc-k8s.md';
 
-* To install these alerts, you need to have the Manage Monitors role capability.
-* Alerts can be installed by either importing a JSON file or using a Terraform script.
-* There are limits to how many alerts can be enabled. See the [Alerts FAQ](/docs/alerts/monitors/monitor-faq) for details.
+<AppInstall2/>
 
+As part of the app installation process, the following fields will be created by default:
+* `component`
+* `environment`
+* `messaging_system`
+* `messaging_cluster`
+* `pod`
```
```diff
-### Method A: Importing a JSON file
-
-1. Download a [JSON file](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/kubernetes/kubernetes.json) that describes the monitors.
-1. The [JSON](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/Kafka/Kafka_Alerts.json) contains the alerts that are based on Sumo Logic searches that do not have any scope filters and therefore will be applicable to all Kafka clusters, the data for which has been collected via the instructions in the previous sections. However, if you would like to restrict these alerts to specific clusters or environments, update the JSON file by replacing the text `messaging_system=kafka` with `<Your Custom Filter>`. Custom filter examples:
-   * For alerts applicable only to a specific cluster, your custom filter would be: `messaging_cluster=Kafka-prod.01`
-   * For alerts applicable to all clusters that start with Kafka-prod, your custom filter would be: `messaging_cluster=Kafka-prod*`
-   * For alerts applicable to a specific cluster within a production environment, your custom filter would be: `messaging_cluster=Kafka-1` and `environment=prod`. (This assumes you have set the optional environment tag while configuring collection.)
-2. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Monitoring > Monitors**. <br/>[**New UI**](/docs/get-started/sumo-logic-ui). In the main Sumo Logic menu, select **Alerts > Monitors**. You can also click the **Go To...** menu at the top of the screen and select **Monitors**.
-3. Click **Add**.
-4. Click **Import** to import monitors from the JSON above.
-
-The monitors are disabled by default. Once you have installed the alerts using this method, navigate to the Kafka folder under Monitors to configure them. See [this](/docs/alerts/monitors) document to enable monitors. To send notifications to teams or connections, see the instructions detailed in Step 4 of this [document](/docs/alerts/monitors/create-monitor).
-
-### Method B: Using a Terraform script
-
-1. Generate an access key and access ID for a user that has the Manage Monitors role capability in Sumo Logic using the instructions in [Access Keys](/docs/manage/security/access-keys). Identify which deployment your Sumo Logic account is in using [this link](/docs/api/getting-started#sumo-logic-endpoints-by-deployment-and-firewall-security).
-2. [Download and install Terraform 0.13](https://www.terraform.io/downloads.html) or later.
-3. Download the Sumo Logic Terraform package for Kafka alerts. The alerts package is available in the Sumo Logic [GitHub repository](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/tree/main/monitor_packages/Kafka). You can either download it through the "git clone" command or as a zip file.
-4. Alert configuration. After the package has been extracted, navigate to the package directory `terraform-sumologic-sumo-logic-monitor/monitor_packages/Kafka`.
-   1. Edit the `monitor.auto.tfvars` file and add the Sumo Logic Access Key, Access ID, and Deployment from Step 1.
-      ```bash
-      access_id   = "<SUMOLOGIC ACCESS ID>"
-      access_key  = "<SUMOLOGIC ACCESS KEY>"
-      environment = "<SUMOLOGIC DEPLOYMENT>"
-      ```
-   2. The Terraform script installs the alerts without any scope filters; if you would like to restrict the alerts to specific clusters or environments, update the variable `kafka_data_source`. Custom filter examples:
-      * For alerts applicable only to a specific cluster, your custom filter would be: `messaging_cluster=Kafka-prod.01`
-      * For alerts applicable to all clusters that start with Kafka-prod, your custom filter would be: `messaging_cluster=Kafka-prod*`
-      * For alerts applicable to a specific cluster within a production environment, your custom filter would be: `messaging_cluster=Kafka-1` and `environment=prod`. This assumes you have set the optional environment tag while configuring collection.
-
-   All monitors are disabled by default on installation; if you would like to enable all the monitors, set the parameter `monitors_disabled` to `false` in this file.
-
-   By default, the monitors are configured in a monitor folder called "Kafka"; if you would like to change the name of the folder, update the monitor folder name in this file.
-
-5. To send email or connection notifications, modify the `notifications.auto.tfvars` file and fill in the `connection_notifications` and `email_notifications` sections. See the examples for PagerDuty and email notifications below. See [this document](/docs/alerts/webhook-connections/set-up-webhook-connections) for creating payloads with other connection types.
-   Replace `<CONNECTION_ID>` with the connection ID of the webhook connection. The webhook connection ID can be retrieved by calling the [Monitors API](https://api.sumologic.com/docs/#operation/listConnections).
-   1. Navigate to the package directory `terraform-sumologic-sumo-logic-monitor/monitor_packages/Kafka/` and run `terraform init`. This will initialize Terraform and will download the required components.
-   2. Run `terraform plan` to view the monitors which will be created/modified by Terraform.
-   3. Run `terraform apply`.
-7. **Post Installation.** If you haven't enabled alerts and/or configured notifications through the Terraform procedure outlined above, we highly recommend enabling alerts of interest and configuring each enabled alert to send notifications to other people or services. This is detailed in Step 4 of [this document](/docs/alerts/monitors/create-monitor).
-
-## Installing the Kafka App
-
-This section demonstrates how to install the Kafka App.
-
-Locate and install the app you need from the **App Catalog**. If you want to see a preview of the dashboards included with the app before installing, click **Preview Dashboards**.
-
-1. From the **App Catalog**, search for and select the app.
-2. Select the version of the service you're using and click **Add to Library**.
-   :::note
-   Version selection is not available for all apps.
-   :::
-3. To install the app, complete the following fields.
-   * **App Name.** You can retain the existing name, or enter a name of your choice for the app.
-   * **Data Source.** Choose **Enter a Custom Data Filter**, and enter a custom Kafka cluster filter. Examples:
-     * For all Kafka clusters: `messaging_cluster=*`
-     * For a specific cluster: `messaging_cluster=Kafka.dev.01`
-     * Clusters within a specific environment: `messaging_cluster=Kafka-1 and environment=prod`. This assumes you have set the optional environment tag while configuring collection.
-4. **Advanced**. Select the **Location in Library** (the default is the Personal folder in the library), or click **New Folder** to add a new folder.
-5. Click **Add to Library**.
-
-When an app is installed, it will appear in your **Personal** folder, or another folder that you specified. From here, you can share it with your organization.
-
-Panels will start to fill automatically. It's important to note that each panel slowly fills with data matching the time range query and received since the panel was created. Results won't immediately be available, but with a bit of time, you'll see full graphs and maps.
```
```diff
+If you're using Kafka in a Kubernetes environment, the following additional fields will be automatically created as a part of the app installation process:
+* `pod_labels_component`
+* `pod_labels_environment`
+* `pod_labels_messaging_system`
+* `pod_labels_messaging_cluster`
```
```diff
 ## Viewing the Kafka Dashboards
 
-### Filters with Template Variables
+import ViewDashboards from '../../reuse/apps/view-dashboards.md';
 
-Template variables provide dynamic dashboards that rescope data on the fly. As you apply variables to troubleshoot through your dashboard, you can view dynamic changes to the data for a fast resolution to the root cause. For more information, see the Filter with template variables help page.
```