diff --git a/docs/integrations/app-development/bitbucket.md b/docs/integrations/app-development/bitbucket.md index 277a64aefd..7641bdca57 100644 --- a/docs/integrations/app-development/bitbucket.md +++ b/docs/integrations/app-development/bitbucket.md @@ -144,10 +144,8 @@ For reference: This is how the [bitbucket-pipelines.yml](https://bitbucket.org/a ### Step 4: Enable Bitbucket Event-Key tagging at Sumo Logic -Sumo Logic needs to understand the event type for incoming events (for example, repo:push events). To enable this, the [X-Event-Key](https://confluence.atlassian.com/bitbucket/event-payloads-740262817.html#EventPayloads-HTTPheaders) event type needs to be enabled. To enable this, perform the following steps in the Sumo Logic console: +Sumo Logic needs to understand the event type for incoming events (for example, repo:push events). To enable this, the [X-Event-Key](https://confluence.atlassian.com/bitbucket/event-payloads-740262817.html#EventPayloads-HTTPheaders) event type is automatically added to the [Fields](/docs/manage/fields) during installation of the app. -1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Logs > Fields**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the top menu select **Configuration**, and then under **Logs** select **Fields**. You can also click the **Go To...** menu at the top of the screen and select **Fields**. -2. Add Field ‎**X-Event-Key**‎.
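Once events are flowing, the header-derived `X-Event-Key` field can be used directly in log searches. A minimal sketch of such a check, assuming your webhook source category is `bitbucket` (an illustrative name, not one set by the app) and using Sumo Logic's quoted-field syntax for names containing hyphens:

```sql
// count pushes per hour; adjust _sourceCategory to your own collector setup
_sourceCategory=bitbucket %"x-event-key"="repo:push"
| timeslice 1h
| count by _timeslice
```

If this returns no results, confirm the webhook is firing and that the field appears as enabled on the **Fields** page.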
Bitbucket ## Installing the Bitbucket App diff --git a/docs/integrations/app-development/github.md b/docs/integrations/app-development/github.md index 3b6ce86e1c..d7df402cf4 100644 --- a/docs/integrations/app-development/github.md +++ b/docs/integrations/app-development/github.md @@ -158,10 +158,7 @@ To configure a GitHub Webhook: ### Enable GitHub Event tagging at Sumo Logic -Sumo Logic needs to understand the event type for incoming events. To enable this, the [x-github-event](https://docs.github.com/en/developers/webhooks-and-events/webhook-events-and-payloads) event type needs to be enabled. To enable this, perform the following steps in the Sumo Logic console: - -1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Logs > Fields**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the top menu select **Configuration**, and then under **Logs** select **Fields**. You can also click the **Go To...** menu at the top of the screen and select **Fields**. -2. Add Field ‎**x-github-event**‎.
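As with Bitbucket above, the header-derived field can be queried once data arrives. A rough sketch that breaks incoming GitHub events down by type, assuming a source category of `github` (illustrative, not set by the app):

```sql
// distribution of incoming GitHub webhook event types
_sourceCategory=github
| count by %"x-github-event"
| sort by _count
```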
Field_GitHub +Sumo Logic needs to understand the event type for incoming events. To enable this, the [x-github-event](https://docs.github.com/en/developers/webhooks-and-events/webhook-events-and-payloads) event type is automatically added to the [Fields](/docs/manage/fields) during installation of the app. ## Installing the GitHub App diff --git a/docs/integrations/app-development/gitlab.md b/docs/integrations/app-development/gitlab.md index 95ed6bbaa6..10d43c94b8 100644 --- a/docs/integrations/app-development/gitlab.md +++ b/docs/integrations/app-development/gitlab.md @@ -93,10 +93,8 @@ Refer to the [GitLab Webhooks documentation](https://docs.gitlab.com/ee/user/pro ### Step 3: Enable GitLab Event tagging at Sumo Logic -Sumo Logic needs to understand the event type for incoming events. To enable this, the [x-gitlab-event](https://docs.gitlab.com/ee/user/project/integrations/webhook_events.html#push-events) event type needs to be enabled. To enable this, perform the following steps in the Sumo Logic console: +Sumo Logic needs to understand the event type for incoming events. To enable this, the [x-gitlab-event](https://docs.gitlab.com/ee/user/project/integrations/webhook_events.html#push-events) event type is automatically added to the [Fields](/docs/manage/fields) during installation of the app. -1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Logs > Fields**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the top menu select **Configuration**, and then under **Logs** select **Fields**. You can also click the **Go To...** menu at the top of the screen and select **Fields**. -2. Add Field ‎**x-GitLab-event**‎. ## Installing the GitLab App diff --git a/docs/integrations/containers-orchestration/activemq.md b/docs/integrations/containers-orchestration/activemq.md index 9e81898cc8..e5944f948d 100644 --- a/docs/integrations/containers-orchestration/activemq.md +++ b/docs/integrations/containers-orchestration/activemq.md @@ -53,38 +53,24 @@ This App has been tested with following ActiveMQ versions: Configuring log and metric collection for the ActiveMQ App includes the following tasks: -### Step 1: Configure Fields in Sumo Logic +### Step 1: Fields in Sumo Logic -Create the following Fields in Sumo Logic prior to configuring collection. This ensures that your logs and metrics are tagged with relevant metadata, which is required by the app dashboards. For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields). +Following [fields](https://help.sumologic.com/docs/manage/fields/) will always be created automatically as a part of app installation process: - - - - -If you're using ActiveMQ in a Kubernetes environment, create the fields: * `pod_labels_component` * `pod_labels_environment` * `pod_labels_messaging_system` * `pod_labels_messaging_cluster` - - -If you're using ActiveMQ in a non-Kubernetes environment, create the fields: +If you're using ActiveMQ in a non-Kubernetes environment, these additional fields will get created automatically as a part of app installation process: * `component` * `environment` * `messaging_system` * `messaging_cluster` * `pod` - - +For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields). ### Step 2: Configure ActiveMQ Logs and Metrics Collection @@ -270,26 +256,7 @@ This section explains the steps to collect ActiveMQ logs from a Kubernetes envir ``` 5. Sumo Logic Kubernetes collection will automatically start collecting logs from the pods having the annotations defined above. -3. **Add an FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments automatically are prefixed with `pod_labels`. To normalize these for our app to work, we need to create a Field Extraction Rule if not already created for Messaging Application Components. To do so: - 1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Logs > Field Extraction Rules**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the top menu select **Configuration**, and then under **Logs** select **Field Extraction Rules**. You can also click the **Go To...** menu at the top of the screen and select **Field Extraction Rules**. - 2. Click the + Add button on the top right of the table. - 3. The **Add Field Extraction Rule** form will appear. Enter the following options: - * **Rule Name**. Enter the name as **App Observability - Messaging**. - * **Applied At.** Choose **Ingest Time** - * **Scope**. Select **Specific Data** - * **Scope**: Enter the following keyword search expression: - ```sql - pod_labels_environment=* pod_labels_component=messaging - pod_labels_messaging_system=* pod_labels_messaging_cluster=* - ``` - * **Parse Expression**. Enter the following parse expression: - ```sql - if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment - | pod_labels_component as component - | pod_labels_messaging_system as messaging_system - | pod_labels_messaging_cluster as messaging_cluster - ``` - +3. **FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments automatically are prefixed with `pod_labels`. To normalize these for our app to work, we will have a Field Extraction Rule automatically created with named as **AppObservabilityMessagingActiveMQFER** @@ -455,247 +422,28 @@ At this point, ActiveMQ logs should start flowing into Sumo Logic. -## Installing ActiveMQ Monitors - -This section and below contain instructions for installing Sumo Logic Monitors for ActiveMQ, the app, and descriptions of each of the app dashboards. These instructions assume you have already set up the collection as described in [Collect Logs and Metrics for the ActiveMQ](#collecting-logs-and-metrics-for-activemq). +## ActiveMQ Monitors -* To install these alerts, you need to have the Manage Monitors role capability. -* Alerts can be installed by either importing a JSON file or a Terraform script. +import CreateMonitors from '../../reuse/apps/create-monitors.md'; -Sumo Logic provides out-of-the-box alerts available through [Sumo Logic monitors](/docs/alerts/monitors) to help you monitor your ActiveMQ clusters. These alerts are built based on metrics and logs datasets and include preset thresholds based on industry best practices and recommendations. For details, see [ActiveMQ Alerts](#activemq-alerts). + + 10. There are limits to how many alerts can be enabled -:::note -There are limits to how many alerts can be enabled - please see the[ Alerts FAQ](/docs/alerts/monitors/monitor-faq) for details. +:::note permissions required +To install these monitors, you need to have the [Manage Monitors role capability](/docs/manage/users-roles/roles/role-capabilities/#alerting). ::: -### Method 1: Install the monitors by importing a JSON file: - -1. Download the[ JSON file](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/ActiveMQ/activemq.json) that describes the monitors. -2. The[ JSON](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/ActiveMQ/activemq.json) contains the alerts that are based on Sumo Logic searches that do not have any scope filters and therefore will be applicable to all ActiveMQ clusters, the data for which has been collected via the instructions in the previous sections. 
However, if you would like to restrict these alerts to specific clusters or environments, update the JSON file by replacing the text `messaging_system=activemq` with ``. Custom filter examples: - * For alerts applicable only to a specific cluster, your custom filter would be: `messaging_cluster=activemq-prod.01` - * For alerts applicable to all clusters that start with `activemq-prod`: `messaging_cluster=activemq-prod*` - * For alerts applicable to a specific cluster within a production environment: `messaging_cluster=activemq-1` and `environment=prod`. This assumes you have set the optional environment tag while configuring collection. -3. Go to Manage Data > Alerts > Monitors. -4. Click **Add**. -5. Click Import and then copy-paste the above JSON to import monitors. - -The monitors are disabled by default. Once you have installed the alerts using this method, navigate to the ActiveMQ folder under **Monitors** to configure them. See[ this](/docs/alerts/monitors) document to enable monitors to send notifications to teams or connections. Please see the instructions detailed in Step 4 of this [document](/docs/alerts/monitors/create-monitor). - - -### Method 2: Install the alerts using a Terraform script - -1. Generate an access key and access ID for a user that has the Manage Monitors role capability in Sumo Logic using instructions in [Access Keys](/docs/manage/security/access-keys). To find out which deployment your Sumo Logic account is in, see [Sumo Logic endpoints](/docs/api/getting-started#sumo-logic-endpoints-by-deployment-and-firewall-security). -2. [Download and install Terraform 0.13](https://www.terraform.io/downloads.html) or later. -3. Download the Sumo Logic Terraform package for ActiveMQ alerts: The alerts package is available in the Sumo Logic github[ repository](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/tree/main/monitor_packages/ActiveMQ). You can either download it through the “git clone” command or as a zip file. -4. Alert Configuration: After the package has been extracted, navigate to the package directory `terraform-sumologic-sumo-logic-monitor/monitor_packages/ActiveMQ/`. - 1. Edit the `activemq.auto.tfvars` file and add the Sumo Logic Access Key, Access Id, and Deployment from Step 1. - ```bash - access_id = "" - access_key = "" - environment = "" - ``` - The Terraform script installs the alerts without any scope filters, if you would like to restrict the alerts to specific clusters or environments, update the variable `'activemq_data_source'`. Custom filter examples: - * A specific cluster `'messaging_cluster=activemq.prod.01'` - * All clusters in an environment `'environment=prod'` - * For alerts applicable to all clusters that start with activemq-prod, your custom filter would be: `'messaging_cluster=activemq-prod*'` - * For alerts applicable to a specific cluster within a production environment, your custom filter would be:`activemq_cluster=activemq-1` and `environment=prod` (This assumes you have set the optional environment tag while configuring collection) - - All monitors are disabled by default on installation, if you would like to enable all the monitors, set the parameter monitors_disabled to false in this file. - - By default, the monitors are configured in a monitor **folder** called “**ActiveMQ**”, if you would like to change the name of the folder, update the monitor folder name in “folder” key at **activemq.auto.tfvars** file. - -5. 
If you would like the alerts to send email or connection notifications, modify the file **activemq_notifications.auto.tfvars** and populate `connection_notifications` and `email_notifications` as per below examples. -```bash title="Pagerduty Connection Example" -connection_notifications = [ - { - connection_type = "PagerDuty", - connection_id = "", - payload_override = "{\"service_key\": \"your_pagerduty_api_integration_key\",\"event_type\": \"trigger\",\"description\": \"Alert: Triggered {{TriggerType}} for Monitor {{Name}}\",\"client\": \"Sumo Logic\",\"client_url\": \"{{QueryUrl}}\"}", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - }, - { - connection_type = "Webhook", - connection_id = "", - payload_override = "", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - } - ] -``` - -Replace `` with the connection id of the webhook connection. The webhook connection id can be retrieved by calling the[ Monitors API](https://api.sumologic.com/docs/#operation/listConnections). - -For overriding payload for different connection types, see [Set Up Webhook Connections](/docs/alerts/webhook-connections/set-up-webhook-connections). - -```bash title="Email Notifications Example" -email_notifications = [ - { - connection_type = "Email", - recipients = ["abc@example.com"], - subject = "Monitor Alert: {{TriggerType}} on {{Name}}", - time_zone = "PST", - message_body = "Triggered {{TriggerType}} Alert on {{Name}}: {{QueryURL}}", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - } - ] -``` - -6. Install the Alerts: - 1. Navigate to the package directory `terraform-sumologic-sumo-logic-monitor/monitor_packages/ActiveMQ/` and run `terraform init`. This will initialize Terraform and will download the required components. - 2. Run `terraform plan` to view the monitors which will be created/modified by Terraform. - 3. Run `terraform apply`. -7. Post Installation: If you haven’t enabled alerts and/or configured notifications through the Terraform procedure outlined above, we highly recommend enabling alerts of interest and configuring each enabled alert to send notifications to other users or services. This is detailed in Step 4 of [this document](/docs/alerts/monitors/create-monitor). - -There are limits to how many alerts can be enabled. See the [Alerts FAQ](/docs/alerts/monitors/monitor-faq). - - -## Installing the ActiveMQ App - -Locate and install the app you need from the **App Catalog**. If you want to see a preview of the dashboards included with the app before installing, click **Preview Dashboards**. - -1. From the **App Catalog**, search for and select the app. -2. Select the version of the service you're using and click **Add to Library**. -3. To install the app, complete the following fields. - 1. **App Name.** You can retain the existing name, or enter a name of your choice for the app. - 2. **Data Source.** Choose **Enter a Custom Data Filter** and enter a custom ActiveMQ cluster filter. Examples: - * For all ActiveMQ clusters: `messaging_cluster=*` - * For a specific cluster: `messaging_cluster=activemq.dev.01`. - * Clusters within a specific environment: `messaging_cluster=activemq-1` and `environment=prod` (This assumes you have set the optional environment tag while configuring collection). -4. **Advanced**. Select the **Location in Library** (the default is the Personal folder in the library), or click **New Folder** to add a new folder. -5. Click **Add to Library**. 
- -Once an app is installed, it will appear in your **Personal** folder, or another folder that you specified. From here, you can share it with your organization. - -Panels will start to fill automatically. It's important to note that each panel slowly fills with data matching the time range query and received since the panel was created. Results won't immediately be available, but with a bit of time, you'll see full graphs and maps. - - -## ActiveMQ Alerts - -Sumo Logic has provided out-of-the-box alerts available via[ Sumo Logic monitors](/docs/alerts/monitors) to help you quickly determine if the ActiveMQ database cluster is available and performing as expected. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| Alert Type (Metrics/Logs) | Alert Name | Alert Description | Trigger Type (Critical / Warning) | Alert Condition | Recover Condition |
|:--|:--|:--|:--|:--|:--|
| Metrics | ActiveMQ - High CPU Usage | This alert fires when CPU usage on a node in an ActiveMQ cluster is high. | Critical | >= 80 | < 80 |
| Metrics | ActiveMQ - High Host Disk Usage | This alert fires when there is high disk usage on a node in an ActiveMQ cluster. | Critical | >= 80 | < 80 |
| Metrics | ActiveMQ - High Memory Usage | This alert fires when memory usage on a node in an ActiveMQ cluster is high. | Critical | >= 80 | < 80 |
| Metrics | ActiveMQ - High Number of File Descriptors in use | This alert fires when the percentage of file descriptors used by a node in an ActiveMQ cluster is high. | Critical | >= 80 | < 80 |
| Metrics | ActiveMQ - High Storage Used | This alert fires when storage usage on a node in an ActiveMQ cluster is high. | Critical | >= 80 | < 80 |
| Metrics | ActiveMQ - High Temp Usage | This alert fires when there is high temp usage on a node in an ActiveMQ cluster. | Critical | >= 80 | < 80 |
| Logs | ActiveMQ - Maximum Connection | This alert fires when a node in an ActiveMQ cluster exceeds the maximum allowed client connection limit. | Critical | >= 1 | < 1 |
| Metrics | ActiveMQ - No Consumers on Queues | This alert fires when an ActiveMQ queue has no consumers. | Critical | < 1 | >= 1 |
| Metrics | ActiveMQ - No Consumers on Topics | This alert fires when an ActiveMQ topic has no consumers. | Critical | < 1 | >= 1 |
| Logs | ActiveMQ - Node Down | This alert fires when a node in the ActiveMQ cluster is down. | Critical | >= 1 | < 1 |
| Metrics | ActiveMQ - Too Many Connections | This alert fires when there are too many connections to a node in an ActiveMQ cluster. | Critical | >= 1000 | < 1000 |
| Metrics | ActiveMQ - Too Many Expired Messages on Queues | This alert fires when there are too many expired messages on a queue in an ActiveMQ cluster. | Critical | >= 1000 | < 1000 |
| Metrics | ActiveMQ - Too Many Expired Messages on Topics | This alert fires when there are too many expired messages on a topic in an ActiveMQ cluster. | Critical | >= 1000 | < 1000 |
| Metrics | ActiveMQ - Too Many Unacknowledged Messages | This alert fires when there are too many unacknowledged messages on a node in an ActiveMQ cluster. | Critical | >= 1000 | < 1000 |
- - +### ActiveMQ alerts + +| Alert Name | Alert Description and conditions | Alert Condition | Recover Condition | +|:--|:--|:--|:--| +| `ActiveMQ - High CPU Usage Alert` | This alert gets triggered when CPU usage on a node in a ActiveMQ cluster is high. | Count >= 80 | Count < 80 | +| `ActiveMQ - High Memory Usage Alert` | This alert gets triggered when memory usage on a node in a ActiveMQ cluster is high. | Count >= 80 | Count < 80 | +| `ActiveMQ - High Storage Used Alert` | This alert gets triggered when there is high store usage on a node in a ActiveMQ cluster. | Count >= 80 | Count < 80 | +| `ActiveMQ - Maximum Connection Alert` | This alert gets triggered when one node in ActiveMQ cluster exceeds the maximum allowed client connection limit. | Count >= 1 | Count < 1 | +| `ActiveMQ - No Consumers on Queues Alert` | This alert gets triggered when a ActiveMQ queue has no consumers. | Count < 1 | Count >= 1 | +| `ActiveMQ - Node Down Alert` | This alert gets triggered when a node in the ActiveMQ cluster is down. | Count >= 1 | Count < 1 | +| `ActiveMQ - Too Many Connections Alert` | This alert gets triggered when there are too many connections to a node in a ActiveMQ cluster. | Count >= 1000 | Count < 1000 | ## Viewing the ActiveMQ Dashboards diff --git a/docs/integrations/containers-orchestration/kafka.md b/docs/integrations/containers-orchestration/kafka.md index 49aa1b000c..c0282def67 100644 --- a/docs/integrations/containers-orchestration/kafka.md +++ b/docs/integrations/containers-orchestration/kafka.md @@ -67,38 +67,24 @@ messaging_cluster=* messaging_system="kafka" \ This section provides instructions for configuring log and metric collection for the Sumo Logic App for Kafka. -### Configure Fields in Sumo Logic +### Fields in Sumo Logic -Create the following Fields in Sumo Logic prior to configuring collection. This ensures that your logs and metrics are tagged with relevant metadata, which is required by the app dashboards. For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields). +Following [fields](https://help.sumologic.com/docs/manage/fields/) will always be created automatically as a part of app installation process: - - - - -If you're using Kafka in a Kubernetes environment, create the fields: * `pod_labels_component` * `pod_labels_environment` * `pod_labels_messaging_system` * `pod_labels_messaging_cluster` - - -If you're using Kafka in a non-Kubernetes environment, create the fields: +If you're using Kafka in a non-Kubernetes environment, these additional fields will get created automatically as a part of app installation process: * `component` * `environment` * `messaging_system` * `messaging_cluster` * `pod` - - +For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields). ### Configure Collection for Kafka @@ -230,30 +216,7 @@ This section explains the steps to collect Kafka logs from a Kubernetes environm kubectl describe pod ``` 5. Sumo Logic Kubernetes collection will automatically start collecting logs from the pods having the annotations defined above. -3. **Add an FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments automatically are prefixed with `pod_labels`. To normalize these for our app to work, we need to create a Field Extraction Rule if not already created for Messaging Application Components. To do so: - 1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Logs > Field Extraction Rules**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the top menu select **Configuration**, and then under **Logs** select **Field Extraction Rules**. You can also click the **Go To...** menu at the top of the screen and select **Field Extraction Rules**. - 2. Click the **+ Add** button on the top right of the table. - 3. The **Add Field Extraction Rule** form will appear. Enter the following options: - * **Rule Name**. Enter the name as **App Component Observability - Messaging.** - * **Applied At**. Choose Ingest Time - * **Scope**. Select Specific Data - * Scope: Enter the following keyword search expression: - ```sql - pod_labels_environment=* pod_labels_component=messaging - pod_labels_messaging_system=kafka pod_labels_messaging_cluster=* - ``` - * **Parse Expression**. Enter the following parse expression: - ```sql - if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment - | pod_labels_component as component - | pod_labels_messaging_system as messaging_system - | pod_labels_messaging_cluster as messaging_cluster - ``` - 4. Click **Save** to create the rule. - 5. Verify logs are flowing into Sumo Logic by running the following logs query: - ```sql - component="messaging" and messaging_system="kafka" - ``` +3. **FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments automatically are prefixed with `pod_labels`. To normalize these for our app to work, we will have a Field Extraction Rule automatically created with named as **AppObservabilityMessagingKafkaFER**
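For reference, the manual steps removed above show what this rule does. The auto-created **AppObservabilityMessagingKafkaFER** is expected to carry an equivalent definition: its scope is the keyword expression

```sql
pod_labels_environment=* pod_labels_component=messaging
pod_labels_messaging_system=kafka pod_labels_messaging_cluster=*
```

and its parse expression maps the `pod_labels_*` fields to the names the app dashboards use:

```sql
if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment
| pod_labels_component as component
| pod_labels_messaging_system as messaging_system
| pod_labels_messaging_cluster as messaging_cluster
```

A quick way to verify the normalization is working is to run the query `component="messaging" and messaging_system="kafka"` and confirm that results are returned.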
@@ -390,93 +353,6 @@ At this point, Kafka metrics and logs should start flowing into Sumo Logic. -## Installing Kafka Alerts - -This section and below provide instructions for installing the Sumo App and Alerts for Kafka and descriptions of each of the app dashboards. These instructions assume you have already set up the collection as described in [Collect Logs and Metrics for Kafka](#collecting-logs-and-metrics-for-kafka). - -#### Pre-Packaged Alerts - -Sumo Logic has provided out-of-the-box alerts available through [Sumo Logic monitors](/docs/alerts/monitors) to help you quickly determine if the Kafka cluster is available and performing as expected. These alerts are built based on metrics datasets and have preset thresholds based on industry best practices and recommendations. See [Kafka Alerts](#kafka-alerts) for more details. - -* To install these alerts, you need to have the Manage Monitors role capability. -* Alerts can be installed by either importing a JSON or a Terraform script. -* There are limits to how many alerts can be enabled - see the [Alerts FAQ](/docs/alerts/monitors/monitor-faq) for details. - - -### Method A: Importing a JSON file - -1. Download a[ JSON file](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/kubernetes/kubernetes.json) that describes the monitors. - 1. The [JSON](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/Kafka/Kafka_Alerts.json) contains the alerts that are based on Sumo Logic searches that do not have any scope filters and therefore will be applicable to all Kafka clusters, the data for which has been collected via the instructions in the previous sections. However, if you would like to restrict these alerts to specific clusters or environments, update the JSON file by replacing the text `'messaging_system=kafka `with `'`. Custom filter examples: - * For alerts applicable only to a specific cluster, your custom filter would be: `messaging_cluster=Kafka-prod.01` - * For alerts applicable to all clusters that start with Kafka-prod, your custom filter would be: `messaging_cluster=Kafka-prod*` - * For alerts applicable to a specific cluster within a production environment, your custom filter would be: `messaging_cluster=Kafka-1` and `environment=prod` (This assumes you have set the optional environment tag while configuring collection) - 2. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Monitoring > Monitors**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the main Sumo Logic menu, select **Alerts > Monitors**. You can also click the **Go To...** menu at the top of the screen and select **Monitors**. - 3. Click **Add** - 4. Click Import to import monitors from the JSON above. - -The monitors are disabled by default. Once you have installed the alerts using this method, navigate to the Kafka folder under Monitors to configure them. See [this](/docs/alerts/monitors) document to enable monitors. To send notifications to teams or connections, see the instructions detailed in Step 4 of this [document](/docs/alerts/monitors/create-monitor). - -### Method B: Using a Terraform script - -1. Generate an access key and access ID for a user that has the Manage Monitors role capability in Sumo Logic using instructions in [Access Keys](/docs/manage/security/access-keys). Identify which deployment your Sumo Logic account is in using [this link](/docs/api/getting-started#sumo-logic-endpoints-by-deployment-and-firewall-security). -2. [Download and install Terraform 0.13](https://www.terraform.io/downloads.html) or later. -3. Download the Sumo Logic Terraform package for Kafka alerts. The alerts package is available in the Sumo Logic [GitHub repository](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/tree/main/monitor_packages/Kafka). You can either download it through the “git clone” command or as a zip file. -4. Alert Configuration. After the package has been extracted, navigate to the package directory `terraform-sumologic-sumo-logic-monitor/monitor_packages/Kafka`. - 1. Edit the `monitor.auto.tfvars` file and add the Sumo Logic Access Key, Access Id and Deployment from Step 1. - ```bash - access_id = "" - access_key = "" - environment = "" - ``` - 2. The Terraform script installs the alerts without any scope filters, if you would like to restrict the alerts to specific clusters or environments, update the variable `’kafka_data_source’`. Custom filter examples: - * For alerts applicable only to a specific cluster, your custom filter would be: `messaging_cluster=Kafka-prod.01` - * For alerts applicable to all clusters that start with Kafka-prod, your custom filter would be: `messaging_cluster=Kafka-prod*` - * For alerts applicable to a specific cluster within a production environment, your custom filter would be: `messaging_cluster=Kafka-1` and `environment=prod`. This assumes you have set the optional environment tag while configuring collection. - -All monitors are disabled by default on installation, if you would like to enable all the monitors, set the parameter `monitors_disabled` to `false` in this file. - -By default, the monitors are configured in a monitor folder called “Kafka”, if you would like to change the name of the folder, update the monitor folder name in this file. - -5. To send email or connection notifications, modify the file `notifications.auto.tfvars` file and fill in the `connection_notifications` and `email_notifications` sections. See the examples for PagerDuty and email notifications below. See [this document](/docs/alerts/webhook-connections/set-up-webhook-connections) for creating payloads with other connection types. 
- -```bash title="Pagerduty Connection Example" -connection_notifications = [ - { - connection_type = "PagerDuty", - connection_id = "", - payload_override = "{\"service_key\": \"your_pagerduty_api_integration_key\",\"event_type\": \"trigger\",\"description\": \"Alert: Triggered {{TriggerType}} for Monitor {{Name}}\",\"client\": \"Sumo Logic\",\"client_url\": \"{{QueryUrl}}\"}", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - }, - { - connection_type = "Webhook", - connection_id = "", - payload_override = "", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - } - ] -``` - -Replace `` with the connection id of the webhook connection. The webhook connection id can be retrieved by calling the[ Monitors API](https://api.sumologic.com/docs/#operation/listConnections). - -```bash title="Email Notifications Example" -email_notifications = [ - { - connection_type = "Email", - recipients = ["abc@example.com"], - subject = "Monitor Alert: {{TriggerType}} on {{Name}}", - time_zone = "PST", - message_body = "Triggered {{TriggerType}} Alert on {{Name}}: {{QueryURL}}", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - } - ] -``` - -6. Install the Alerts - 1. Navigate to the package directory `terraform-sumologic-sumo-logic-monitor/monitor_packages/Kafka/` and run terraform init. This will initialize Terraform and will download the required components. - 2. Run `terraform plan` to view the monitors which will be created/modified by Terraform. - 3. Run `terraform apply`. -7. **Post Installation.** If you haven’t enabled alerts and/or configured notifications through the Terraform procedure outlined above, we highly recommend enabling alerts of interest and configuring each enabled alert to send notifications to other people or services. This is detailed in Step 4 of[ this document](/docs/alerts/monitors/create-monitor). - ## Installing the Kafka App @@ -726,6 +602,16 @@ Use this dashboard to: ## Kafka Alerts +#### Pre-Packaged Alerts + +Sumo Logic has provided out-of-the-box alerts available through [Sumo Logic monitors](/docs/alerts/monitors) to help you quickly determine if the Kafka cluster is available and performing as expected. These alerts are built based on metrics datasets and have preset thresholds based on industry best practices and recommendations. + + +* There are limits to how many alerts can be enabled - see the [Alerts FAQ](/docs/alerts/monitors/monitor-faq) for details. +:::note permissions required +To install these alerts, you need to have the [Manage Monitors role capability](/docs/manage/users-roles/roles/role-capabilities/#alerting). +::: + | Alert Name | Alert Description and conditions | Alert Condition | Recover Condition | |:---------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------|:-------------------| | Kafka - High Broker Disk Utilization | This alert fires when we detect that a disk on a broker node is more than 85% full. 
| `>=`85 | < 85 | diff --git a/docs/integrations/containers-orchestration/rabbitmq.md b/docs/integrations/containers-orchestration/rabbitmq.md index a53acbcc89..eda1a01122 100644 --- a/docs/integrations/containers-orchestration/rabbitmq.md +++ b/docs/integrations/containers-orchestration/rabbitmq.md @@ -51,38 +51,26 @@ Host: broker-1 Name: /var/log/rabbitmq/rabbit.log Category: logfile This section provides instructions for configuring log and metric collection for the Sumo Logic App for RabbitMQ. -### Step 1: Configure Fields in Sumo Logic +### Step 1: Fields in Sumo Logic -Create the following Fields in Sumo Logic prior to configuring collection. This ensures that your logs and metrics are tagged with relevant metadata, which is required by the app dashboards. For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields). +Following [fields](https://help.sumologic.com/docs/manage/fields/) will always be created automatically as a part of app installation process: - +* `pod_labels_component` +* `pod_labels_environment` +* `pod_labels_messaging_system` +* `pod_labels_messaging_cluster` - -If you're using RabbitMQ in a Kubernetes environment, create the fields: -* pod_labels_component -* pod_labels_environment -* pod_labels_messaging_system -* pod_labels_messaging_cluster +If you're using RabbitMQ in a non-Kubernetes environment, these additional fields will get created automatically as a part of app installation process: +* `component` +* `environment` +* `messaging_system` +* `messaging_cluster` +* `pod` - - +For information on setting up fields, see [Sumo Logic Fields](/docs/manage/fields). -If you're using RabbitMQ in a non-Kubernetes environment, create the fields: -* component -* environment -* messaging_system -* messaging_cluster -* pod - - ### Step 2: Configure Collection for RabbitMQ @@ -211,26 +199,7 @@ For all other parameters see [this doc](/docs/send-data/collect-from-other-data- kubectl describe pod ``` 5. Sumo Logic Kubernetes collection will automatically start collecting logs from the pods having the annotations defined above. -3. **Add an FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments automatically are prefixed with `pod_labels`. To normalize these for our app to work, we need to create a Field Extraction Rule if not already created for Messaging Application Components. To do so: - 1. Go to **Manage Data > Logs > Field Extraction Rules**. - 2. Click the + Add button on the top right of the table. - 3. The **Add Field Extraction Rule** form will appear: - 4. Enter the following options: - * **Rule Name**. Enter the name as **App Observability - Messaging**. - * **Applied At.** Choose **Ingest Time** - * **Scope**. Select **Specific Data** - * **Scope**: Enter the following keyword search expression: - ```sql - pod_labels_environment=* pod_labels_component=messaging pod_labels_messaging_system=* pod_labels_messaging_cluster=* - ``` - * **Parse Expression**.Enter the following parse expression: - ```sql - | if (!isEmpty(pod_labels_environment), pod_labels_environment, "") as environment - | pod_labels_component as component - | pod_labels_messaging_system as messaging_system - | pod_labels_messaging_cluster as messaging_cluster - ``` - 5. Click **Save** to create the rule. +3. **FER to normalize the fields in Kubernetes environments**. Labels created in Kubernetes environments automatically are prefixed with `pod_labels`. 
To normalize these for our app to work, we will have a Field Extraction Rule automatically created with named as **AppObservabilityMessagingRabbitMQFER** @@ -361,98 +330,15 @@ At this point, RabbitMQ logs should start flowing into Sumo Logic. -## Installing Monitors - -These instructions assume you have already set up collection as described in the [Collect Logs and Metrics for RabbitMQ](#collecting-logs-and-metrics-for-rabbitmq). - -Sumo Logic has provided pre-packaged alerts available through [Sumo Logic monitors](/docs/alerts/monitors) to help you proactively determine if a RabbitMQ cluster is available and performing as expected. These monitors are based on metric and log data and include pre-set thresholds that reflect industry best practices and recommendations. For more information about individual alerts, see [RabbitMQ Alerts](#rabbitmq-alerts). - -To install these monitors, you must have the **Manage Monitors** role capability. - -You can install monitors by importing a JSON file or using a Terraform script. - -There are limits to how many alerts can be enabled. For more information, see [Monitors](/docs/alerts/monitors/create-monitor) for details. - - -#### Method A: Install Monitors by importing a JSON file - -1. Download the [JSON file](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/RabbitMQ/rabbitmq.json) that describes the monitors. -2. The [JSON](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/blob/main/monitor_packages/RabbitMQ/rabbitmq.json) contains the alerts that are based on Sumo Logic searches that do not have any scope filters and therefore will be applicable to all RabbitMQ clusters, the data for which has been collected via the instructions in the previous sections. However, if you would like to restrict these alerts to specific clusters or environments, update the JSON file by replacing the text `messaging_cluster=*` with ``. Custom filter examples: - * For alerts applicable only to a specific cluster, your custom filter would be: `messaging_cluster=dev-rabbitmq01` - * For alerts applicable to all clusters that start with RabbitMQ-prod, your custom filter would be: `messaging_cluster=RabbitMQ-prod*` - * For alerts applicable to a specific cluster within a production environment, your custom filter would be: `messaging_cluster=dev-rabbitmq01 AND environment=prod` (This assumes you have set the optional environment tag while configuring collection) -3. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Monitoring > Monitors**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the main Sumo Logic menu, select **Alerts > Monitors**. You can also click the **Go To...** menu at the top of the screen and select **Monitors**. -4. Click **Add**. -5. Click **Import**. -6. On the **Import Content popup**, enter **RabbitMQ** in the Name field, paste in the JSON into the the popup, and click **Import**. -7. The monitors are created in a "RabbitMQ" folder. The monitors are disabled by default. See the [Monitors](/docs/alerts/monitors) topic for information about enabling monitors and configuring notifications or connections. - -#### Method B: Install Monitors using a Terraform script - -1. Generate an access key and access ID for a user that has the **Manage Monitors** role capability. For instructions see [Access Keys](/docs/manage/security/access-keys). -2. Download [Terraform 0.13](https://www.terraform.io/downloads.html) or later, and install it. -3. Download the Sumo Logic Terraform package for MySQL monitors: The alerts package is available in the Sumo Logic GitHub [repository](https://github.com/SumoLogic/terraform-sumologic-sumo-logic-monitor/tree/main/monitor_packages/mysql). You can either download it using the git clone command or as a zip file. -4. Alert Configuration: After extracting the package, navigate to the terraform-sumologic-sumo-logic-monitor/monitor_packages/RabbitMQ/ directory. - -Edit the rabbitmq.auto.tfvars file and add the Sumo Logic Access Key and Access ID from Step 1 and your Sumo Logic deployment. If you're not sure of your deployment, see [Sumo Logic Endpoints and Firewall Security](/docs/api/getting-started#sumo-logic-endpoints-by-deployment-and-firewall-security). -```bash -access_id = "" -access_key = "" -environment = "" -``` - -The Terraform script installs the alerts without any scope filters, if you would like to restrict the alerts to specific clusters or environments, update the `rabbitmq_data_source` variable. For example: -* To configure alerts for A specific cluster, set `rabbitmq_data_source` to something like: messaging_cluster=rabbitmq.prod.01 -* To configure alerts for All clusters in an environment, set `rabbitmq_data_source` to something like: environment=prod -* To configure alerts for Multiple clusters using a wildcard, set `rabbitmq_data_source` to something like: `messaging_cluster=rabbitmq-prod*` -* To configure alerts for A specific cluster within a specific environment, set `rabbitmq_data_source` to something like: `messaging_cluster=rabbitmq-1 and environment=prod`. This assumes you have configured and applied Fields as described in Step 1: Configure Fields of the Sumo Logic of the Collect Logs and Metrics for RabbitMQ. - -All monitors are disabled by default on installation. To enable all of the monitors, set the monitors_disabled parameter to false. - -By default, the monitors will be located in a "RabbitMQ" folder on the **Monitors** page. To change the name of the folder, update the monitor folder name in the folder variable in the rabbitmq.auto.tfvars file. - -5. If you want the alerts to send email or connection notifications, edit the `rabbitmq_notifications.auto.tfvars` file to populate the `connection_notifications` and `email_notifications` sections. Examples are provided below. - -In the variable definition below, replace `` with the connection ID of the Webhook connection. You can obtain the Webhook connection ID by calling the [Monitors API](https://api.sumologic.com/docs/#operation/listConnections). 
- -```bash title="Pagerduty connection example" -connection_notifications = [ - { - connection_type = "PagerDuty", - connection_id = "", - payload_override = "{\"service_key\": \"your_pagerduty_api_integration_key\",\"event_type\": \"trigger\",\"description\": \"Alert: Triggered {{TriggerType}} for Monitor {{Name}}\",\"client\": \"Sumo Logic\",\"client_url\": \"{{QueryUrl}}\"}", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - }, - { - connection_type = "Webhook", - connection_id = "", - payload_override = "", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - } - ] -``` - -For information about overriding the payload for different connection types, see [Set Up Webhook Connections](/docs/alerts/webhook-connections/set-up-webhook-connections). - -```bash title="Email notifications example" -email_notifications = [ - { - connection_type = "Email", - recipients = ["abc@example.com"], - subject = "Monitor Alert: {{TriggerType}} on {{Name}}", - time_zone = "PST", - message_body = "Triggered {{TriggerType}} Alert on {{Name}}: {{QueryURL}}", - run_for_trigger_types = ["Critical", "ResolvedCritical"] - } - ] -``` - -6. Install Monitors: - 1. Navigate to the `terraform-sumologic-sumo-logic-monitor/monitor_packages/rabbitmq/` directory and run terraform init. This will initialize Terraform and download the required components. - 2. Run `terraform plan` to view the monitors that Terraform will create or modify. - 3. Run `terraform apply`. +## RabbitMQ Monitors +import CreateMonitors from '../../reuse/apps/create-monitors.md'; + + 10. There are limits to how many alerts can be enabled +:::note permissions required +To install these monitors, you need to have the [Manage Monitors role capability](/docs/manage/users-roles/roles/role-capabilities/#alerting). +::: ## Installing the RabbitMQ App This section demonstrates how to install the RabbitMQ App.
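The install flow for the other messaging apps above asks for a custom data filter; if the RabbitMQ app prompts for one as well, a cluster-scoped filter follows the same pattern (the cluster and environment names below are illustrative):

```sql
// restrict the app dashboards to one production cluster
messaging_system="rabbitmq" messaging_cluster=rabbitmq.prod.01 environment=prod
```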