diff --git a/cid-redirects.json b/cid-redirects.json
index 5316204735..65836e7b65 100644
--- a/cid-redirects.json
+++ b/cid-redirects.json
@@ -2892,7 +2892,8 @@
"/Cloud_SIEM_Enterprise/CSE_Schema/Field_Mapping_for_Security_Event_Sources": "/docs/cse/schema/field-mapping-security-event-sources",
"/Cloud_SIEM_Enterprise/CSE_Schema/Parser_Editor": "/docs/cse/schema/parser-editor",
"/docs/send-data/parse-data/parser-editor": "/docs/cse/schema/parser-editor",
- "/Cloud_SIEM_Enterprise/CSE_Schema/Parser_Editor/Parser_Troubleshooting_Tips": "/docs/cse/schema/parser-troubleshooting-tips",
+ "/Cloud_SIEM_Enterprise/CSE_Schema/Parser_Editor/Parser_Troubleshooting_Tips": "/docs/cse/troubleshoot/troubleshoot-parsers",
+ "/docs/cse/schema/parser-troubleshooting-tips": "/docs/cse/troubleshoot/troubleshoot-parsers",
"/Cloud_SIEM_Enterprise/CSE_Schema/Username_and_Hostname_Normalization": "/docs/cse/schema/username-and-hostname-normalization",
"/Cloud_SIEM_Enterprise/CSE_Sensors": "/docs/cse/sensors",
"/Cloud_SIEM_Enterprise/CSE_Sensors/01_Sensor_Download_Locations": "/docs/cse/sensors/sensor-download-locations",
diff --git a/docs/cse/schema/index.md b/docs/cse/schema/index.md
index 42e8c71b94..fc70a27384 100644
--- a/docs/cse/schema/index.md
+++ b/docs/cse/schema/index.md
@@ -69,10 +69,4 @@ This guide has information about Cloud SIEM schemas. In this section, we'll intr
Learn how to import YARA rules from GitHub into Cloud SIEM.
-
diff --git a/docs/cse/schema/parser-editor.md b/docs/cse/schema/parser-editor.md
index 257e14520d..11262fc5ab 100644
--- a/docs/cse/schema/parser-editor.md
+++ b/docs/cse/schema/parser-editor.md
@@ -8,14 +8,14 @@ description: Learn how to use the Parser Editor to configure and test a custom p
import useBaseUrl from '@docusaurus/useBaseUrl';
import Iframe from 'react-iframe';
-This topic has instructions for using the Sumo Logic parser editor. You can use the editor to customize system parsers, and to create your own custom parsers. We provide [parser templates](#parser-templates) that you can use as a starting point for creating custom parsers.
+This article has instructions for using the Sumo Logic parser editor. You can use the editor to customize system parsers, and to create your own custom parsers. We provide [parser templates](#parser-templates) that you can use as a starting point for creating custom parsers.
For a complete list of standard parsers, see [Parsers](https://github.com/SumoLogic/cloud-siem-content-catalog/blob/master/parsers/README.md) in the [Cloud SIEM Content Catalog](https://github.com/SumoLogic/cloud-siem-content-catalog/blob/master/README.md).
See additional articles for more information about the Sumo Logic Cloud SIEM parsers:
* [Parsing Language Reference Guide](/docs/cse/schema/parsing-language-reference-guide)
* [Parsing Patterns](/docs/cse/schema/parsing-patterns)
-* [Parser Troubleshooting](/docs/cse/schema/parser-troubleshooting-tips)
+* [Troubleshoot Parsers](/docs/cse/troubleshoot/troubleshoot-parsers)
:::note
The instructions that follow assume that you have already written your parser code.
diff --git a/docs/cse/schema/parser-troubleshooting-tips.md b/docs/cse/schema/parser-troubleshooting-tips.md
deleted file mode 100644
index 2b100fd928..0000000000
--- a/docs/cse/schema/parser-troubleshooting-tips.md
+++ /dev/null
@@ -1,28 +0,0 @@
----
-id: parser-troubleshooting-tips
-title: Parser Troubleshooting Tips
-sidebar_label: Parser Troubleshooting
-description: Learn how to troubleshoot problems with parsers.
----
-
-
-Sumo Logic parsers are a powerful tool for extracting log data to support security and observability use cases. This topic provides tips to help you identify and resolve some common issues you might encounter when using parsers.
-
-For general information on the parsing engine and syntax, see the [Parser Editor](/docs/cse/schema/parser-editor) and [Parsing Language Reference Guide](/docs/cse/schema/parsing-language-reference-guide) topics.
-
-1. Our [Ingestion Guides](/docs/cse/ingestion/) provide instructions for how to ingest data from a variety of data sources. Check to see if there is a guide for the data source you’re working with. The ingest guides generally describe the most straightforward, least error-prone method. Make sure that you’ve followed the instructions exactly and that the data to be ingested is supported.
-
- These guides explain how to configure Collectors and Sources to use a specific parser, what messages are supported out-of-the-box, and have links to vendor documentation where appropriate.
-
- For data sources that can be configured to log in a custom format, such as [Palo Alto Firewall](/docs/cse/ingestion/ingestion-sources-for-cloud-siem/palo-alto-firewall), the ingest guide will define what formats are supported. Support is usually limited to default configurations, but may vary.
-2. The Sumo Logic Collector or Source that sends the data to be parsed must be correctly configured with the path to the parser. Make sure the path you assign to the Collector or Source is exactly correct. A single character difference will result in parser errors for all logs you try to ingest from your data source. The path to a parser looks like this:
-
- `/Parsers/System/Microsoft/Windows-XML`
-
- The ingest guide for a data source will include the path to the correct parser. You can also determine the path to a parser on the **Logs > Parsers** page in the Sumo Logic UI: navigate to the parser, and then choose **Copy Path** from the three-dot kebab menu.
-3. Check for Field Extraction Rules, [Sumo Logic Ingest Mappings](/docs/cse/ingestion/sumo-logic-ingest-mapping), or [Local Configurations](/docs/cse/schema/parser-editor#create-a-local-configuration-for-a-system-parser) related to the parser that is presenting issues.
-
- * Field Extraction Rules can alter message contents in such a way that the parser works when you're testing it in the Parser Editor against messages returned by a Sumo Logic log search, but not when it receives logs from the Sumo Logic source that collected the logs. Replicating the logic of the FER in a Local Configuration in the parser usually solves this problem.
- * Sumo Logic Ingest Mappings for a data source should always be disabled when you’ve configured a Sumo Logic parser for that same data source. Otherwise, a single message might result in multiple Cloud SIEM records.
- * A Local Configuration to a parser is an override to out-of-the-box behavior. For this reason, if you’re having trouble with a parser, checking out any Local Configurations is important. Make sure to test the parser without Local Configurations so you can verify whether the problem is with the parser itself, or related to an external factor.
-4. Use the right parser for your data format. Some data sources, for example, Windows Event Logs, can send data in multiple different formats and using the correct parser for the format in use is required.
diff --git a/docs/cse/troubleshoot/index.md b/docs/cse/troubleshoot/index.md
new file mode 100644
index 0000000000..4f7b92e7c7
--- /dev/null
+++ b/docs/cse/troubleshoot/index.md
@@ -0,0 +1,24 @@
+---
+slug: /cse/troubleshoot
+title: Troubleshoot Cloud SIEM
+description: Learn how to troubleshoot problems with Cloud SIEM.
+---
+
+import useBaseUrl from '@docusaurus/useBaseUrl';
+
+This section contains articles to help you troubleshoot problems with Cloud SIEM.
+
+
diff --git a/docs/cse/troubleshoot/troubleshoot-mappers.md b/docs/cse/troubleshoot/troubleshoot-mappers.md
new file mode 100644
index 0000000000..cf901fe340
--- /dev/null
+++ b/docs/cse/troubleshoot/troubleshoot-mappers.md
@@ -0,0 +1,167 @@
+---
+id: troubleshoot-mappers
+title: Troubleshoot Mappers
+sidebar_label: Mappers
+description: Learn how to troubleshoot problems with log mappers.
+---
+
+import useBaseUrl from '@docusaurus/useBaseUrl';
+
+This article provides guidance for administrators to diagnose, troubleshoot, and escalate issues with Sumo Logic Cloud SIEM log mappers.
+
+Mappers are a critical component in the Cloud SIEM data ingestion pipeline. They serve as the second step in transforming raw log messages into structured records that can be used for threat detection and security analysis. Specifically, mappers:
+* Take key-value pairs created during parsing and map them into the Cloud SIEM normalization schema.
+* Assign classification to each log coming into Cloud SIEM.
+* Determine the entities present in the record.
+* Support the creation of high-fidelity detection rules.
+
+For information about creating log mappers, see [Create a Structured Log Mapping](/docs/cse/schema/create-structured-log-mapping/). For more general information about log mapping, and how it fits into the record creation process, see the [Record Processing Pipeline](/docs/cse/schema/record-processing-pipeline) topic. For a complete list of the standard log mappings, see [Mappings](https://github.com/SumoLogic/cloud-siem-content-catalog/blob/master/mappings/README.md) in the [Cloud SIEM Content Catalog](https://github.com/SumoLogic/cloud-siem-content-catalog/blob/master/README.md).
+
+## Interpreting record failures and issues
+
+### Failed Records dashboard
+
+The [Enterprise Audit - Cloud SIEM app](/docs/integrations/sumo-apps/cse/) provides dashboards and queries for greater visibility into Cloud SIEM activity. Troubleshooting record failures is aided by the [Cloud SIEM - Record Analysis - Failed Records](/docs/integrations/sumo-apps/cse/#record-analysis-failed-records) dashboard and query found within the app. (The Enterprise Audit - Cloud SIEM app must be installed.)
+
+Common failure types:
+* **Parser failures**. Include parser path and specific parsing error.
+* **Mapper failures**. Usually mention mapper or mapping issues.
+* **Mixed failures**. May indicate parser output doesn't match mapper expectations.
+
+### Mapping failure workflow
+
+Failure can occur when no mapping matches the combination of `logType=json` and the `vendor=`, `product=`, and `eventId=` values set in the parser.
+
+#### Failed records
+
+Failed records result when a mapper doesn’t match with the metadata set during log parsing:
+* **Vendor**. No mapper exists with the specified vendor metadata defined in the parser. This is likely a mismatch for a custom (non-out-of-the-box) parser that was created without corresponding mappers.
+* **Product**. No mapper exists with the specified product metadata defined in the parser.
+ * Depending on the parser, product may be dynamic. Dynamic product parsers support multiple products from a vendor in a single parser.
+ * A mapper which corresponds to the product metadata coming from the parser may be missing.
+* **EventId**. No mapper exists that matches the pattern of the `event_id` set in the parser. This is the most common failure. It frequently occurs when an event type appears that is not anticipated in the mapper (or the parser, as the case may be), and can be due to:
+ * New event types from a vendor.
+ * Previously unseen event types.
+ * Unsupported events.
+ * Events which do not have security relevance.
+ * Events which do not contain an entity and therefore cannot contribute to signals and insights.
+ * Events in an unsupported format.
+* **Log Type**. For logs processed by parsers, `logType` is always JSON as the output of parsed logs is key value pairs stored in a JSON object. This is the case regardless of the original format the logs were ingested as (CEF, LEEF, XML, and so on).
+
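+The routing metadata described above is set in the parser with `MAPPER:` directives (see the [Parsing Language Reference Guide](/docs/cse/schema/parsing-language-reference-guide)). As a minimal sketch, not a shipped parser, a parser emitting the metadata from the failed-record example later in this article would contain directives of this form:
+
+```
+MAPPER:vendor = microsoft
+MAPPER:product = azure
+MAPPER:event_id = AzureDevOpsAuditEvent
+```
+
+A mapper must exist whose vendor, product, and event ID pattern match these values; otherwise the record fails as described above.
+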
+#### Incomplete records
+
+Incomplete records result when a record is created, but key information is not mapped. This can occur for several reasons:
+* Fields expected in the mapper are not present in the parsed log.
+ * Vendor changes to key labels or their location.
+ * Configuration change.
+* The parsed log is not parsed correctly. The parser is not providing key value pairs that are expected by the mapper due to a flaw in the parser.
+* Fields parsed are not mapped.
+ * Only a catch-all or default mapper is present and doesn’t comprehensively normalize the log. Mappers using the _default_ pattern exist where possible to ensure some minimal normalization and classification occurs.
+ * There is no appropriate schema field to normalize to. The Cloud SIEM record schema is finite; for fields that have no corresponding normalized field, the original key-value pairs extracted during parsing can still be referenced in rules and record searches.
+
+## Investigating and resolving mapper failures
+
+### A mapper does not exist for parsed events
+
+A common mapping issue occurs when a log successfully parses and is assigned mapping metadata (`vendor`, `product`, `event_id`), but no corresponding mapper exists for the `event_id`. In many cases a _default_ pattern mapper exists which serves as a catch-all, but if one is not present, any logs which do not match a pattern defined in a mapper input will not create a record.
+
+This workflow assumes the data source for which the mapping is failing already has a parser and mappers in place. The same approach applies to net-new data sources and parsers.
+
+#### Failed record example
+
+For this example, let's assume no mapping matched `logType=json`, `vendor=microsoft`, `product=azure`, and `eventId=AzureDevOpsAuditEvent`.
+
+#### Troubleshooting workflow
+
+1. Determine how mapper metadata is assigned during log parsing.
+ 1. From the failed record:
+ 1. Get the value of `metadata_parser`.
+ 1. Get the value of `metadata_sourceMessageId`.
+ 1. Get the values from the reason field to determine:
+ * Vendor
+ * Product
+ * EventId
+ 1. Load the parser identified by `metadata_parser` in step 1.
+ 1. In the parser UI, import messages from a Sumo Logic log search using this query:
+ ```
+ messageId=
+ ```
+ 1. Parse messages. Use the parsed [field dictionary](/docs/cse/schema/parsing-language-reference-guide/) to determine what the input values of any mapper creation or change will be.
+1. Examine existing mappers from the [Log Mappings](/docs/cse/schema/create-structured-log-mapping/) page in Cloud SIEM:
+ 1. Filter for the output vendor and product gathered in step 1.
+ 1. Note the event ID patterns present.
+ 1. You can do one of the following:
+ * If an existing pattern is similar to the event ID captured in step 1 above, the mapper may be adapted to accommodate the variation. Take care to ensure the input fields in any adapted mapper match the parsed output of the previously unmapped log. Out-of-the-box (system) mappers must be duplicated before they can be modified.
+ * An existing mapper may be used as a template if appropriate to map the previously unmapped log with the identified pattern.
+ * A new structured mapper can be created using the same input/output vendor and product info, with JSON as the format (all parsed logs are output as JSON regardless of the original raw format) and the identified event ID pattern.
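+
+Before adapting or creating a mapper, it can help to survey the event types a source actually emits. A sketch in Sumo Logic query language, assuming a JSON source whose event type is stored in a hypothetical `eventType` key (both the source category and the key name are placeholders):
+
+```
+_sourceCategory=your/source/category
+| json field=_raw "eventType" as event_type nodrop
+| count by event_type
+```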
+
+### A field is not mapped (normalized) in a record
+
+A common mapping issue is where a record is created but certain fields are not normalized and are potentially causing downstream issues, such as false-negative or false-positive signals, or causing searches based on records to fail in some way.
+
+1. Analyze the record.
+ 1. Determine what field(s) are missing. For example, `user_username` is not mapped, preventing the record from being considered for signals and correlation into insights.
+ 1. Determine where the data exists in the parsed log. Use the "fields" element in the record, which contains all parsed key-value pairs from the original log, to find the location the field needs to be mapped from. Note the key name.
+1. Analyze and correct the mapper.
+ 1. Determine which mapper a record is mapped from using `metadata_mapperName` or `metadata_mapperUid`.
+ 1. Determine how the mapper maps the desired field. Look in the mapper input column to determine which value(s) the mapper uses to map the field.
+1. Modify the mapper.
+ 1. For out-of-the-box (system) mappers, to make local changes the mapper must be duplicated (duplicating disables the original system mapper).
+ 1. Assuming there is no obvious error in the original mapper, such as a misspelling of the input field, add the field determined in record analysis to the "Alt. Input Fields" for the desired mapped field. Alternate Input Fields are tried in this order:
+ * If the primary input field is missing, each alternate is tried until a match is found.
+ * If there are multiple matches in the list, only the first is considered.
+1. Validate the change. New mapped records should reflect the modification quickly. A search for records using the modified or new mapper can be initiated from the Log Mappings page.
+ 1. Open the [log mapping](/docs/cse/schema/create-structured-log-mapping/).
+ 1. Select **Actions > Open in Record Search**.
+ 1. Search records for the log mapping. Previously mapped records will not reflect mapper changes made after they were created.
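+
+As a sketch, such a record search looks something like the following, assuming the `sec_record` index views and a hypothetical mapper name:
+
+```
+_index=sec_record_*
+| where metadata_mapperName = "Custom Mapper Name"
+```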
+
+## Escalate mapper issues
+
+Sumo Logic Threat Labs Detection Engineering maintains all out-of-the-box Cloud SIEM content. Content includes parsers, mappers, rules, and normalization schema. For details about this content, see the [Cloud SIEM Content Catalog](/docs/cse/get-started-with-cloud-siem/cloud-siem-content-catalog/).
+
+Upon identifying an issue with a Cloud SIEM out-of-the-box parser using this article, it may be necessary to [contact Support](https://support.sumologic.com/support/s/) to escalate the issue to the Threat Labs team.
+
+### Escalation requirements
+
+Provide the following:
+* A concise description of the problem.
+ * Screenshots are helpful for understanding current and potentially previous behavior if there was a change.
+ * Impacted mapper(s) (if applicable), including mapper name and or mapper UID (`metadata_mapperName`/`metadata_mapperUid`).
+* A representative raw log sample.
+ * Include logs which represent the issue.
+ * For new support or extension of an already supported source, include logs which represent the types of (possibly new) events.
+* A security use case if it is not immediately obvious. Not all logs will necessarily be useful in Cloud SIEM.
+* Supporting documentation. Esoteric or poorly labelled values may require documentation which is not always publicly available.
+* Configuration information.
+ * Many data sources will have options for configuring logging.
+ * It is important to understand what those settings are to address issues or develop new global support for a data source or offer advice for a custom solution if a global one is not appropriate.
+
+### Gather raw samples
+
+Prior to opening a support request, it is helpful to gather sample raw logs (without field extraction rules overwriting `_raw`) which represent the identified issue.
+
+Once a representative sample has been gathered it is recommended to export it as a CSV from Sumo Logic search to ensure no extraneous formatting is applied that might confound further troubleshooting by Sumo Logic Customer Success and Threat Labs teams.
+
+Following are some ways to gather samples.
+
+#### Search for message IDs
+
+Gather a sample by searching for the logs that are failing to map, using the identified `_messageId`(s):
+```
+_sourceCategory=
+| where _messageId in (,,)
+```
+
+#### Identify an event ID
+
+If a particular event ID or IDs, as set in the parser using `MAPPER:event_id`, are failing to map, gather the failing event IDs. Construct a search query using them and gather unique examples of `_raw` per event ID.
+```
+_sourceCategory=your/source/category
+| fields eventID
+| where eventID in ("list","of","eventIDs")
+| first(_raw) by eventID
+```
+The `first` result returned will be the `_raw` log for each event ID.
+
+This query only surfaces the first example within the time range to cut down on duplication. It may be necessary to gather multiple examples if there is meaningful variation within each failing sample that may require further mapping changes.
\ No newline at end of file
diff --git a/docs/cse/troubleshoot/troubleshoot-parsers.md b/docs/cse/troubleshoot/troubleshoot-parsers.md
new file mode 100644
index 0000000000..c34b623b04
--- /dev/null
+++ b/docs/cse/troubleshoot/troubleshoot-parsers.md
@@ -0,0 +1,252 @@
+---
+id: troubleshoot-parsers
+title: Troubleshoot Parsers
+sidebar_label: Parsers
+description: Learn how to troubleshoot problems with log parsers.
+---
+
+import useBaseUrl from '@docusaurus/useBaseUrl';
+
+This article provides guidance for administrators to diagnose, troubleshoot, and escalate issues with Sumo Logic Cloud SIEM log parsers. For general information on the parsing engine and syntax, see [Parser Editor](/docs/cse/schema/parser-editor) and [Parsing Language Reference Guide](/docs/cse/schema/parsing-language-reference-guide).
+
+Parsers are a critical component in the Cloud SIEM data ingestion pipeline. They serve as the first step in transforming raw log messages into structured records that can be used for threat detection and security analysis. Specifically, parsers:
+* Extract key-value pairs from raw log messages.
+* Enable proper log mapping to Cloud SIEM schema attributes by:
+ * Applying mapping metadata to properly route parsed logs to mappers.
+ * Transforming any variety of log data into a structured form.
+* Support the creation of high-fidelity detection rules.
+
+Parsing issues can manifest in several ways:
+* **Failed records**. Insufficient or incorrect information causing a mapping failure.
+* **Incorrect mapping**. What the mapper expects to map is not present or is different than expected.
+* **Parsing failures**. All or part of a parser is not handling logs as intended.
+* **Incorrect parsing**. Specific fields or metadata being parsed incorrectly (wrong key value pairs or `event_id` metadata).
+
+## About forwarding logs to Cloud SIEM with parsers
+
+### Forwarding methods
+
+#### _siemForward and parser (recommended)
+
+The recommended method is to set `_siemForward = true` and `_parser = ` (the path to the parser). This can be set in several ways:
+* At the [source](/docs/cse/ingestion/ingestion-sources-for-cloud-siem/). Logs from an entire source will be forwarded to Cloud SIEM and the specified parser.
+* At the [collector](/docs/send-data/installed-collectors/). Logs from the collector and its child sources will be forwarded to Cloud SIEM and the specified parser.
+* Using a [Field Extraction Rule (FER)](/docs/manage/field-extractions/create-field-extraction-rule/).
+ * Often used to specify SIEM forwarding and the parser path by `sourceCategory`, but can also be used to filter specific subsets of logs for forwarding to Cloud SIEM (or not forwarded).
+ * Sending subsets of logs to Cloud SIEM is useful as not all log data is useful from a security context.
+
+Many [Cloud-To-Cloud (C2C)](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/) sources set the `_parser` and `_siemForward` metadata automatically, bypassing the need to set them manually for these sources.
+
+#### Other methods
+
+Other methods rely on legacy mechanisms which bypass parsers and are generally not recommended. These include:
+
+* Setting `_siemForward` without a parser.
+ * For structured logs, this will use a Sumo Logic [ingest mapping](/docs/cse/ingestion/sumo-logic-ingest-mapping/) and has limited options for specific parsing, or relies on setting mapping metadata in fields or via a field extraction rule.
+ * For unstructured logs, this will utilize legacy Grok parsers which are approaching end-of-life and are not maintained outside of critical bug fixes.
+* Older cloud-to-cloud sources set `_siemForward` and mapper metadata fields within the cloud-to-cloud source.
+
+### Best practices
+
+* Always use a parser when possible.
+ * Provides consistent field extraction.
+ * Enables proper schema mapping.
+ * Supports future content updates.
+* Avoid field extraction rules that modify `_raw`.
+ * Makes parser troubleshooting more difficult by obfuscating the format of the original raw log which the parser receives.
+* Use appropriate parser paths.
+ * Ensures parser matches the data format.
+ * Uses system parsers when available.
+ * Creates custom parsers only when necessary.
+
+Following these fundamentals will help prevent common parsing issues and simplify troubleshooting when problems occur.
+
+## Identify parser issues
+
+### Failed Records dashboard
+
+The [Enterprise Audit - Cloud SIEM app](/docs/integrations/sumo-apps/cse/) provides dashboards and queries for greater visibility into Cloud SIEM activity. Troubleshooting parser failures is aided by the [Cloud SIEM - Record Analysis - Failed Records](/docs/integrations/sumo-apps/cse/#record-analysis-failed-records) dashboard and query found within the app. (The Enterprise Audit - Cloud SIEM app must be installed).
+
+Common failure types:
+* **Parser failures**. Include parser path and specific parsing error.
+* **Mapper failures**. Usually mention mapper or mapping issues.
+* **Mixed failures**. May indicate parser output doesn't match mapper expectations.
+
+### Investigate failed records
+
+#### Identify the pattern
+
+* Look for commonalities in failed records.
+* Note specific error messages.
+* Check if failures are limited to certain sources.
+
+#### Analyze error messages
+
+Common errors:
+* `Fatal: /Parsers/System/Vendor/Product Name did not produce an event.`
+
+   Indicates the parser is likely failing immediately.
+* `Fatal: /Parsers/System/Vendor/Product Name parse failed.`
+
+   Indicates the parser is likely failing immediately.
+* `Fatal: /Parsers/System/Vendor/Product Name - transform_name - none of the transforms in cascade successfully parsed event (transform_name_1, transform_name_2).`
+
+   Indicates a specific component of the parser is failing. This case indicates a transform cascade in which logs may be partially parsed but are failing further into processing.
+* `Fatal: /Parsers/System/Vendor/Product Name - transform_name parse failed.`
+
+   Indicates a specific transform within a parser is failing. Logs may be partially parsed, but are failing further into processing.
+* `Fatal: /Parsers/System/Vendor/Product Name - transform_name - no value found in required transform target field parsed_field_name.`
+
+   Indicates a required key-value pair is missing from the parsed log and the log is failing to parse as a result.
+* `Fatal: /Parsers/System/Vendor/Product Name - wrapper did not return the wrapped log entry.`
+
+   A parser utilizing a wrapper transform did not find the log entry that is supposed to be present, causing the parser failure.
+
+#### Check for recent changes
+
+For log sources which were previously parsed successfully:
+* Vendors will occasionally make modifications to the log format or field names within the logs which cannot be handled by existing parsers.
+* Source configuration changes to logging on the appliance, service, or application sending logs may result in parsing issues or failures.
+* Sumo Logic is continuously making updates to our parser catalog. While these changes undergo regression testing, there can be unforeseen cases not caught in testing. [Cloud SIEM content release notes](/release-notes-cse/) will note any modifications to out-of-the-box parsers by date with a brief summary of the changes.
+
+#### Other considerations
+
+Parsing failures can occur when there is no issue with the parser for a variety of reasons:
+* The parser was designed for a different version or log format than the ingested logs.
+ * A new parser may be needed to accommodate these logs. A clue that this is the case is a parser name that calls out a specific format, such as JSON, LEEF, CEF, XML, CSV, and so on.
+ * If the parser name does not specify the format, the parser itself often does within the `FORMAT` stanza.
+ * Some parsers will include sample logs in comments which follow the format the parser was designed to accommodate.
+* The logs failing to parse are not security relevant. While some parsers can be designed to explicitly define what logs are supported or not, there are circumstances where this is not practical, most commonly in unstructured log formats, and logs fail to parse solely because they are not useful in Cloud SIEM.
+ * They do not contain an entity and therefore cannot be used to correlate any activity or generate signals.
+ * The activity captured in the log does not have any clear security use case.
+ * There are niche use cases that can be accounted for by customizing a parser but that aren't always practical to support globally.
+ * `Verbose` and `Debug` level logging frequently fall into this category.
+
+#### Pivot to raw logs and troubleshoot with the parser
+
+With the error(s) identified, pivot to the raw message(s) for further troubleshooting. Note the specific parser(s) which are failing.
+1. Extract `metadata_sourceMessageId` from the failed record.
+1. Use `_messageId` (the same value as `metadata_sourceMessageId`) in a search to locate the original raw log.
+1. Copy the raw message(s) and paste them into the parser UI.
+1. Alternatively, use the parser UI to search for the `_messageId`(s) within the appropriate time frame to bring the logs into the UI for testing.
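+
+For the search steps above, a query along these lines locates the raw logs (the source category and message IDs are placeholders):
+
+```
+_sourceCategory=your/source/category
+| where _messageId in (1234567890, 1234567891)
+```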
+
+## Troubleshoot existing parsers
+
+If you have identified a log message that should be parsed by an existing parser (the format is right, there is a clear security use case, and so on), it helps to first understand the structure of the parser before troubleshooting.
+
+Some parsers are very simple (most often structured log formats):
+* A format is defined.
+* The parser expects mapping metadata to be static or to come from the same templated key-value pair in every log.
+* The parser expects time parsing to use a single attribute, with the timestamp in a single format.
+
+For simple parsers that fail, the cause is often an edge case or a previously unseen event that diverges from the format the parser was initially developed for. This often necessitates adding complexity to the parser to handle these cases, or refactoring the parser to handle logs differently, often more broadly. (In the broader case, that may mean defining a higher-level `event_id` that is consistent across all logs from a source.)
+
+Some parsers are more complex (most often unstructured or complex structured log formats). There may be a single high level format a log takes, but:
+* There are nested structures within the log that differ from the overall structure. This occurs frequently with JSON sources that nest a human-readable message or non-JSON key-value pairs within certain fields.
+* Individual event types from a source follow different conventions for labeling key value pairs.
+ * This can make defining mapping metadata particularly challenging, as it requires individual event types to define these attributes per transform instead of once for an entire source.
+ * Timestamps may be stored in different places depending on the nature of an event type.
+ * A transaction log may have a concept of a start and end time.
+ * Multiple timestamps may be present depending on the event type.
+ * Some logs may be missing a timestamp, in which case `_messagetime` from the Sumo Logic collector or source may need to be used as a fallback.
+* Unstructured logs with many different event types or variations between events.
+ * Each event type must be handled by its own transform and often requires a regular expression to parse.
+ * These will often use variable transforms and/or transform cascades.
+
+### Example scenario - Linux syslog parsing failure
+
+This is a particularly illustrative example of how a more complex parser processes a log.
+
+Example:
+
+`Fatal: /Parsers/System/Linux/Linux OS Syslog - parse_systemd - none of the transforms in cascade successfully parsed event (parse_systemd_format_1, parse_systemd_format_2).`
+
+For log:
+
+`Nov 20 21:11:08 ip-1-2-3-4 systemd[1]: motd-news.service: Deactivated successfully.`
+
+In this parser, the log is first processed for its header to determine how it should be routed within the parser. After that routing, the log fails to parse in a transform cascade, matching neither `parse_systemd_format_1` nor `parse_systemd_format_2`, the transforms called in the cascade for this specific log. While the log still fails, this provides useful clues as to where the failure occurred.
+
+<img src={useBaseUrl('img/cse/troubleshoot-parser-example.png')} alt="Parser troubleshooting example" width="800"/>
+
+Here we can see that the header and process are parsed successfully. Examining the parser, we find a `VARIABLE_TRANSFORM` that uses the syslog process name to route the log to another transform.
+
+```
+# Direct parser based on the syslog process name
+# Process Name = Transform Parser
+VARIABLE_TRANSFORM_INDEX:syslog_message = syslog_process
+```
+
+In this case, for `systemd` processes the variable transform calls a transform named `parse_systemd`, passing along the contents of `syslog_message`. Looking further down the parser, we can find that specific transform.
+
+```
+[transform:parse_systemd]
+TRANSFORM_CASCADE:_$log_entry = parse_systemd_format_1,parse_systemd_format_2
+```
+
+This transform takes what it received from the variable transform and passes the field value stored in `_$log_entry` (`syslog_message`) to two additional parse transforms, attempting each in the order shown in the transform cascade until a match is found.
+
+```
+[transform:parse_systemd_format_1]
+#<86>Jan 01 00:00:00 hostname systemd[20460]: pam_unix(systemd-user:session): session opened for user root by (uid=0)
+TRANSFORM_CASCADE:_$log_entry = parse_su_format_6,parse_sudo_format_2,parse_systemd_user_format_1
+
+[transform:parse_systemd_format_2]
+#<30>Jan 01 00:00:00 hostname systemd[1]: Started Session c513458 of user ewqadm.
+REGEX = (?P<action>Started\sSession)\s(?P<session_id>\S+)\sof\s(?i)user\s(?P<user>.+)\.
+SET:event_id = systemd-session-start
+```
+
+These transforms helpfully include comments with example logs they are intended to parse; `parse_systemd_format_2` parses with a regular expression and sets an `event_id` on a match. From the example comments and the regex, we can see that neither transform is intended to match the log we are seeing a failure for.
+
+Were this a log useful from a security context (it's not), the failure could be addressed in a few ways: by modifying whichever existing transform's regular expression is closer in format, or by creating a new transform and adding it to the transform cascade called for `systemd` process logs. Since this log is a significant departure from the intent of either existing transform, a new one would be most appropriate. That would only require modifying the `parse_systemd` transform cascade and adding a third transform with a regular expression to handle the log and set an appropriate `event_id`.
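+
+A hypothetical sketch of that fix follows. The new transform name, capture group names, and `event_id` value are illustrative only, not taken from the actual system parser:
+
+```
+[transform:parse_systemd]
+TRANSFORM_CASCADE:_$log_entry = parse_systemd_format_1,parse_systemd_format_2,parse_systemd_format_3
+
+[transform:parse_systemd_format_3]
+#Nov 20 21:11:08 ip-1-2-3-4 systemd[1]: motd-news.service: Deactivated successfully.
+REGEX = (?P<service>\S+\.service):\s(?P<action>Deactivated)\ssuccessfully\.
+SET:event_id = systemd-service-deactivated
+```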
+
+## Escalate parsing issues
+
+Sumo Logic Threat Labs Detection Engineering maintains all out-of-the-box Cloud SIEM content. Content includes parsers, mappers, rules, and normalization schema. For details about this content, see the [Cloud SIEM Content Catalog](/docs/cse/get-started-with-cloud-siem/cloud-siem-content-catalog/).
+
+If you identify an issue with an out-of-the-box Cloud SIEM parser using this article, you may need to [contact Support](https://support.sumologic.com/support/s/) to escalate the issue to the Threat Labs team.
+
+### Escalation requirements
+
+Provide the following:
+* A concise description of the problem. Screenshots are helpful for understanding current behavior and, if there was a change, previous behavior.
+* A representative raw log sample.
+  * Include logs that represent the issue.
+  * For new support, or extension of an already supported source, include logs that represent the (possibly new) event types.
+* The security use case, if it is not immediately obvious. Not all logs are necessarily useful in Cloud SIEM.
+* Supporting documentation. Esoteric or poorly labeled values may require documentation that is not always publicly available.
+* Configuration information. Many data sources have options for configuring logging. Understanding those settings is important for developing new global support for a data source, or for offering advice on a custom solution if a global one is not appropriate.
+
+### Gather raw samples
+
+Prior to opening a support request, it is helpful to gather sample raw logs that represent the identified issue (without field extraction rules overwriting `_raw`).
+
+Once a representative sample has been gathered, export it as a CSV from Sumo Logic search to ensure no extraneous formatting is applied that might confound further troubleshooting by the Sumo Logic Customer Success and Threat Labs teams.
+
+Following are some ways to gather samples.
+
+#### Search for message IDs
+
+Gather a sample by searching for the logs that are failing to parse, using the identified `_messageId` values:
+```
+_sourceCategory=<source category>
+| where _messageId in (<message ID 1>,<message ID 2>,<message ID 3>)
+```
+
+#### Identify an event ID
+
+If an entire source is failing to parse because it is not presently supported out-of-the-box, gather a sample by first identifying a likely event ID:
+```
+_sourceCategory=<source category>
+| fields eventID
+| first(_raw) by eventID
+```
+An event ID can be thought of as a key in a log that represents the lowest common denominator for identifying the type of message.
+
+Using the `first` operator cuts down on extraneous samples. It may still be necessary to gather multiple examples if there is meaningful variation within an event ID that requires parsing beyond the per-event-ID level.
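+
+If more than one example per event ID is needed, one approach (a sketch; the `eventID` field name follows the query above, and the placeholder values are illustrative) is to focus on a single event ID at a time and pull a small number of raw messages:
+
+```
+_sourceCategory=<source category>
+| where eventID = "<failing event ID>"
+| limit 20
+```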
+
+#### Gather failing event IDs
+
+If a particular event ID or set of event IDs, as set in the parser using `MAPPER:event_id`, is failing, gather the failing event IDs. Then construct a search query using the failing event IDs and gather a unique example of `_raw` for each event ID:
+```
+_sourceCategory=<source category>
+| fields eventID
+| where eventID in ("list","of","eventIDs")
+| first(_raw) by eventID
+```
+The first field returned will be the `_raw` log.
+
+This query only surfaces the first example within the time range to cut down on duplication. It may be necessary to gather multiple examples if there is meaningful variation within each failing sample that may require further parsing changes.
\ No newline at end of file
diff --git a/sidebars.ts b/sidebars.ts
index b9d2548ef2..65142652a2 100644
--- a/sidebars.ts
+++ b/sidebars.ts
@@ -2800,7 +2800,6 @@ integrations: [
'cse/schema/parser-editor',
'cse/schema/parsing-language-reference-guide',
'cse/schema/parsing-patterns',
- 'cse/schema/parser-troubleshooting-tips',
'cse/schema/username-and-hostname-normalization',
],
},
@@ -2885,6 +2884,17 @@ integrations: [
'cse/administration/mitre-coverage',
],
},
+ {
+ type: 'category',
+ label: 'Troubleshoot',
+ collapsible: true,
+ collapsed: true,
+ link: {type: 'doc', id: 'cse/troubleshoot/index'},
+ items: [
+ 'cse/troubleshoot/troubleshoot-parsers',
+ 'cse/troubleshoot/troubleshoot-mappers',
+ ],
+ },
],
},
{
diff --git a/static/img/cse/troubleshoot-parser-example.png b/static/img/cse/troubleshoot-parser-example.png
new file mode 100644
index 0000000000..868d68a425
Binary files /dev/null and b/static/img/cse/troubleshoot-parser-example.png differ