82 changes: 82 additions & 0 deletions .github/workflows/pr.yml
@@ -36,6 +36,88 @@ jobs:
run: yarn install --frozen-lockfile
- name: Build the Docusaurus site
run: yarn build
# --- Disallowed character checks for Pantheon ---
# 1) Fail if any file path in the build output contains disallowed characters
- name: Check build artifact paths for disallowed characters
env:
BUILD_DIR: build
shell: bash
run: |
if [ ! -d "$BUILD_DIR" ]; then
echo "No $BUILD_DIR directory found; skipping path check."
exit 0
fi
# Disallowed chars: " : < > | * ?
OFFENDERS="$(find "$BUILD_DIR" -type f | grep -E '[":<>|*?]')" || true
if [ -n "$OFFENDERS" ]; then
echo "❌ Disallowed characters found in build artifact paths:"
echo "$OFFENDERS"
while IFS= read -r p; do
rel="${p#$(pwd)/}"
[ "$rel" = "$p" ] && rel="$p"
badchars=$(echo "$rel" | grep -oE '[":<>|*?]' | tr -d '\n')
echo "::error file=${rel}::Disallowed character(s) found in build artifact path: [${badchars}] (Pantheon rejects paths containing any of: \" : < > | * ?)"
done <<< "$OFFENDERS"
exit 1
fi
echo "✅ No disallowed characters found in build artifact paths."
# 2) Fail if any Markdown/MDX link target contains disallowed characters (excluding external links).
# Ignores fenced code blocks (```/~~~) and inline code (`...`) to avoid false positives from code samples.
- name: Check Markdown/MDX links for disallowed characters
shell: bash
run: |
FILES="$(git ls-files '*.md' '*.mdx' 2>/dev/null || true)"
if [ -z "$FILES" ]; then
echo "No Markdown/MDX files found; skipping link check."
exit 0
fi
BAD_LINKS=$(
awk '
BEGIN { in_code=0 }
{
raw=$0

# Toggle fenced code blocks starting with ``` or ~~~
if (match(raw, /^\s*(```|~~~)/)) { in_code = !in_code; next }
if (in_code) { next }

# Strip inline code spans so patterns inside don’t trigger
line = raw
gsub(/`[^`]*`/, "", line)

# Find real Markdown links: [text](url)
while (match(line, /\[[^]]+\]\(([^)]+)\)/, m)) {
url=m[1]

# Skip external schemes
if (url ~ /^(https?:|mailto:|tel:)/) { line=substr(line, RSTART+RLENGTH); continue }

# Drop query + fragment
sub(/\?.*$/, "", url)
sub(/#.*/, "", url)

# Disallowed characters in relative link targets: " : < > | * ?
if (url ~ /[":<>|*?]/) {
printf("%s\t%d\t%s\n", FILENAME, FNR, m[0])
}

line=substr(line, RSTART+RLENGTH)
}
}
' $FILES
)

if [ -n "$BAD_LINKS" ]; then
echo "❌ Disallowed characters found in Markdown/MDX link targets:"
echo "$BAD_LINKS"
while IFS=$'\t' read -r file line match; do
[ -z "$file" ] && continue
badchars=$(echo "$match" | grep -oE '[":<>|*?]' | tr -d '\n')
echo "::error file=${file},line=${line}::Disallowed character(s) in Markdown link target: [${badchars}] ${match} (Not allowed: \" : < > | * ?)"
done <<< "$BAD_LINKS"
exit 1
fi
echo "✅ No disallowed characters found in Markdown/MDX link targets."
spellcheck:
runs-on: ubuntu-latest
steps:
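To try the new checks locally before pushing, here is a minimal sketch of the first check; it assumes the site has already been built into `build/` with `yarn build` and simply reports offending paths instead of emitting workflow annotations.

```bash
#!/usr/bin/env bash
# Local sketch of the artifact path check; assumes ./build already exists.
BUILD_DIR=build
# List build artifact paths containing characters Pantheon rejects: " : < > | * ?
OFFENDERS="$(find "$BUILD_DIR" -type f | grep -E '[":<>|*?]' || true)"
if [ -n "$OFFENDERS" ]; then
  echo "Disallowed characters found in build artifact paths:"
  echo "$OFFENDERS"
  exit 1
fi
echo "No disallowed characters found in build artifact paths."
```

The Markdown/MDX link check can be reproduced the same way by running the awk script above against the output of `git ls-files '*.md' '*.mdx'`.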
120 changes: 60 additions & 60 deletions blog-service/2021/12-31.md

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion cid-redirects.json
@@ -2945,7 +2945,7 @@
"/cid/1108": "/docs/integrations/saas-cloud/trellix-mvision-epo",
"/cid/1110": "/docs/integrations/microsoft-azure/azure-security-microsoft-defender-for-identity",
"/docs/integrations/microsoft-azure/microsoft-defender-for-identity/": "/docs/integrations/microsoft-azure/azure-security-microsoft-defender-for-identity",
"/cid/1112": "/docs/integrations/saas-cloud/carbon-black-inventory/",
"/cid/1112": "/docs/integrations/saas-cloud/carbon-black-inventory/",
"/cid/1111": "/docs/integrations/microsoft-azure/azure-open-ai",
"/Cloud_SIEM_Enterprise": "/docs/cse",
"/Cloud_SIEM_Enterprise/Administration": "/docs/cse/administration",
2 changes: 1 addition & 1 deletion docs/get-started/apps-integrations.md
@@ -165,7 +165,7 @@ The **Search** page opens, the search populates a new tab, and the search runs
If you do not have data that matches the requirements of the search query, or if you select the incorrect Source Category or data filter, you will get either no results or bad results.

:::note
Searches included with the [Sumo Logic app for Data Volume](/docs/integrations/sumo-apps/data-volume "Data Volume app") do not require you to select a Source Category.
Searches included with the [Sumo Logic app for Data Volume](/docs/integrations/sumo-apps/data-volume) do not require you to select a Source Category.
:::

## Custom data filters
@@ -92,7 +92,7 @@ import AccessKey from '../../../../reuse/automation-service/access-key.md';

<img src={useBaseUrl('/img/platform-services/automation-service/app-central/integrations/atlassian-jira-v2/atlassian-jira-v2-5.png')} style={{border:'1px solid gray'}} alt="atlassian-jira-v2-5" width="400"/>

For information about Atlassian Jira, see [Jira documentation]( https://confluence.atlassian.com/jira). For the REST API v2, see the [REST API v2 documentation](https://developer.atlassian.com/cloud/jira/platform/rest/v2/intro/).
For information about Atlassian Jira, see [Jira documentation](https://confluence.atlassian.com/jira). For the REST API v2, see the [REST API v2 documentation](https://developer.atlassian.com/cloud/jira/platform/rest/v2/intro/).

## Category

@@ -66,7 +66,7 @@ import AccessKey from '../../../../reuse/automation-service/access-key.md';

<img src={useBaseUrl('/img/platform-services/automation-service/app-central/integrations/atlassian/atlassian-jira-configuration.png')} style={{border:'1px solid gray'}} alt="Atlassian Jira Logger configuration" width="400"/>

For information about Atlassian Jira, see [Jira documentation]( https://confluence.atlassian.com/jira).
For information about Atlassian Jira, see [Jira documentation](https://confluence.atlassian.com/jira).

## Change Log

@@ -10,7 +10,7 @@ On the **Search** page, as you begin typing to enter a query in the search tex

RBAC limitations prevent you from seeing options that you are not permitted to see. 

In the first part of a query, search autocomplete suggests common default queries, keywords, and [metadata](built-in-metadata.md "Search Metadata") terms. It also offers the names of Collectors, Sources, and Partitions, which are automatically configured in your system when you create them.
In the first part of a query, search autocomplete suggests common default queries, keywords, and [metadata](built-in-metadata.md) terms. It also offers the names of Collectors, Sources, and Partitions, which are automatically configured in your system when you create them.

![autocomplete search](/img/search/get-started-search/search-basics/autocomplete-search.png)

@@ -25,15 +25,15 @@ As a best practice, Sumo Logic recommends batching data into each POST request t

To configure an HTTP Logs and Metrics Source:

1. [**New UI**](/docs/get-started/sumo-logic-ui). In the Sumo Logic main menu select **Data Management**, and then under **Data Collection** select **Collection**. You can also click the **Go To...** menu at the top of the screen and select **Collection**.  <br/>[**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Collection > Collection**.
1. [**New UI**](/docs/get-started/sumo-logic-ui). In the Sumo Logic main menu select **Data Management**, and then under **Data Collection** select **Collection**. You can also click the **Go To...** menu at the top of the screen and select **Collection**.  <br/>[**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Collection > Collection**.
1. In the Collectors page, click **Add Source** next to a Hosted Collector.
1. Select **HTTP Logs & Metrics**. 
1. Enter a **Name** to display for the Source in the Sumo web application. Description is optional.
1. (Optional) For **Source Host** and **Source Category**, enter any string to tag the output collected from the source. (Category metadata is stored in a searchable field called _sourceCategory.)
1. **Forward to SIEM**. This option is present if [Cloud SIEM](/docs/cse/) is enabled. Click the checkbox to send the logs collected by the source to Cloud SIEM.
1. **Fields/Metadata.** Click the **+Add** link to define the fields you want to associate. Each field needs a name (key) and value.
* <img src={useBaseUrl('img/reuse/green-check-circle.png')} alt="green check circle.png" width="20"/> A green circle with a check mark is shown when the field exists and is enabled in the Fields table schema.
* <img src={useBaseUrl('img/reuse/orange-exclamation-point.png')} alt="orange exclamation point.png" width="20"/> An orange triangle with an exclamation point is shown when the field doesn't exist in the Fields table schema. In this case, you'll see an option to automatically add or enable the nonexistent fields to the Fields table schema. If a field is sent to Sumo Logic but isn’t present or enabled in the schema, it’s ignored and marked as **Dropped**.
* <img src={useBaseUrl('img/reuse/orange-exclamation-point.png')} alt="orange exclamation point.png" width="20"/> An orange triangle with an exclamation point is shown when the field doesn't exist in the Fields table schema. In this case, you'll see an option to automatically add or enable the nonexistent fields to the Fields table schema. If a field is sent to Sumo Logic but isn’t present or enabled in the schema, it’s ignored and marked as **Dropped**.
1. **Advanced Options for Logs.** Advanced options do *not* apply to uploaded metrics.<br/><img src={useBaseUrl('img/send-data/HTTP-source-advanced-options-for-logs.png')} alt="A screenshot of the 'Advanced Options for Logs' settings in Sumo Logic. The options include 'Extract timestamp information from log file entries' (checked), 'Default Time Zone' with options to 'Use time zone from log file. If not detected, use default time zone' (selected) and 'Ignore time zone from log file and instead use default time zone'. The 'Timestamp Format' settings offer 'Automatically detect the format' (selected) and 'Specify a format'. The 'Message Processing' section has 'Multiline Processing' checked. The 'Infer Message Boundaries' options include 'Detect Automatically' (selected) and 'Add Boundary Regex'. Finally, there is an unchecked option for 'One Message Per Request', which notes that each request will be treated as a single message, ignoring line breaks." width="400"/>
* **Timestamp Parsing.** This option is selected by default. If it's deselected, no timestamp information is parsed at all.
* **Time Zone.** There are two options for Time Zone. You can use the time zone present in your log files, and then choose an option in case time zone information is missing from a log message. Or, you can have Sumo Logic completely disregard any time zone information present in logs by forcing a time zone. It's very important to have the proper time zone set, no matter which option you choose. If the time zone of logs cannot be determined, Sumo Logic assigns logs UTC; if the rest of your logs are from another time zone your search results will be affected.
@@ -79,7 +79,7 @@ Below are the key benefits that you can obtain by sending compressed data:
- **Faster message delivery**. Improved efficiency ensures messages reach Sumo Logic more quickly.

:::important
- Compressed data can only be sent with the POST method.
- Compressed data can only be sent with the POST method.
- Compressed files are decompressed before they are ingested, so they are ingested at the decompressed file size rate. 
:::
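
For reference, a hedged sketch of the compressed-upload path described above: it gzips a small batch of log lines and POSTs them with a `Content-Encoding: gzip` header. The endpoint URL and log lines are placeholders; substitute the unique URL of your own HTTP Source.

```bash
# Hypothetical example: compress a batch of log lines and POST them to an HTTP Source.
# Replace the URL with your Source's unique endpoint URL.
printf '%s\n' \
  "2024-01-01 12:00:00 INFO service started" \
  "2024-01-01 12:00:01 INFO listening on port 8080" \
| gzip \
| curl -s -X POST "https://endpoint.collection.sumologic.com/receiver/v1/http/UNIQUE_CODE" \
    -H "Content-Encoding: gzip" \
    --data-binary @-
```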

@@ -105,6 +105,6 @@ Also, in your HTTP Source configuration, make sure that the check box **Enable O

Sumo expects that the entire content of an individual log message will be sent to Sumo within the same HTTP request. Multiline processing rules are only applied within the bounds of the data sent within a single HTTP request. This means that a multiline log that is sent to Sumo across multiple HTTP requests will not be detected as a single message. It will be broken into separate log messages. Sumo does not currently have the ability to detect and thread together a distinct log message that has been sent via multiple HTTP requests. 

For tools to help you batch messages, see [Sumo Logic .NET Appenders]( https://github.com/SumoLogic/sumologic-net-appenders).
For tools to help you batch messages, see [Sumo Logic .NET Appenders](https://github.com/SumoLogic/sumologic-net-appenders).

For details on how the Collector processes multiline logs, see [Collecting Multiline Logs](/docs/send-data/reference-information/collect-multiline-logs).
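
As a rough sketch of the batching guidance above (the URL and messages are placeholders, and actual message splitting depends on the Source's multiline settings), several complete log messages can be sent in one POST body, one per line:

```bash
# Hypothetical example: batch several complete log messages into a single POST request.
# Each newline-separated line is typically treated as its own message; a multiline
# message must be kept whole within one request. Replace the URL with your Source's
# unique endpoint URL.
curl -s -X POST "https://endpoint.collection.sumologic.com/receiver/v1/http/UNIQUE_CODE" \
  --data-binary $'2024-01-01 12:00:00 INFO first message\n2024-01-01 12:00:01 WARN second message\n2024-01-01 12:00:02 ERROR third message'
```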