diff --git a/docs/integrations/app-development/jfrog-xray.md b/docs/integrations/app-development/jfrog-xray.md
index 94e15270ae..889d35e3d8 100644
--- a/docs/integrations/app-development/jfrog-xray.md
+++ b/docs/integrations/app-development/jfrog-xray.md
@@ -6,15 +6,15 @@ description: The JFrog Xray app provides visibility into the state of artifacts
---
import useBaseUrl from '@docusaurus/useBaseUrl';
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
-
The JFrog Xray app provides visibility into the state of artifacts and components in your JFrog Artifactory repository. The pre-configured dashboards present information about issues detected in your software components in Artifactory, including vulnerable containers, artifacts and components; license and security issues; and top Common Vulnerabilities and Exposures (CVEs). The app also helps identify all incoming threats detected via Sumo Logic Threat Intel.
The Sumo Logic app for JFrog Xray and its collection process have been tested with JFrog Xray version 2.9.0.
-
## Log types
The JFrog Xray app uses the following log types:
@@ -23,8 +23,6 @@ The JFrog Xray app uses the following log types:
* Artifactory logs. For more information, see [Collecting logs](/docs/integrations/app-development/jfrog-artifactory/#collecting-logs).
* Kubernetes logs. For more information, see [Collecting Metrics and Logs for the Kubernetes app](/docs/integrations/containers-orchestration/kubernetes#collecting-metrics-and-logs-for-the-kubernetes-app).
-
-
### Sample log messages
@@ -71,8 +69,6 @@ The JFrog Xray app uses the following log types:
}
```
-
-
### Sample queries
The following sample query is from the **Watches Invoked** panel of the **JFrog Xray - Overview** dashboard.
@@ -89,8 +85,49 @@ _sourceCategory = Labs/jfrog/xray
| json field=File "path", "depth", "sha256", "name", "parent_sha", "display_name", "pkg_type" as ComponentPath, ComponentDepth, ComponentSha, ComponentName, ComponentParentSha, ComponentDisplayName, ComponentPkgType nodrop
| count_distinct(WatchName) as %"Number of Watches"
```
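The aggregation in this query can be reproduced offline for quick experimentation. A minimal Python sketch, using hypothetical violation payloads whose field names are assumed to mirror the extraction in the query above:

```python
import json

# Hypothetical Xray violation payloads; the "Watch" field name is an
# assumption mirroring the WatchName extraction in the query above.
raw_events = [
    '{"Watch": "security-watch", "policy_name": "sec-policy"}',
    '{"Watch": "license-watch", "policy_name": "lic-policy"}',
    '{"Watch": "security-watch", "policy_name": "sec-policy"}',
]

# Equivalent of: | count_distinct(WatchName) as %"Number of Watches"
watch_names = {json.loads(event)["Watch"] for event in raw_events}
number_of_watches = len(watch_names)
print(number_of_watches)
```

Duplicate watch names collapse in the set, just as `count_distinct` collapses them in the Sumo Logic query.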
+## Collection configuration and app installation
+
+Choose one of the following methods to configure the JFrog Xray source and install the app:
+
+
+
+
+
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
+
+
+
+:::important
+Use the [Cloud-to-Cloud Integration for JFrog Xray](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/jfrog-xray-source/) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your JFrog Xray app is properly integrated and configured to collect and analyze your JFrog Xray data.
+:::
+
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
-## Collecting logs for JFrog Xray
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
+
+
+
+
+
+
This section explains how to collect logs from JFrog Xray and ingest them into Sumo Logic for use with the JFrog Xray pre-defined dashboards and searches. To get the most out of this app, we recommend that you also collect logs from Artifactory and Kubernetes.
@@ -104,21 +141,18 @@ Collect the following details:
* Port = **8000**
* Your Username and Password for your JFrog Xray instance
-
### Step 2: Collect Artifactory logs
We recommend collecting data from JFrog Artifactory so that you can investigate the sources of vulnerable artifacts and identify who is using them. This is done by correlating Xray logs with Artifactory logs.
To do so, follow the instructions in [Collect Logs for Artifactory](/docs/integrations/app-development/jfrog-artifactory#collecting-logs).
-
### Step 3: Collect Kubernetes logs
If you have set up a Docker repository in Artifactory and are running containers in a Kubernetes cluster, we recommend collecting data from your Kubernetes cluster so that you can identify all vulnerable containers running in production.
To perform this setup, follow the instructions in [Collect Logs for Kubernetes](/docs/integrations/containers-orchestration/kubernetes#collecting-metrics-and-logs-for-the-kubernetes-app).
-
### Step 4: Add Hosted Collector and HTTP Source
In this step you set up a hosted Sumo Logic collector and HTTP source to collect JFrog Xray logs.
@@ -131,7 +165,6 @@ To add a hosted collector and HTTP source:
1. Create a new Sumo Logic hosted collector by performing the steps in [Configure a Hosted Collector](/docs/send-data/hosted-collectors/configure-hosted-collector).
2. Create a new HTTP source on the hosted collector created above by following the instructions in [HTTP Logs and Metrics Source](/docs/send-data/hosted-collectors/http-source/logs-metrics).
-
### Step 5: Set up a collection method for JFrog Xray
This section covers the ways you can collect logs from JFrog Xray and send them to Sumo Logic. The logs are then shown in dashboards as part of the JFrog Xray app. You can configure a Sumo Logic collector for JFrog Xray in Amazon Web Services (AWS) using the AWS Lambda service, or use a script on a Linux machine with a cron job. Choose the method that best suits your environment:
@@ -144,18 +177,15 @@ In this collection method, you deploy the SAM application, which creates the nec
To deploy the Sumo Logic JFrog Xray SAM application, do the following:
1. Go to [https://serverlessrepo.aws.amazon.com/applications](https://serverlessrepo.aws.amazon.com/applications).
-2. Search for **sumologic-jfrog-xray** and make sure the checkbox **Show apps that create custom IAM roles or resource policies** is checked, and click the app link when it appears.
-
+1. Search for **sumologic-jfrog-xray**, select the **Show apps that create custom IAM roles or resource policies** checkbox, and click the app link when it appears.
1. When the page for the Sumo app appears, click **Deploy**.
-2. Go to the **AWS Lambda > Functions >** **Application Settings** panel, and enter parameters for the following fields:
-* **HTTPLogsEndpoint**: Copy and paste the URL for the HTTP log source from [Step 4](#step-4-add-hosted-collector-and-http-source).
-* **Hostname**: Copy and paste the Hostname from [Step 1](#step-1-collect-jfrog-xray-instance-details).
-* **Port**: Copy and paste the Port from [Step 1](#step-1-collect-jfrog-xray-instance-details).
-* **Username**: Copy and paste the Username from [Step 1](#step-1-collect-jfrog-xray-instance-details).
-* **Password**: Copy and paste the Password from [Step 1](#step-1-collect-jfrog-xray-instance-details).
-
-1. Click **Deploy.**
-
+1. Go to the **AWS Lambda > Functions > Application Settings** panel, and enter parameters for the following fields:
+ * **HTTPLogsEndpoint**. Copy and paste the URL for the HTTP log source from [Step 4](#step-4-add-hosted-collector-and-http-source).
+ * **Hostname**. Copy and paste the Hostname from [Step 1](#step-1-collect-jfrog-xray-instance-details).
+ * **Port**. Copy and paste the Port from [Step 1](#step-1-collect-jfrog-xray-instance-details).
+ * **Username**. Copy and paste the Username from [Step 1](#step-1-collect-jfrog-xray-instance-details).
+ * **Password**. Copy and paste the Password from [Step 1](#step-1-collect-jfrog-xray-instance-details).
+5. Click **Deploy**.
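If you script the deployment rather than clicking through the console, the same five settings map to CloudFormation parameter overrides. A hypothetical `samconfig.toml` sketch for the AWS SAM CLI (everything besides the five parameter names listed above is a placeholder assumption, including the stack name and all values):

```toml
# Illustrative sketch only -- values are placeholders, not real credentials.
version = 0.1
[default.deploy.parameters]
stack_name = "sumologic-jfrog-xray"
capabilities = "CAPABILITY_IAM"
parameter_overrides = "HTTPLogsEndpoint=https://collectors.sumologic.com/receiver/v1/http/XXXX Hostname=xray.example.com Port=8000 Username=admin Password=changeme"
```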
#### Optional - Configure multiple JFrog Xray instances
@@ -182,7 +212,6 @@ sudo su
```
* A Linux machine compatible with either Python 3.7 or Python 2.7
-
#### Step 1. Configure the script on a Linux machine
This task shows you how to install the script on a Linux machine.
@@ -191,29 +220,27 @@ For Python 3 you will use pip3 install **sumologic-jfrog-xray** (step 3 in the f
To deploy the script, do the following:
1. If **pip** is not already installed, follow the instructions in the [pip documentation](https://pip.pypa.io/en/stable/installing/) to download and install **pip**.
-2. Log in to a Linux machine compatible with either Python 3.7 or Python 2.7.
-3. Do one of the following:
-* For Python 2 - run the following command:
+1. Log in to a Linux machine compatible with either Python 3.7 or Python 2.7.
+1. Do one of the following:
+ * For Python 2 - run the following command:
```bash
pip install sumologic-jfrog-xray
```
-* For Python 3 - run the following command:
+ * For Python 3 - run the following command:
```bash
pip3 install sumologic-jfrog-xray
```
-1. Create a configuration file **jfrogxraycollector.yaml** in the home directory as shown below, and fill in the parameter `` where indicated.
-
-1. Create a cron job to run the collector every 5 minutes, (use the crontab -e option), in one of the following ways:
-* For Python 2 - add the following line in your crontab:
+4. Create a configuration file **jfrogxraycollector.yaml** in the home directory as shown below, and fill in the parameter `` where indicated.
+5. Create a cron job to run the collector every 5 minutes (use the `crontab -e` option) in one of the following ways:
+ * For Python 2 - add the following line in your crontab:
```bash
*/5 * * * * /usr/bin/python -m sumojfrogxray.main > /dev/null 2>&1
```
-* For Python 3 - add the following line in your crontab:
+ * For Python 3 - add the following line in your crontab:
```bash
*/5 * * * * /usr/bin/python3 -m sumojfrogxray.main > /dev/null 2>&1
```
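The configuration file from step 4 is keyed by section. A hypothetical `jfrogxraycollector.yaml` layout is sketched below; only `TIMEOUT` (Collection section) and `HTTP_LOGS_ENDPOINT` (Sumo Logic section) appear in the variables table later in this section, so every other section and key name here is an assumption:

```yaml
# Illustrative sketch -- section and key names other than TIMEOUT and
# HTTP_LOGS_ENDPOINT are assumptions; replace all placeholder values.
JFrogXray:
  HOSTNAME: xray.example.com
  PORT: 8000
  USERNAME: admin
  PASSWORD: changeme
Collection:
  TIMEOUT: 60
SumoLogic:
  HTTP_LOGS_ENDPOINT: https://collectors.sumologic.com/receiver/v1/http/XXXX
```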
-
**Optional - Configure collection for multiple projects**
If you have multiple projects from which you want to collect logs and send to Sumo Logic, perform the following task.
@@ -241,10 +268,9 @@ This section provides a list of variables for Jfrog Xray that you can define in
| TIMEOUT in Collection Section | Request timeout used by the requests library. |
| HTTP_LOGS_ENDPOINT in Sumo Logic Section | HTTP source endpoint URL created in Sumo Logic for ingesting logs. |
-
-## Troubleshooting
+### Troubleshooting
This section shows you how to run the function manually and then verify that log messages are being sent from JFrog Xray.
@@ -265,11 +291,13 @@ sudo yum -y install gcc
sudo yum install python-devel
```
-## Installing the JFrog Xray app
+### Installing the JFrog Xray app
import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+
+
## Viewing JFrog Xray dashboards
diff --git a/docs/integrations/google/bigquery.md b/docs/integrations/google/bigquery.md
index 6eb8956f77..2921a7e2af 100644
--- a/docs/integrations/google/bigquery.md
+++ b/docs/integrations/google/bigquery.md
@@ -6,6 +6,8 @@ description: The Google BigQuery App helps you monitor data and activity in your
---
import useBaseUrl from '@docusaurus/useBaseUrl';
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
@@ -28,8 +30,49 @@ _sourceCategory=*gcp* logName resource "type":"bigquery_resource"
| transpose row _timeslice column project
```
+## Collection configuration and app installation
-## Collecting logs for the Google BigQuery app
+Choose one of the following methods to configure the Google BigQuery source and install the app:
+
+
+
+
+
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
+
+
+
+:::important
+Use the [Cloud-to-Cloud Integration for Google BigQuery](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/google-bigquery-source/) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your Google BigQuery app is properly integrated and configured to collect and analyze your Google BigQuery data.
+:::
+
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
+
+
+
+
+
+
This section describes the Sumo pipeline for ingesting logs from Google Cloud Platform (GCP) services, and provides instructions for configuring log collection for the Google BigQuery App.
@@ -41,19 +84,19 @@ The GCP service generates logs which are exported and published to a Google Pub/
-### Configuring collection for GCP uses the following process:
+### Configuring collection for GCP
-1. Configure a GCP source on a hosted collector. You'll obtain the **HTTP URL for the source**.
-2. Create a topic in Google Pub/Sub and subscribe the GCP source URL to that topic.
-3. Create an export of GCP logs from Google Stackdriver Logging. Exporting involves writing a filter that selects the log entries you want to export, and choosing a Pub/Sub as the destination. The filter and destination are held in an object called a sink.
+Configuring collection for GCP uses the following process:
-See the following sections for configuration instructions.
+[Step 1: Configure a Google Cloud Platform Source](#step-1-configure-a-google-cloud-platform-source). Configure a GCP source on a hosted collector. You'll obtain the **HTTP URL for the source**.
+[Step 2: Configure a Pub/Sub Topic for GCP](#step-2-configure-a-pubsub-topic-for-gcp). Create a topic in Google Pub/Sub and subscribe the GCP source URL to that topic.
+[Step 3: Create export of Google BigQuery logs from Google Logging](#step-3-create-export-of-google-bigquery-logs-from-google-logging). Create an export of GCP logs from Google Stackdriver Logging. Exporting involves writing a filter that selects the log entries you want to export, and choosing a Pub/Sub as the destination. The filter and destination are held in an object called a sink.
:::note
Logs from GCP services can be [exported](https://cloud.google.com/logging/docs/export/configure_export_v2) to any destination, including Stackdriver. It is not required to push GCP logs into Stackdriver for the Sumo Logic apps to work. Any GCP logs can be [excluded](https://cloud.google.com/logging/docs/exclusions) from Stackdriver logging and still be [exported](https://cloud.google.com/logging/docs/export/) to Sumo Logic.
:::
-### Configure a Google Cloud Platform Source
+#### Step 1: Configure a Google Cloud Platform Source
The Google Cloud Platform (GCP) Source receives log data from Google Pub/Sub.
@@ -68,21 +111,20 @@ This Source will be a Google Pub/Sub-only Source, which means that it will only
1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Collection > Collection**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the Sumo Logic top menu select **Configuration**, and then under **Data Collection** select **Collection**. You can also click the **Go To...** menu at the top of the screen and select **Collection**.
2. Select an existing Hosted Collector upon which to add the Source. If you do not already have a Collector you'd like to use, create one, using the instructions on [Configure a Hosted Collector](/docs/send-data/hosted-collectors/configure-hosted-collector).
3. Click **Add Source** next to the Hosted Collector and click **Google Cloud Platform**.
-4. Enter a **Name** to display for the Source. A **Description** is optional.
+4. Enter a **Name** to display for the Source. A **Description** is optional.
5. **Source Host** (Optional). The Source Host value is tagged to each log and stored in a searchable [metadata](/docs/search/get-started-with-search/search-basics/built-in-metadata) field called _sourceHost. Avoid using spaces so you do not have to quote them in [keyword search expressions](/docs/search/get-started-with-search/build-search/keyword-search-expressions.md). This can be a maximum of 128 characters.
6. **Source Category** (Optional). The Source Category value is tagged to each log and stored in a searchable [metadata](/docs/search/get-started-with-search/search-basics/built-in-metadata) field called `_sourceCategory`. See our [Best Practices: Good Source Category, Bad Source Category](/docs/send-data/best-practices). Avoid using spaces so you do not have to quote them in [keyword search expressions](/docs/search/get-started-with-search/build-search/keyword-search-expressions.md). This can be a maximum of 1,024 characters.
7. **Fields**. Click the **+Add Field** link to add custom log metadata [Fields](/docs/manage/fields), then define the fields you want to associate. Each field needs a name (key) and value. Look for one of the following icons and act accordingly:
*  If an orange triangle with an exclamation point is shown, use the option to automatically add or enable the nonexistent fields before proceeding to the next step. The orange icon indicates that the field doesn't exist, or is disabled, in the Fields table schema. If a field is sent to Sumo Logic that does not exist in the Fields schema, or is disabled, it is ignored (dropped).
*  If a green circle with a checkmark is shown, the field exists and is already enabled in the Fields table schema. Proceed to the next step.
-8. **Advanced Options for Logs**.
+8. **Advanced Options for Logs**.
* **Timestamp Parsing**. This option is selected by default. If it's deselected, no timestamp information is parsed at all.
* **Time Zone**. There are two options for Time Zone. You can use the time zone present in your log files, and then choose an option in case time zone information is missing from a log message. Or, you can have Sumo Logic completely disregard any time zone information present in logs by forcing a time zone. It's very important to have the proper time zone set, no matter which option you choose. If the time zone of logs cannot be determined, Sumo Logic assigns logs UTC; if the rest of your logs are from another time zone your search results will be affected.
* **Timestamp Format**. By default, Sumo Logic will automatically detect the timestamp format of your logs. However, you can manually specify a timestamp format for a Source. See [Timestamps, Time Zones, Time Ranges, and Date Formats](/docs/send-data/reference-information/time-reference) for more information.
9. **Processing Rules**. Configure any desired filters, such as allowlist, denylist, hash, or mask, as described in [Create a Processing Rule](/docs/send-data/collection/processing-rules/create-processing-rule).
10. When you are finished configuring the Source, click **Save**.
-
-### Configure a Pub/Sub Topic for GCP
+#### Step 2: Configure a Pub/Sub Topic for GCP
You need to configure a Pub/Sub Topic in GCP and add a subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. Once you configure the Pub/Sub, you can export data from Google Logging to the Pub/Sub. For example, you can export Google App Engine logs, as described on [Collect Logs for Google App Engine](/docs/integrations/google/app-engine#collecting-logs-for-the-google-app-engine-app).
@@ -90,8 +132,7 @@ You need to configure a Pub/Sub Topic in GCP and add a subscription to the Sourc
2. Create a Pub/Sub subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. See [Google Cloud documentation](https://cloud.google.com/pubsub/docs/admin#creating_subscriptions) for the latest configuration steps.
* Use a **Push Delivery Method** to the Sumo Logic Source URL. To determine the URL, navigate to the Source on the **Collection** page in Sumo Logic and click **Show URL**.
-
-### Limitations
+##### Limitations
Google limits the volume of data sent from a Topic. Our testing resulted in the following data limits:
@@ -108,22 +149,20 @@ We recommend the following:
* Shard messages across topics within the above data limits.
* Ask GCP to increase the allowable capacity for the topic.
-
-### Create export of Google BigQuery logs from Google Logging
+#### Step 3: Create export of Google BigQuery logs from Google Logging
In this step you export logs to the Pub/Sub topic you created in the previous step.
-1. Go to **Logging** and click **Logs Router**.
+1. Go to **Logging** and click **Logs Router**.
2. Click **Create Sink**.
3. As part of **Create logs routing sink**, add the following information.
1. Enter a Sink Name. For example, "gce-vm-instance".
2. Select "Cloud Pub/Sub" as the **Sink Service**.
3. Set **Sink Destination** to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, "pub-sub-logs".
- 4. In **Choose logs to include in sink** section for resource_type, replace "``" with "`bigquery_resource`".
+ 4. In the **Choose logs to include in sink** section, for resource_type, replace "``" with "`bigquery_resource`".
5. Click **Create Sink**.
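For reference, the inclusion filter configured in the step above, written out in the Cloud Logging query language, is simply the following (a minimal form; production sinks often add further conditions such as a `logName` match):

```
resource.type="bigquery_resource"
```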
-
-## Installing the Google BigQuery app
+### Installing the Google BigQuery app
Now that you have set up log collection, you can install the Google BigQuery App to use the pre-configured searches and dashboards that provide visibility into your environment for real-time analysis of overall usage.
@@ -131,6 +170,9 @@ import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+
+
+
## Viewing Google BigQuery dashboards
import ViewDashboards from '../../reuse/apps/view-dashboards.md';
diff --git a/docs/integrations/saas-cloud/crowdstrike-falcon-filevantage.md b/docs/integrations/saas-cloud/crowdstrike-falcon-filevantage.md
index b36f852326..db02ce151f 100644
--- a/docs/integrations/saas-cloud/crowdstrike-falcon-filevantage.md
+++ b/docs/integrations/saas-cloud/crowdstrike-falcon-filevantage.md
@@ -113,15 +113,33 @@ _sourceCategory="Labs/CrowdStrikeFalconFileVantage" entity_type file
| sort by frequency, action_type
```
-## Set up collection
+## Collection configuration and app installation
-Follow the instructions provided to set up [Cloud-to-Cloud Integration for CrowdStrike Falcon FileVantage Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/crowdstrike-filevantage-source/) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your CrowdStrike Falcon FileVantage app is properly integrated and configured to collect and analyze your CrowdStrike Falcon FileVantage data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the CrowdStrike Falcon FileVantage app
+
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+:::important
+Use the [Cloud-to-Cloud Integration for CrowdStrike Falcon FileVantage](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/crowdstrike-filevantage-source/) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your CrowdStrike Falcon FileVantage app is properly integrated and configured to collect and analyze your CrowdStrike Falcon FileVantage data.
+:::
+
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
-
+
## Viewing CrowdStrike Falcon FileVantage dashboards
diff --git a/docs/integrations/saas-cloud/kandji.md b/docs/integrations/saas-cloud/kandji.md
index 1e020539f2..0e39e4e13b 100644
--- a/docs/integrations/saas-cloud/kandji.md
+++ b/docs/integrations/saas-cloud/kandji.md
@@ -293,15 +293,33 @@ _sourceCategory="Labs/kandji" details
| sort by frequency,action_type
```
-## Set up collection
+## Collection configuration and app installation
-To set up the [Kandji Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/kandji-source) for the Kandji app, follow the instructions provided. These instructions will guide you through the process of creating a source using the Kandji Source category, which you will need to use when installing the app. By following these steps, you can ensure that your Kandji app is properly integrated and configured to collect and analyze your Kandji data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the Kandji app
+
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+:::important
+Use the [Cloud-to-Cloud Integration for Kandji](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/kandji-source) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your Kandji app is properly integrated and configured to collect and analyze your Kandji data.
+:::
+
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
-
+
## Viewing Kandji dashboards
diff --git a/docs/integrations/saas-cloud/palo-alto-cortex-xdr.md b/docs/integrations/saas-cloud/palo-alto-cortex-xdr.md
index ab07a52cd1..b7d370844e 100644
--- a/docs/integrations/saas-cloud/palo-alto-cortex-xdr.md
+++ b/docs/integrations/saas-cloud/palo-alto-cortex-xdr.md
@@ -160,17 +160,33 @@ _sourceCategory="palo_alto_cortex_xdr" "incident_id" "incident_name"
```
-## Set up collection
+## Collection configuration and app installation
-Prior to installing the Palo Alto Cortex XDR app, you'll first need to set up the source by following the instructions provided at [Cloud-to-Cloud Integration Palo Alto Cortex XDR Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/palo-alto-cortex-xdr-source). By following these steps, you can ensure that your Palo Alto Cortex XDR app is properly integrated and configured to collect and analyze your Palo Alto Cortex XDR data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the Palo Alto Cortex XDR app
+
-This section has instructions for installing the Sumo Logic app for Palo Alto Cortex XDR.
+:::important
+Use the [Cloud-to-Cloud Integration for Palo Alto Cortex XDR](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/palo-alto-cortex-xdr-source) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your Palo Alto Cortex XDR app is properly integrated and configured to collect and analyze your Palo Alto Cortex XDR data.
+:::
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+### Create a new collector and install the app
-
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
+
+
## Viewing Palo Alto Cortex XDR dashboards
diff --git a/docs/integrations/saas-cloud/symantec-web-security-service.md b/docs/integrations/saas-cloud/symantec-web-security-service.md
index aa0024ee17..16f8d050f2 100644
--- a/docs/integrations/saas-cloud/symantec-web-security-service.md
+++ b/docs/integrations/saas-cloud/symantec-web-security-service.md
@@ -61,15 +61,33 @@ _sourceCategory=swssDev
| count_distinct(id)
```
-## Set up collection
+## Collection configuration and app installation
-To set up [Cloud-to-Cloud Integration Symantec Web Security Service Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/symantec-web-security-service-source/) for the Symantec Web Security Service App, follow the instructions provided. These instructions will guide you through the process of creating a source using the Symantec Web Security Service Source category, which you will need to use when installing the app. By following these steps, you can ensure that your Symantec Web Security Service app is properly integrated and configured to collect and analyze your Symantec Web Security Service data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the Symantec Web Security Service app
+
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+:::important
+Use the [Cloud-to-Cloud Integration for Symantec Web Security Service](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/symantec-web-security-service-source) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your Symantec Web Security Service app is properly integrated and configured to collect and analyze your Symantec Web Security Service data.
+:::
-
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
+
+
## Viewing Symantec Web Security Service dashboards
diff --git a/docs/integrations/saas-cloud/trend-micro-vision-one.md b/docs/integrations/saas-cloud/trend-micro-vision-one.md
index 67f6a4c4d7..dc56c67b75 100644
--- a/docs/integrations/saas-cloud/trend-micro-vision-one.md
+++ b/docs/integrations/saas-cloud/trend-micro-vision-one.md
@@ -222,15 +222,33 @@ _sourceCategory="Labs/TrendMicroVisionOne"
| count
```
-## Set up collection
+## Collection configuration and app installation
-To set up the [Trend Micro Vision One Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/trend-micro-source) for the Trend Micro Vision One app, follow the instructions provided. These instructions will guide you through the process of creating a source using the Trend Micro Vision One Source category, which you will need to use when installing the app. By following these steps, you can ensure that your Trend Micro Vision One app is properly integrated and configured to collect and analyze your Alerts data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the Trend Micro Vision One app
+
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+:::important
+Use the [Cloud-to-Cloud Integration for Trend Micro Vision One](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/trend-micro-source) to create the source and use the same source category while installing the app. By following these steps, you can ensure that your Trend Micro Vision One app is properly integrated and configured to collect and analyze your Trend Micro Vision One data.
+:::
+
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
-
+
## Viewing the Trend Micro Vision One dashboards
diff --git a/docs/integrations/saas-cloud/webex.md b/docs/integrations/saas-cloud/webex.md
index 7329ebf5aa..36fcae3d71 100644
--- a/docs/integrations/saas-cloud/webex.md
+++ b/docs/integrations/saas-cloud/webex.md
@@ -78,17 +78,33 @@ _sourceCategory="cisco_webex"
| count
```
-## Set up collection
+## Collection configuration and app installation
-To set up the Webex Cloud-to-Cloud Integration for the Webex app, follow the instructions provided [here](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/webex-source/). These instructions will guide you through the process of creating a source using the Webex Source category, which you will need to use when installing the app. By following these steps, you can ensure that your Webex app is properly integrated and configured to collect and analyze your Webex data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the Webex app
+
-This section has instructions for installing the Webex app for Sumo Logic and descriptions of each of the dashboards.
+:::important
+Use the [Cloud-to-Cloud Integration for Webex](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/webex-source/) to create the source, then use the same source category when installing the app. This ensures that your Webex app is properly integrated and configured to collect and analyze your Webex data.
+:::
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+### Create a new collector and install the app
-
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
+
+
## Viewing Webex dashboards
diff --git a/docs/integrations/security-threat-detection/carbon-black-cloud.md b/docs/integrations/security-threat-detection/carbon-black-cloud.md
index 26b95c9464..a188fdfac9 100644
--- a/docs/integrations/security-threat-detection/carbon-black-cloud.md
+++ b/docs/integrations/security-threat-detection/carbon-black-cloud.md
@@ -59,15 +59,33 @@ _sourceCategory=CBCloud WATCHLIST
| count
```
-## Set up collection
+## Collection configuration and app installation
-To set up [Cloud-to-Cloud Integration Carbon Black Cloud Source](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/carbon-black-cloud-source) for the Carbon Black Cloud app, follow the instructions provided. These instructions will guide you through the process of creating a source using the Carbon Black Cloud source category, which you will need to use when installing the app. By following these steps, you can ensure that your Carbon Black Cloud app is properly integrated and configured to collect and analyze your Carbon Black Cloud data.
+import CollectionConfiguration from '../../reuse/apps/collection-configuration.md';
-## Installing the Carbon Black Cloud app
+
-import AppInstall2 from '../../reuse/apps/app-install-v2.md';
+:::important
+Use the [Cloud-to-Cloud Integration for Carbon Black Cloud](/docs/send-data/hosted-collectors/cloud-to-cloud-integration-framework/carbon-black-cloud-source) to create the source, then use the same source category when installing the app. This ensures that your Carbon Black Cloud app is properly integrated and configured to collect and analyze your Carbon Black Cloud data.
+:::
-
+### Create a new collector and install the app
+
+import AppCollectionOPtion1 from '../../reuse/apps/app-collection-option-1.md';
+
+
+
+### Use an existing collector and install the app
+
+import AppCollectionOPtion2 from '../../reuse/apps/app-collection-option-2.md';
+
+
+
+### Use an existing source and install the app
+
+import AppCollectionOPtion3 from '../../reuse/apps/app-collection-option-3.md';
+
+
## Viewing Carbon Black Cloud dashboards
diff --git a/static/img/integrations/google/Google-Bigquery-Overview.png b/static/img/integrations/google/Google-Bigquery-Overview.png
index 70dfc69e5e..15aa2378d5 100644
Binary files a/static/img/integrations/google/Google-Bigquery-Overview.png and b/static/img/integrations/google/Google-Bigquery-Overview.png differ
diff --git a/static/img/integrations/google/Google-Bigquery-Queries.png b/static/img/integrations/google/Google-Bigquery-Queries.png
index 9052966e14..cce3850dee 100644
Binary files a/static/img/integrations/google/Google-Bigquery-Queries.png and b/static/img/integrations/google/Google-Bigquery-Queries.png differ
diff --git a/static/img/integrations/google/Google-Bigquery-Users.png b/static/img/integrations/google/Google-Bigquery-Users.png
index 10306bd9df..be9efcc8dd 100644
Binary files a/static/img/integrations/google/Google-Bigquery-Users.png and b/static/img/integrations/google/Google-Bigquery-Users.png differ
diff --git a/static/img/integrations/google/Google-Bigquery-mgmt.png b/static/img/integrations/google/Google-Bigquery-mgmt.png
index 9cda29bd8c..1057013602 100644
Binary files a/static/img/integrations/google/Google-Bigquery-mgmt.png and b/static/img/integrations/google/Google-Bigquery-mgmt.png differ