diff --git a/docs/cse/automation/cloud-siem-automation-examples.md b/docs/cse/automation/cloud-siem-automation-examples.md
index 75e7dfbd35..3f1f052e74 100644
--- a/docs/cse/automation/cloud-siem-automation-examples.md
+++ b/docs/cse/automation/cloud-siem-automation-examples.md
@@ -153,7 +153,7 @@ The following example shows how to configure a notification that sends an email
## Advanced example: Configure a custom integration
-The following example shows how to create a custom integration with an action that runs a script you provide. The custom integration and action are defined by YAML files. To learn how to build your own YAML files, see [Integration framework file formats](/docs/platform-services/automation-service/automation-service-integration-framework/#integration-framework-file-formats).
+The following example shows how to create a custom integration with an action that runs a script you provide. The custom integration and action are defined by YAML files. To learn how to build your own YAML files, see [Integration framework file formats](/docs/platform-services/automation-service/integration-framework/about-integration-framework/#integration-framework-file-formats).
The action uses [IP Quality Score](https://www.ipqualityscore.com/) to gather IP reputation information for enrichment. (This example shows how to add enrichment to an insight. To use the same action to add enrichment to entities, see [Add entity enrichment](#add-entity-enrichment) below.)
diff --git a/docs/platform-services/automation-service/automation-service-audit-logging.md b/docs/platform-services/automation-service/automation-service-audit-logging.md
index 604b622fb1..b65cb0d462 100644
--- a/docs/platform-services/automation-service/automation-service-audit-logging.md
+++ b/docs/platform-services/automation-service/automation-service-audit-logging.md
@@ -59,7 +59,7 @@ The table below shows the `_sourceCategory` that is assigned to Audit Event Inde
| [Automation action](/docs/platform-services/automation-service/automation-service-playbooks/#add-an-action-node-to-a-playbook) | `oarAutomationActions` |
| [Automation action configuration](/docs/platform-services/automation-service/automation-service-playbooks/#add-an-action-node-to-a-playbook) | `oarAutomationActionConfigurations` |
| [Integration](/docs/platform-services/automation-service/automation-service-integrations/) | `oarIntegrations` |
-| [Integration resource](/docs/platform-services/automation-service/automation-service-integration-framework/) | `oarIntegrationResources` |
+| [Integration resource](/docs/platform-services/automation-service/integration-framework/) | `oarIntegrationResources` |
| [Playbook execution](/docs/platform-services/automation-service/automation-service-playbooks/) | `oarPlaybookExecutions` |
| [Playbook revision](/docs/platform-services/automation-service/automation-service-playbooks/) | `oarPlaybookRevisions` |
@@ -70,7 +70,7 @@ The table below shows the `_sourceCategory` that is assigned to Audit Event Inde
| Product Feature | _sourceCategory Value |
|:--|:--|
| [Custom Field](/docs/cloud-soar/overview/#custom-fields) | `oarCustomFields` |
-| [Daemon](/docs/platform-services/automation-service/automation-service-integration-framework/#daemon-action-definitions) | `oarDaemons` |
+| [Daemon](/docs/platform-services/automation-service/integration-framework/about-integration-framework/#daemon-action-definitions) | `oarDaemons` |
| [Dashboard](/docs/cloud-soar/incidents-triage/#dashboards) | `oarDashboards` |
| Email | `oarEmails` |
| [Entity](/docs/cloud-soar/incidents-triage/#entities) | `oarEntities` |
diff --git a/docs/platform-services/automation-service/automation-service-integration-framework.md b/docs/platform-services/automation-service/automation-service-integration-framework.md
deleted file mode 100644
index de6a8812b1..0000000000
--- a/docs/platform-services/automation-service/automation-service-integration-framework.md
+++ /dev/null
@@ -1,1889 +0,0 @@
----
-id: automation-service-integration-framework
-title: Integration Framework
-sidebar_label: Integration Framework
-description: Learn about the framework used for integrations.
----
-
-import useBaseUrl from '@docusaurus/useBaseUrl';
-
-The Automation Service allows you to develop and extend integrations using a common and open framework. This article describes the framework and provides examples.
-
-Because the Automation Service is a subset of automation capabilities adapted from Cloud SOAR, some of the functionality documented below is unique to [Cloud SOAR automation](/docs/cloud-soar/automation/).
-
-## Integration file hierarchy
-
-Integrations are defined using two types of text files. The first type, the integration definition file, is used to define the properties of the product which the integration connects. This includes information such as the name, logo, connection parameters, test code, and the Docker container used to execute the actions. One integration definition file is required for each integration and serves as a container for all the actions that the integration will perform.
-
-The second type of file is an action definition file, which is used to define a single action that will be performed using the integration. Each integration action is defined in a separate action definition file, which will be associated with the appropriate integration definition. Action definition files are the files which contain the actual code which will be executed to perform the action. Supported languages include Perl, Python, PowerShell, and Bash. In addition to the action code, action definition files also contain information such as the name, required and optional fields, and the format in which the resulting information will be displayed.
-
-The following diagram shows the integration file hierarchy:
-
-
-
-Defining integrations at the action level allows users have greater flexibility in customizing existing integrations and sharing new actions with other users. For example, you may choose to extend the existing RSA NetWitness integration to include an additional action which retrieves all network connections for a given host. Once you create this new action, you can easily add it to the existing RSA Netwitness integration by uploading the new integration action file.
-
-You can also share this new action and use it to extend the functionality of the integration for others. The following diagram shows action file portability:
-
-
-
-## Integration framework file formats
-
-Both the integration definition file and the action definition file are YAML files. The following sections highlight the formats for each file type. [Example files](#example-files) below contains samples of completed integration definition and action definition files as a reference. To see YAML files used in a working integration, see an example for Cloud SIEM in [Advanced example: Configure a custom integration](/docs/cse/automation/cloud-siem-automation-examples/#advanced-example-configure-a-custom-integration).
-
-### Integration definition file format
-
-`*` Required fields
-
-* **name** `*` [String]: Name displayed in the UI. It must match the `integration` field of each action definition file added to the integration.
-* **official_name** `*` [String] (Cloud SOAR only): To modify the display name of an integration in the Cloud SOAR UI while ensuring the actions YAML remains valid, set `official_name=OLD-NAME` and `name=NEW-NAME`.
-* **version** `*` [String]: File version number.
-* **icon** `*` [Base64 String]: Integration logo.
-* **script** `*`:
- * **type** `*` [String]: Indicates which code parser should be used to execute the code within the integration and action definition files. All action definition files for the integration must use the same code language as defined in the integration definition file. Acceptable values are:
- * `bash`
- * `perl`
- * `powershell`
- * `python`
- * **test_connection_code** `*` [String]: Code which can be used to test the integration through the UI by clicking on Test Saved Settings. Exiting with a value of `0` indicates success, while any other value will indicate failure.
-* **docker_repo_tag** `*` [String]: Docker repository tag of the image build the new container is from. Can be from any local or remote repository configured on the server.
-* **local_repo** [Boolean] (Cloud SOAR only): Indicates that the Docker image is a local one and not one present in the repository.
-* **configuration** `*`:
- * **testable_connection** `*` [Boolean]: Is test code present (true/false).
- * **require_proxy_config** `*` [Boolean]: True/false value indicating whether a proxy configuration tab should be available in the UI for the integration. If the value is set to true and a proxy is configured in the UI, the parameter `proxy_url` will be passed to the code on execution as an environment variable.
- * **data_attributes** `*`: Fields required for configuration.
- * **`
-
-The JSON view will display the entire output of the integration action in JSON format:
-
-
-
-Following is the setting for a link type:
-
-```
-- display_name: 'CVSS'
-value : 'cvss'
-type : 'link'
-```
-
-
-
-### Added more output type for action
-
-It's possible to specify a JSON path or use rawOutput to specify text output to use as `srcDoc` for iframe sandbox (it is not possible to use JavaScript):
-
-```
-integration: 'Incident tools'
-name: 'intervallo date 3ededed'
-type: Enrichment
-script:
- code: |
- [....]
- art = '''
-
-
-
-
-
-
-The `image_base64_png(jpg)` field provides the result path where to get base64 png or jpg image, for example:
-
-
-
-## Working with integrations
-
-All integrations are configured by navigating to **Integrations** in the UI.
-
-### Integration definitions
-
-To add a new integration, click on the **+** icon on the **Integrations** page.
-
-
-
-The **New Integration** window allows you to upload an integration definition file by clicking **Select File**. Once you define the integration definition file, click **Save** to add the new integration.
-
-
-
-To edit an existing integration by uploading a new integration definition file , click on the **Edit** button. To export the integration definition file for the selected integration, click the **Export** icon.
-
-### Action definitions
-
-To add a new action, select the appropriate integration from the integrations list, then click on the **Upload** button to the right of the integration.
-
-The **New Action** window allows you to upload an action definition file by clicking **Select File**, and lets you select the kind of action.
-
-
-
-Once the action definition file has been selected, click **Save** to add the new action.
-
-Existing actions may be edited by clicking the **Upload** button below the action name to upload a new action definition file, or by clicking the **Edit** button below the action name to open a text editor and edit the action directly.
-
-
-
-To test an action, click on the **Test Action** button below the action name.
-
-
-
-Enter the required parameters and click **Test Action**.
-
-
-
-To export an action, click on the **Export** button below the action name.
-
-### Action definitions for Cloud SOAR
-
-The following action definitions are for Cloud SOAR only.
-
-#### Daemon action definitions
-
-Uploading an action YAML file with type Daemon allows you to specify Daemon action. You can also define rules associated with Daemon.
-
-
-
-Daemon action must return an array of objects in JSON format:
-
-```
-[{ 'a': 'a1', 'b': 'b1' }, { 'a': 'a2', 'b': 'b2' }]
-```
-
-Every object is processed by filter and action. It is also possible to define which output field should be passed to the next script run and an extra param key value pair to specialize each rule:
-
-
-
-All available actions are:
-* Create incident from template
-* Update incident
-* Close Incident
-* Change incident status
-* Add events to an existing incident
-* Change task progress
-* Close task
-* Add to Triage
-
-#### Scheduled action definitions
-
-A _Scheduled action_ represents a particular type of action when the execution is iterated until a specific exit condition is met. This type of action permits you to create loops in a playbook.
-
-YAML example:
-
-```
- integration: 'Incident tools'
- name: 'intervallo date loop'
- type: Scheduled
- script:
- code: |
- [......]
- exit_condition:
- - path: 'exit_condition'
- string: 'false'
- re-execution: 'force'
- scheduled:
- - every: '10s'
- expire: '120s'
- output:
- - path : 'exit_condition'
-```
-
-Or using strings array:
-
-```
- integration: 'Incident tools'
- name: 'intervallo date loop'
- type: Scheduled
- script:
- code: |
- [......]
- exit_condition:
- - path: 'exit_condition'
- string:
- - 'Open'
- - 'Pending'
- - 'Waiting'
- re-execution: 'force'
- scheduled:
- - every: '10s'
- expire: '120s'
- output:
- - path : 'exit_condition'
-```
-
-Or using action's input:
-
-:::note
-If you use an action's input, this input field should be `required = true*`.
-:::
-
-```
-integration: 'Testing Purpose'
-name: 'testing Scheduled'
-type: Scheduled
-script:
- code: |
- [...]
-exit_condition:
- - path: 'input.exit_condition_path'
- string: "input.exit_condition_string"
-scheduled:
- - every: 'input.scheduler_every'
- expire: 'input.scheduler_expire'
-fields:
- - id: scheduler_every
- label: 'scheduler rate'
- type: text
- required: true
- hint: "schedule rate i.e 1m 5m 1d (supported placeholder m=minutes, h=hours, d=days)"
- - id: scheduler_expire
- label: 'schedule expiration'
- type: text
- required: true
- hint: "schedule expiration i.e 1m 5m 1d (supported placeholder m=minutes, h=hours, d=days)"
- - id: exit_condition_path
- label: 'output path'
- type: text
- required: true
- hint: "output path to check"
- - id: exit_condition_string
- label: 'string to check'
- type: tag
- required: true
- hint: "string to check"
-output:
- - path : '[]."ip-dst_string"'
- - path : '[].{Name: name, ID: _id, Address: address, FriendName: friends.[].name}'
- - path : '[].tags.[] | unique()'
- - path : '[].tags.[]'
- - path : '[].guid | join(,)'
- - path : '[].guid | join(SEPARATOR)'
- - path : '[].guid'
- - path : '[]._id'
- - path : '[].guid'
- - path : '[].isActive'
- - path : '[].balance'
- - path : '[].picture'
- - path : '[].eyeColor'
- - path : '[].name'
- - path : '[].age'
- - path : '[].gender'
- - path : '[].company'
- - path : '[].email'
- - path : '[].phone'
- - path : '[].address'
- - path : '[].friends'
-```
-
-Field notes:
-* **re-execution**
-
-Depending on the logic that you want to implement in your triggers, specify a list of one or more [hooks](#trigger-hooks) in the trigger YAML file. Each hook represents a manual event or API endpoint that can invoke the trigger. For example, by specifying the hook `updateIncident` inside a trigger, the trigger will fire whenever the field of any incident is updated either manually from the UI or via the API.
-
-Triggers function as individual actions, executed in the backend, without the capability to review the execution output in the GUI except for triggers on entities (observables). If a trigger fails, error logs printed on the `stderr` output of the trigger are exported in the audit trail (system log verbosity must be set to `ALL` to review trigger audit logs). Triggers cannot receive manual input, except for [triggers with the `incidentCustomActions` and `taskCustomActions`hooks](#trigger-incidentcustomaction-and-taskcustomaction) that accept a text input.
-
-##### Examples of trigger definition files
-
-See the following examples of trigger definition files:
-* [Trigger definition file (Incident Tools)](/docs/platform-services/automation-service/automation-service-integration-framework/#trigger-definition-file-incident-tools)
-* [Trigger taskCustomAction definition file (Incident Tools)](/docs/platform-services/automation-service/automation-service-integration-framework/#trigger-taskcustomaction-definition-file-incident-tools)
-* [Trigger incidentCustomAction definition file (Incident Tools)](/docs/platform-services/automation-service/automation-service-integration-framework/#trigger-incidentcustomaction-definition-file-incident-tools)
-* [Trigger webhook definition file](/docs/platform-services/automation-service/automation-service-integration-framework/#trigger-webhook-definition-file)
-
-##### Trigger hooks
-
-Specify `hook` values in a `Trigger` type [action definition file](/docs/platform-services/automation-service/automation-service-integration-framework/#action-definition-file-format) to run the trigger action in specific situations. For example, to automatically run a trigger action when a task is closed, specify the `closeTask` hook.
-
-The following sections describe the valid hook values to use in a trigger definition file.
-
-##### Entities hooks
-
-Following are the hooks for [entities](/docs/cloud-soar/incidents-triage/#entities) (observables) events that run when objects are created:
-* `addObservableArtifact`. When an artifact entity is created.
-* `addObservableDomain`. When a domain entity is created.
-* `addObservableIp`. When an IP address entity is created.
-* `addObservableMail`. When an email entity is created.
-* `addObservableUrl`. When a URL entity is created.
-* `addObservableUserDetail`. When a user detail entity is created.
-
-##### Task hooks
-
-Following are the hooks for [task](/docs/cloud-soar/incidents-triage/#tasks) events:
-* `approveTask`. When task is approved. Param passed to script `tasksDetail`.
-* `closeTask`. When task is closed. Param passed to script `tasksDetail`.
-* `createTask`. When task is created. Param passed to script `tasksDetail`.
-* `reassignTask`. When task is reassigned. Param passed to script `tasksDetail`.
-* `taskCustomAction`. Custom trigger. Param passed to script `text`. For more information, see [Trigger incidentCustomAction and taskCustomAction](/docs/platform-services/automation-service/automation-service-integration-framework/#trigger-incidentcustomaction-and-taskcustomaction).
-* `updateTask`. When task is updated. Param is passed to scripts `tasksBeforeUpdate` and `tasksAfterUpdate`.
-
-Params `tasksDetail`, `tasksBeforeUpdate`, and `tasksAfterUpdate` are JSON strings with the form:
-```json
-{
-reminder_time:
-
-Clicking the button in the UI runs the trigger:
-
-
-
-When users interact with the custom action trigger button, they can provide a textual input that can be elaborated by the trigger. To elaborate the input of a custom action trigger, use the `text` param inside the code:
-
-```json
-integration: 'Incident Tools'
-name: 'Custom trigger button'
-type: Trigger
-show_modal: true
-script:
- code: |
- import json
- import argparse
- import requests
- import sys
- parser = argparse.ArgumentParser()
- parser.add_argument('--incidentsDetail', help='incident before update', required=False) #param inherited by
- hook defined in the yaml
- parser.add_argument('--token', help='JWT token , REQUIRED', required=True)
- parser.add_argument('--incmanurl', help='IncMan URL , REQUIRED', required=True)
- parser.add_argument('--text', help='text', required=False) #param inherited by hook defined in the yaml
- args, unknown = parser.parse_known_args()
- inc_det_after = json.loads(args.incidentsDetail)
- incidentID = inc_det_after.get('id')
- headers = {
- 'Accept': 'application/json;charset=UTF-8',
- 'Content-Type': 'application/x-www-form-urlencoded',
- 'Authorization': 'Bearer ' + args.token
- }
- end_point = '{incmanurl}/api/v2/incidents/{incidentid}'.format(incmanurl=args.incmanurl, incidentid=incidentID)
- session = requests.Session()
- session.verify = False
- additional_info = inc_det_after.get('additional_info')
- new_text = "
-
-In the following example, a trigger ingests the JSON payload posted to the webhook endpoint, and writes its content in the description widget of a specific incident (ID 1743):
-
-```json
-type: Trigger
-script:
- code: |
- import json
- import argparse
- from datetime import datetime
- import sys
- import requests
- import time
- parser = argparse.ArgumentParser()
- parser.add_argument('--payload', help='WebHook payload , REQUIRED', required=True)
- parser.add_argument('--token', help='JWT token , REQUIRED', required=True)
- parser.add_argument('--incmanurl', help='IncMan URL , REQUIRED', required=True)
- args, unknown = parser.parse_known_args()
- payload = json.dumps(args.payload)
- incidentID = 1743
- headers = {
- 'Accept': 'application/json;charset=UTF-8',
- 'Content-Type': 'application/x-www-form-urlencoded',
- 'Authorization': 'Bearer ' + args.token
- }
- end_point = '{incmanurl}/api/v2/incidents/{incidentid}'.format(incmanurl=args.incmanurl,
-incidentid=incidentID)
- session = requests.Session()
- session.verify = False
- additional_info = json.loads(payload)
- payload = {
- "additional_info": additional_info,
- }
- incident = session.put(end_point, headers=headers, data=payload, proxies=None, timeout=(5,60))
- sys.stderr.write(str(incident.content))
- try:
- incident.raise_for_status()
- except Exception as e:
- sys.stderr.write("Error updating incident Severity: ")
- sys.stderr.write(str(e))
- # sys.stderr.write(str(json.dumps(args.triage_eventsDetail)))
- exit(0)
-hook:
- - webhook
-```
-
-For another example YAML file of a webhook trigger, see [Trigger webhook definition file](/docs/platform-services/automation-service/automation-service-integration-framework/#trigger-webhook-definition-file).
-
-## Example files
-
-Following are example definition and action files for integrations. To see an explanation of the file contents, see [Integration framework file formats](#integration-framework-file-formats) above. To see integration definition and action files used in a working integration for Cloud SIEM, see [Advanced example: Configure a custom integration](/docs/cse/automation/cloud-siem-automation-examples/#advanced-example-configure-a-custom-integration).
-
-### Integration definition file (VirusTotal)
-
-```
-name: 'VirusTotal'
-version: '1.0'
-icon:data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAAA8RnWXAAAABmJLR0...[snip...]QMq1BbQK47AAAAAASUVORK5CYII=
-script:
- type: python
- test_connection_code: |
- import json
- import argparse
- import requests
- import sys
- try:
-
- class EnvDefault(argparse.Action):
- def __init__(self, required=True, default=None, **kwargs):
- envvar = kwargs.get("dest")
- default = os.environ.get(envvar, default) if envvar in os.environ else default
- required = False if required and default else required
- super(EnvDefault, self).__init__(default=default, required=required,**kwargs)
- def __call__(self, parser, namespace, values, option_string=None):
- setattr(namespace, self.dest, values)
-
- parser = argparse.ArgumentParser()
- parser.add_argument('--api_key', help='api_key , REQUIRED', required=True, action=EnvDefault)
- parser.add_argument('--proxy_url', help='proxy_url', required=False, action=EnvDefault)
- args, unknown = parser.parse_known_args()
- params = {"apikey": args.api_key, 'url': 'google.com'}
- end_point = "https://www.virustotal.com/vtapi/v2/url/scan"
- session = requests.Session()
- if args.proxy_url is not None:
- proxies = {'http': args.proxy_url, 'https': args.proxy_url}
- else:
- proxies = None
- r = session.post(end_point, data=params, proxies=proxies, timeout=(5, 60))
- r.raise_for_status()
- exit(0)
- except Exception as e:
- sys.stderr.write(str(e))
- exit(-1)
-docker_repo_tag: 'virustotal:latest'
-configuration:
- testable_connection: true
- require_proxy_config: true
- data_attributes:
- api_key:
- label: 'api key'
- type: 'password'
- required: true
-```
-
-### Action definition file (VirusTotal)
-
-```
-integration: 'VirusTotal Open Framework CS'
-name: 'scan file'
-type: Enrichment
-script:
-code: |
- import json
- import argparse
- import virustotal from os import listdir
- import subprocess
- import os from os.path import isfile, join
- try:
-
- class EnvDefault(argparse.Action):
- def __init__(self, required=True, default=None, **kwargs):
- envvar = kwargs.get("dest")
- default = os.environ.get(envvar, default) if envvar in os.environ else default
- required = False if required and default else required
- super(EnvDefault, self).__init__(default=default, required=required,**kwargs)
- def __call__(self, parser, namespace, values, option_string=None):
- setattr(namespace, self.dest, values)
- parser = argparse.ArgumentParser()
- parser.add_argument('--api_key', help='api_key , REQUIRED', required=True, action=EnvDefault)
- parser.add_argument('--filename', help='filename , REQUIRED', required=True, action=EnvDefault)
- args, unknown = parser.parse_known_args()
- v = virustotal.VirusTotal(args.api_key)
- report = v.scan(args.filename)
- report.join()
- assert report.done == True
- result = {
- 'Resource UID': report.id,
- 'Scan UID': report.scan_id,
- 'Permalink': report.permalink,
- 'Resource SHA1': report.sha1,
- 'Resource SHA256': report.sha256,
- 'Resource MD5': report.md5,
- 'Resource status': report.status,
- 'Antivirus total': report.total,
- 'Antivirus positives': report.positives,
- 'Malware': []
- }
- for antivirus, malware in report:
- if malware is not None:
- malware_obj = {'Antivirus': antivirus[0], 'Antivirus version': antivirus[1], 'Antivirus update': antivirus[2], 'Malware': malware}
- result['Malware'].append(malware_obj)
- print(json.dumps({'filepath': ['file1', 'file2'],'report': [result]}))
- exit(0)
- except Exception as e:
- sys.stderr.write(str(e))
- exit(1)
-fields:
- - id: filename
- label: 'file to scan'
- type: fileDetonate
- required: true
- incident_artifacts: true
- observables: 'file'
-output:
- - path : 'filepath.[]'
- type : text
- - path : 'report.[].Antivirus positives'
- type : integer
- - path : 'report.[].Antivirus total'
- type : text
- - path : 'report.[].Permalink'
- type : text
- - path : 'report.[].Resource SHA256'
- type : text
-table_view:
- - display_name: 'filepath'
- value : 'filepath'
- - display_name: 'SHA256'
- value : 'report.[].Resource SHA256'
- - display_name: 'MD5'
- value : 'report.[].Resource MD5'
- - display_name: 'Malware'
- value : 'report.[].Malware.[].Malware'
- - display_name: 'Antivirus'
- value : 'report.[].Malware.[].Antivirus'
-```
-
-## Example files for Cloud SOAR
-
-The following example files are for Cloud SOAR only.
-
-### Integration definition file to change integration name from VirusTotal to VirusTotalNew
-
-```
-name: 'VirusTotalNew'
-official_name: 'VirusTotal'
-version: '1.0'
-icon:data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAAA8RnWXAAAABmJLR0...[snip...]QMq1BbQK47AAAAAASUVORK5CYII=
-script:
- type: python
- test_connection_code: |
- import json
- import argparse
- import requests
- import sys
- try:
-
- class EnvDefault(argparse.Action):
- def __init__(self, required=True, default=None, **kwargs):
- envvar = kwargs.get("dest")
- default = os.environ.get(envvar, default) if envvar in os.environ else default
- required = False if required and default else required
- super(EnvDefault, self).__init__(default=default, required=required,**kwargs)
- def __call__(self, parser, namespace, values, option_string=None):
- setattr(namespace, self.dest, values)
-
- parser = argparse.ArgumentParser()
- parser.add_argument('--api_key', help='api_key , REQUIRED', required=True, action=EnvDefault)
- parser.add_argument('--proxy_url', help='proxy_url', required=False, action=EnvDefault)
- args, unknown = parser.parse_known_args()
- params = {"apikey": args.api_key, 'url': 'google.com'}
- end_point = "https://www.virustotal.com/vtapi/v2/url/scan"
- session = requests.Session()
- if args.proxy_url is not None:
- proxies = {'http': args.proxy_url, 'https': args.proxy_url}
- else:
- proxies = None
- r = session.post(end_point, data=params, proxies=proxies, timeout=(5, 60))
- r.raise_for_status()
- exit(0)
- except Exception as e:
- sys.stderr.write(str(e))
- exit(-1)
-docker_repo_tag: 'virustotal:latest'
-configuration:
- testable_connection: true
- require_proxy_config: true
- data_attributes:
- api_key:
- label: 'api key'
- type: 'password'
- required: true
-```
-
-### Daemon definition file (QRadar)
-
-```
-integration: 'IBM QRadar OIF'
-name: 'Get Offenses Daemon'
-type: Daemon
-script:
-code: |
- import argparse
- import base64
- import json
- import sys
- import requests
- import warnings
- from requests.packages.urllib3.exceptions import InsecureRequestWarning
- import traceback
- warnings.simplefilter('ignore', InsecureRequestWarning)
- class EnvDefault(argparse.Action):
- def __init__(self, required=True, default=None, **kwargs):
- envvar = kwargs.get("dest")
- default = os.environ.get(envvar, default) if envvar in os.environ else default
- required = False if required and default else required
- super(EnvDefault, self).__init__(default=default, required=required,**kwargs)
- def __call__(self, parser, namespace, values, option_string=None):
- setattr(namespace, self.dest, values)
- parser = argparse.ArgumentParser()
- parser.add_argument('--url', help='URL , REQUIRED', required=True, action=EnvDefault)
- parser.add_argument('--authMethod', help='Authentication method ,
- REQUIRED', required=True, action=EnvDefault)
- parser.add_argument('--validateSSL', help='validateSSL , REQUIRED',
- required=True, action=EnvDefault)
- parser.add_argument('--id', help='last offense id', required=False, action=EnvDefault)
- parser.add_argument('--username', help='username', required=False, action=EnvDefault)
- parser.add_argument('--password', help='password', required=False, action=EnvDefault)
- parser.add_argument('--token', help='token', required=False, action=EnvDefault)
- parser.add_argument('--proxy_url', help='proxy url',
- required=False, action=EnvDefault)
- args, unknown = parser.parse_known_args()
- max_destination_ip_to_get = 10
- host = str(args.url) + '/api/siem/offenses'
- if args.id:
- host += "?filter=id>" + args.id
- else:
- host += "?filter=id>145366"
- host += '&status!=CLOSED'
- header = {
- 'Version': '5.0',
- 'Accept': 'application/json',
- 'Content-Type': 'application/json'
- }
- if args.authMethod == "credentials":
- base64byte = base64.b64encode(bytes(args.username + ":" + args.password, 'utf-8'))
- credential = base64byte.decode("utf-8")
- header['Authorization'] = 'Basic ' + credential
- else:
- header['SEC'] = args.token
- verifySSL = args.validateSSL == "true"
- proxies = {'http': args.proxy_url, 'https': args.proxy_url} if args.proxy_url is not None else None
- try:
- s = requests.Session()
- r = s.get(url=host, headers=header, verify=verifySSL, proxies=proxies, timeout=(5, 60))
- r.raise_for_status()
- json_data = json.loads(r.text)
- new_array = []
- for event in json_data:
- ariel_search_utl = args.url + '/api/siem/source_addresses/'
- event['source_address_ip'] = []
- for source_ip_id in event['source_address_ids']:
- search_ip = ariel_search_utl + str(source_ip_id) + '?fields=source_ip'
- try:
- request_post = s.get(url=search_ip, headers=header, verify=verifySSL, proxies=proxies, timeout=(5, 60))
- request_post.raise_for_status()
- json_data_post = json.loads(request_post.text)
- event['source_address_ip'].append(json_data_post['source_ip'])
- except Exception:
- pass
- ariel_search_utl = args.url + '/api/siem/local_destination_addresses/'
- event['local_destination_ip'] = []
- for destination_ip_id in event['local_destination_address_ids'][:max_destination_ip_to_get]:
- search_ip = ariel_search_utl + str(destination_ip_id) + '?fields=local_destination_ip'
- try:
- request_post = s.get(url=search_ip, headers=header, verify=verifySSL, proxies=proxies, timeout=(5, 60))
- request_post.raise_for_status()
- json_data_post = json.loads(request_post.text)
- event['local_destination_ip'].append(json_data_post['local_destination_ip'])
- except Exception:
- pass
- new_array.append(event)
- print(json.dumps(new_array))
- exit(0)
- except Exception as e:
- sys.stderr.write(str(e))
- exit(-1)
-fields:
- - id: id
- label: "From offence id"
- type: text
-output:
- - path: '[].username_count'
- type: integer
- - path: '[].description'
- type: string
- - path: '[].event_count'
- type: integer
- - path: '[].flow_count'
- type: integer
- - path: '[].assigned_to'
- type: string
- - path: '[].security_category_count'
- type: integer
- - path: '[].follow_up'
- type: string
- - path: '[].source_address_ids.[]'
- type: integer
- - path: '[].source_count'
- type: integer
- - path: '[].inactive'
- type: string
- - path: '[].protected'
- type: string
- - path: '[].category_count'
- type: integer
- - path: '[].source_network'
- type: string
- - path: '[].destination_networks.[]'
- type: string
- - path: '[].closing_user'
- type: string
- - path: '[].close_time'
- type: datetime
- - path: '[].remote_destination_count'
- type: integer
- - path: '[].start_time'
- type: datetime
- - path: '[].last_updated_time'
- type: datetime
- - path: '[].credibility'
- type: integer
- - path: '[].magnitude'
- type: integer
- - path: '[].id'
- type: integer
- - path: '[].categories.[]'
- type: string
- - path: '[].severity'
- type: integer
- - path: '[].policy_category_count'
- type: integer
- - path: '[].device_count'
- type: integer
- - path: '[].closing_reason_id'
- type: string
- - path: '[].offense_type'
- type: integer
- - path: '[].relevance'
- type: integer
- - path: '[].domain_id'
- type: integer
- - path: '[].offense_source'
- type: string
- - path: '[].local_destination_address_ids.[]'
- type: integer
- - path: '[].local_destination_count'
- type: integer
- - path: '[].status'
- type: string
- - path: '[].source_address_ip'
- type: string
- - path: '[].local_destination_ip'
- type: string
-```
-
-### Trigger definition file (Incident Tools)
-
-```
-integration: 'Incident Tools'
-name: 'Change severity trigger'
-type: Trigger
-script:
-code: |
- import json
- import argparse
- from datetime import datetime
- import sys
- import requests
- import time
- class EnvDefault(argparse.Action):
- def __init__(self, required=True, default=None, **kwargs):
- envvar = kwargs.get("dest")
- default = os.environ.get(envvar, default) if envvar in os.environ else default
- required = False if required and default else required
- super(EnvDefault, self).__init__(default=default, required=required,**kwargs)
- def __call__(self, parser, namespace, values, option_string=None):
- setattr(namespace, self.dest, values)
- parser = argparse.ArgumentParser()
- parser.add_argument('--incidentsBeforeUpdate', help='incident before update', required=False, action=EnvDefault)
- parser.add_argument('--incidentsAfterUpdate', help='incident after update', required=False, action=EnvDefault)
- parser.add_argument('--token', help='JWT token , REQUIRED', required=True, action=EnvDefault)
- parser.add_argument('--cloudsoarurl', help='Cloud SOAR URL , REQUIRED', required=True, action=EnvDefault)
- args, unknown = parser.parse_known_args()
- inc_det_before = json.loads(args.incidentsBeforeUpdate)
- inc_det_after = json.loads(args.incidentsAfterUpdate)
- incidentID = inc_det_after.get('id')
- sys.stderr.write(str(json.dumps(inc_det_before)))
- sys.stderr.write(str(json.dumps(inc_det_after)))
- prio = inc_det_after.get('restriction')
- if inc_det_after.get('restriction') != inc_det_before.get('restriction'):
- headers = {
- 'Accept': 'application/json;charset=UTF-8',
- 'Content-Type': 'application/x-www-form-urlencoded',
- 'Authorization': 'Bearer ' + args.token
- }
- end_point = '{cloudsoarurl}/api/v2/incidents/{incidentid}'.format(cloudsoarurl=args.cloudsoarurl incidentid=incidentID)
- session = requests.Session()
- session.verify = False
- additional_info = inc_det_after.get('additional_info')
- additional_info += "
-1. Type a name for your custom image in the **Docker image tag** field. This is a required field.
-1. When you are creating a new custom Docker image, you will see the **Last update** field is showing **Never edited before**. The text area below allows you to write a Dockerfile with the instructions to build your custom image:
-1. Proceed to write your custom Dockerfile as you would normally do. If you need tips on how to do this, refer to [Useful Docker commands](#useful-docker-commands) or check the Docker official documentation. Keep in mind that the following statements are not currently available, which means they will be ignored when building the image: `COPY`, `WORKDIR`, `EXPOSE`, `ADD`, `ENTRYPOINT`, `USER`, `ARG`, and `STOPSIGNAL`.
-1. In the editor you will see there is a dropdown menu above the text area that reads **Valid Instructions**. This dropdown menu enumerates in a descriptive way a set of instructions that you can use in your Dockerfile. If you choose them from the dropdown menu, a new line will be added to your Dockerfile with the keyword to start the statement, so you can pick up from there. The use of this dropdown menu is completely optional and you can write your Dockerfile directly in the text area.
-1. As soon as you change something in your Dockerfile, a **Save** button will appear next to the Docker editor button. Click on it if you are ready to save your custom Dockerfile.
-
-Once you have saved a custom Dockerfile, the integration will be executed on a container built from the relative custom Docker image.
-
-### Testing your custom Docker image
-
-We strongly suggest that you test your custom images as soon as you create or modify them. If by any chance you save a faulty custom Dockerfile, when the actions from that integration are triggered, their execution will fail because the Docker image will fail as well.
-
-1. To test your custom images, click where it says **TEST IMAGE** at the bottom right corner of the editor.
-1. If your custom Docker image was built without error, a success message will pop up in your screen. Otherwise, if a proper image cannot be built from your custom Dockerfile, an error message will pop up, containing details on what went wrong. In that case, it is very important that you correct your Dockerfile and test it again until an image is built successfully. As an alternative, you can always revert to the original Docker image used by the integration, by clicking on Reset Default Image at the bottom of the editor.
-
-### Deleting your custom Docker image and reverting to the original one
-
-The **RESET DEFAULT IMAGE** button that appears at the bottom of the editor allows you to delete your custom Docker image and revert to the original Docker image used for that integration.
-
-As soon as you click on it, the integration will start being executed again in a container based on the original integration image. Notice that your custom Dockerfile will be lost, so you will have to write it again if you want to revert back to your custom image.
-
-### Checking whether an integration is using a custom Docker image
-
-If you have appropriate permissions, you can set up custom Docker images for any integration. There are many ways to check if any of your integrations are being executed in a container from a custom Docker image.
-
-If you go to the Docker YAML editor of the integration, if it is using a custom image, a comment above the `docker_repo_tag` will tell you so:
-
-
-
-However, the best way to check whether an integration is using a custom image is to open the Docker editor for that integration (by clicking on the button with the Docker logo on it).
-
-Integrations using a custom image will have a **Docker image tag**, a **Last update** date, and some Dockerfile content.
-
-On the other hand, integrations that are not using a custom Docker image will have an empty Docker editor showing **Never edited before** in the **Last update** field:
-
-
-
-### Useful Docker commands
-
-The following commands may be useful in performing some common actions in Docker on your server. See the official Docker documentation at https://docs.docker.com/ for additional information.
-
-#### Create Docker image (Python example)
-
-1. Create a file with name dockerfile inside a dedicated directory:
- ```
- FROM python:2-slim
- RUN pip install --trusted-host pypi.python.org requests suds
- ```
-1. Inside this directory run:
- ```
- docker build -t
-
-You will get a print HTML of aggregated elements
-
-
-
-## Pipe functions in YAML output
-
-With the same action used in [Use mult-select in output](#use-multi-select-in-output), it's possible to use two common pipe functions to process action output.
-
-Pipe function `join('separator')`:
-
-```
-output:
- - path : '[].guid | join(,)'
-```
-
-
-
-And so the next action will run one time with a string created join array element with `separator` specified:
-
-
-
-Pipe function `unique()`:
-```
-output:
- - path : '[].tags.[] | unique()'
-```
-
-
-
-The array will be populated with not duplicated element:
-
-
diff --git a/docs/platform-services/automation-service/automation-service-integrations.md b/docs/platform-services/automation-service/automation-service-integrations.md
index bb52dee0ec..feea339248 100644
--- a/docs/platform-services/automation-service/automation-service-integrations.md
+++ b/docs/platform-services/automation-service/automation-service-integrations.md
@@ -62,7 +62,7 @@ Note that in the following example a **(2)** follows the duplicated integration'
### In the Automation Service
-To create a new integration in the Automation Service, you must supply an integration definition YAML file, as well as an action definition YAML file for each of the actions contained in the integration. For an example of creating a new integration by supplying YAML files, see [Advanced example: Configure a custom integration](/docs/cse/automation/cloud-siem-automation-examples/#advanced-example-configure-a-custom-integration). For sample YAML files, see [Example files](/docs/platform-services/automation-service/automation-service-integration-framework/#example-files). To learn how to build your own YAML files, see [Integration framework file formats](/docs/platform-services/automation-service/automation-service-integration-framework/#integration-framework-file-formats).
+To create a new integration in the Automation Service, you must supply an integration definition YAML file, as well as an action definition YAML file for each of the actions contained in the integration. For an example of creating a new integration by supplying YAML files, see [Advanced example: Configure a custom integration](/docs/cse/automation/cloud-siem-automation-examples/#advanced-example-configure-a-custom-integration). For sample YAML files, see [example files](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/). To learn how to build your own YAML files, see [Integration framework file formats](/docs/platform-services/automation-service/integration-framework/about-integration-framework/#integration-framework-file-formats).
To create a new integration:
1. Create an integration definition YAML file, as well as an action definition YAML file for each action in the integration.
diff --git a/docs/platform-services/automation-service/integration-framework/about-integration-framework.md b/docs/platform-services/automation-service/integration-framework/about-integration-framework.md
new file mode 100644
index 0000000000..efe7894480
--- /dev/null
+++ b/docs/platform-services/automation-service/integration-framework/about-integration-framework.md
@@ -0,0 +1,758 @@
+---
+id: about-integration-framework
+title: About the Integration Framework
+sidebar_label: About
+description: Get an overview of how the integration framework works.
+---
+
+import useBaseUrl from '@docusaurus/useBaseUrl';
+
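+The Automation Service allows you to develop and extend integrations using a common and open framework. This article describes how the framework works.
+
+Because the Automation Service is a subset of automation capabilities adapted from Cloud SOAR, some of the functionality documented below is unique to [Cloud SOAR automation](/docs/cloud-soar/automation/).
+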
+## Integration file hierarchy
+
+Integrations are defined using two types of text files. The first type, the integration definition file, defines the properties of the product to which the integration connects. This includes information such as the name, logo, connection parameters, test code, and the Docker container used to execute the actions. One integration definition file is required for each integration and serves as a container for all the actions that the integration will perform.
+
+The second type of file is an action definition file, which defines a single action that will be performed using the integration. Each integration action is defined in a separate action definition file, which is associated with the appropriate integration definition. Action definition files contain the actual code that is executed to perform the action. Supported languages include Perl, Python, PowerShell, and Bash. In addition to the action code, action definition files also contain information such as the name, required and optional fields, and the format in which the resulting information will be displayed.
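+
+For orientation, an action definition file has roughly the following skeleton. This is only a sketch with illustrative values; the full field reference appears in [Integration framework file formats](#integration-framework-file-formats) below, and complete samples are collected in the [example files](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/).
+
+```
+integration: 'Integration Name'
+name: 'Action name'
+type: Enrichment
+script:
+  code: |
+    [...]
+fields:
+  - id: some_input
+    label: 'input shown in the UI'
+    type: text
+    required: true
+output:
+  - path: 'result.[].value'
+    type: text
+```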
+
+The following diagram shows the integration file hierarchy:
+
+
+
+Defining integrations at the action level gives users greater flexibility in customizing existing integrations and sharing new actions with other users. For example, you may choose to extend the existing RSA NetWitness integration to include an additional action that retrieves all network connections for a given host. Once you create this new action, you can easily add it to the existing RSA NetWitness integration by uploading the new integration action file.
+
+You can also share this new action and use it to extend the functionality of the integration for others. The following diagram shows action file portability:
+
+
+
+## Integration framework file formats
+
+Both the integration definition file and the action definition file are YAML files. The following sections highlight the formats for each file type. [Example files](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/) contains samples of completed integration definition and action definition files as a reference. To see YAML files used in a working integration, see an example for Cloud SIEM in [Advanced example: Configure a custom integration](/docs/cse/automation/cloud-siem-automation-examples/#advanced-example-configure-a-custom-integration).
+
+### Integration definition file format
+
+`*` Required fields
+
+* **name** `*` [String]: Name displayed in the UI. It must match the `integration` field of each action definition file added to the integration.
+* **official_name** `*` [String] (Cloud SOAR only): To modify the display name of an integration in the Cloud SOAR UI while ensuring the actions YAML remains valid, set `official_name=OLD-NAME` and `name=NEW-NAME`.
+* **version** `*` [String]: File version number.
+* **icon** `*` [Base64 String]: Integration logo.
+* **script** `*`:
+ * **type** `*` [String]: Indicates which code parser should be used to execute the code within the integration and action definition files. All action definition files for the integration must use the same code language as defined in the integration definition file. Acceptable values are:
+ * `bash`
+ * `perl`
+ * `powershell`
+ * `python`
+ * **test_connection_code** `*` [String]: Code that can be used to test the integration through the UI by clicking **Test Saved Settings**. Exiting with a value of `0` indicates success, while any other value indicates failure.
+* **docker_repo_tag** `*` [String]: Docker repository tag of the image from which the new container is built. The image can come from any local or remote repository configured on the server.
+* **local_repo** [Boolean] (Cloud SOAR only): Indicates that the Docker image is a local one and not one present in the repository.
+* **configuration** `*`:
+ * **testable_connection** `*` [Boolean]: True/false value indicating whether test code is present.
+ * **require_proxy_config** `*` [Boolean]: True/false value indicating whether a proxy configuration tab should be available in the UI for the integration. If the value is set to true and a proxy is configured in the UI, the parameter `proxy_url` will be passed to the code on execution as an environment variable.
+ * **data_attributes** `*`: Fields required for configuration.
+ * **`
+
+The **New Integration** window allows you to upload an integration definition file by clicking **Select File**. Once you select the integration definition file, click **Save** to add the new integration.
+
+
+
+To edit an existing integration by uploading a new integration definition file, click the **Edit** button. To export the integration definition file for the selected integration, click the **Export** icon.
+
+### Action definitions
+
+To add a new action, select the appropriate integration from the integrations list, then click on the **Upload** button to the right of the integration.
+
+The **New Action** window allows you to upload an action definition file by clicking **Select File**, and lets you select the kind of action.
+
+
+
+Once the action definition file has been selected, click **Save** to add the new action.
+
+Existing actions may be edited by clicking the **Upload** button below the action name to upload a new action definition file, or by clicking the **Edit** button below the action name to open a text editor and edit the action directly.
+
+
+
+To test an action, click on the **Test Action** button below the action name.
+
+
+
+Enter the required parameters and click **Test Action**.
+
+
+
+To export an action, click on the **Export** button below the action name.
+
+### Action definitions for Cloud SOAR
+
+The following action definitions are for Cloud SOAR only.
+
+#### Daemon action definitions
+
+Uploading an action YAML file with type Daemon allows you to specify a Daemon action. You can also define rules associated with the Daemon action.
+
+
+
+A Daemon action must return an array of objects in JSON format:
+
+```
+[{ 'a': 'a1', 'b': 'b1' }, { 'a': 'a2', 'b': 'b2' }]
+```
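+
+For reference, a daemon definition file follows the same overall structure as other action files. The following is only an illustrative sketch (the script body is elided, and the output paths correspond to the keys of the objects returned above):
+
+```
+integration: 'Integration Name'
+name: 'Example daemon'
+type: Daemon
+script:
+  code: |
+    [...]
+output:
+  - path: '[].a'
+    type: text
+  - path: '[].b'
+    type: text
+```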
+
+Every object is processed by a filter and an action. You can also define which output field should be passed to the next script run, as well as an extra parameter key-value pair to specialize each rule:
+
+
+
+All available actions are:
+* Create incident from template
+* Update incident
+* Close Incident
+* Change incident status
+* Add events to an existing incident
+* Change task progress
+* Close task
+* Add to Triage
+
+#### Scheduled action definitions
+
+A _Scheduled action_ is a particular type of action whose execution is iterated until a specific exit condition is met. This type of action lets you create loops in a playbook.
+
+YAML example:
+
+```
+ integration: 'Incident tools'
+ name: 'intervallo date loop'
+ type: Scheduled
+ script:
+ code: |
+ [......]
+ exit_condition:
+ - path: 'exit_condition'
+ string: 'false'
+ re-execution: 'force'
+ scheduled:
+ - every: '10s'
+ expire: '120s'
+ output:
+ - path : 'exit_condition'
+```
+
+Or using an array of strings:
+
+```
+ integration: 'Incident tools'
+ name: 'intervallo date loop'
+ type: Scheduled
+ script:
+ code: |
+ [......]
+ exit_condition:
+ - path: 'exit_condition'
+ string:
+ - 'Open'
+ - 'Pending'
+ - 'Waiting'
+ re-execution: 'force'
+ scheduled:
+ - every: '10s'
+ expire: '120s'
+ output:
+ - path : 'exit_condition'
+```
+
+Or using the action's input:
+
+:::note
+If you use an action's input, the input field must be set to `required: true`.
+:::
+
+```
+integration: 'Testing Purpose'
+name: 'testing Scheduled'
+type: Scheduled
+script:
+ code: |
+ [...]
+exit_condition:
+ - path: 'input.exit_condition_path'
+ string: "input.exit_condition_string"
+scheduled:
+ - every: 'input.scheduler_every'
+ expire: 'input.scheduler_expire'
+fields:
+ - id: scheduler_every
+ label: 'scheduler rate'
+ type: text
+ required: true
+ hint: "schedule rate i.e 1m 5m 1d (supported placeholder m=minutes, h=hours, d=days)"
+ - id: scheduler_expire
+ label: 'schedule expiration'
+ type: text
+ required: true
+ hint: "schedule expiration i.e 1m 5m 1d (supported placeholder m=minutes, h=hours, d=days)"
+ - id: exit_condition_path
+ label: 'output path'
+ type: text
+ required: true
+ hint: "output path to check"
+ - id: exit_condition_string
+ label: 'string to check'
+ type: tag
+ required: true
+ hint: "string to check"
+output:
+ - path : '[]."ip-dst_string"'
+ - path : '[].{Name: name, ID: _id, Address: address, FriendName: friends.[].name}'
+ - path : '[].tags.[] | unique()'
+ - path : '[].tags.[]'
+ - path : '[].guid | join(,)'
+ - path : '[].guid | join(SEPARATOR)'
+ - path : '[].guid'
+ - path : '[]._id'
+ - path : '[].guid'
+ - path : '[].isActive'
+ - path : '[].balance'
+ - path : '[].picture'
+ - path : '[].eyeColor'
+ - path : '[].name'
+ - path : '[].age'
+ - path : '[].gender'
+ - path : '[].company'
+ - path : '[].email'
+ - path : '[].phone'
+ - path : '[].address'
+ - path : '[].friends'
+```
+
+Field notes:
+* **re-execution**
+
+Depending on the logic that you want to implement in your triggers, specify a list of one or more [hooks](#trigger-hooks) in the trigger YAML file. Each hook represents a manual event or API endpoint that can invoke the trigger. For example, by specifying the hook `updateIncident` inside a trigger, the trigger will fire whenever any field of an incident is updated, either manually from the UI or via the API.
+
+Triggers function as individual actions executed in the backend; you cannot review their execution output in the GUI, except for triggers on entities (observables). If a trigger fails, error logs printed on the `stderr` output of the trigger are exported in the audit trail (system log verbosity must be set to `ALL` to review trigger audit logs). Triggers cannot receive manual input, except for [triggers with the `incidentCustomActions` and `taskCustomActions` hooks](#trigger-incidentcustomaction-and-taskcustomaction), which accept a text input.
+
+##### Examples of trigger definition files
+
+See the following examples of trigger definition files:
+* [Trigger definition file (Incident Tools)](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/#trigger-definition-file-incident-tools)
+* [Trigger taskCustomAction definition file (Incident Tools)](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/#trigger-taskcustomaction-definition-file-incident-tools)
+* [Trigger incidentCustomAction definition file (Incident Tools)](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/#trigger-incidentcustomaction-definition-file-incident-tools)
+* [Trigger webhook definition file](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/#trigger-webhook-definition-file)
+
+##### Trigger hooks
+
+Specify `hook` values in a `Trigger` type [action definition file](#action-definition-file-format) to run the trigger action in specific situations. For example, to automatically run a trigger action when a task is closed, specify the `closeTask` hook.
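+
+For example, a minimal trigger definition that fires when a task is closed might look like the following sketch (the integration and action names are illustrative, and the script body is elided):
+
+```
+integration: 'Incident Tools'
+name: 'Notify on task close'
+type: Trigger
+script:
+  code: |
+    [...]
+hook:
+  - closeTask
+```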
+
+The following sections describe the valid hook values to use in a trigger definition file.
+
+##### Entities hooks
+
+Following are the hooks for [entity](/docs/cloud-soar/incidents-triage/#entities) (observable) events. These hooks run when the corresponding objects are created:
+* `addObservableArtifact`. When an artifact entity is created.
+* `addObservableDomain`. When a domain entity is created.
+* `addObservableIp`. When an IP address entity is created.
+* `addObservableMail`. When an email entity is created.
+* `addObservableUrl`. When a URL entity is created.
+* `addObservableUserDetail`. When a user detail entity is created.
+
+##### Task hooks
+
+Following are the hooks for [task](/docs/cloud-soar/incidents-triage/#tasks) events:
+* `approveTask`. When a task is approved. The `tasksDetail` param is passed to the script.
+* `closeTask`. When a task is closed. The `tasksDetail` param is passed to the script.
+* `createTask`. When a task is created. The `tasksDetail` param is passed to the script.
+* `reassignTask`. When a task is reassigned. The `tasksDetail` param is passed to the script.
+* `taskCustomAction`. Custom trigger. The `text` param is passed to the script. For more information, see [Trigger taskCustomAction](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/#trigger-taskcustomaction-definition-file-incident-tools).
+* `updateTask`. When a task is updated. The `tasksBeforeUpdate` and `tasksAfterUpdate` params are passed to the script.
+
+Params `tasksDetail`, `tasksBeforeUpdate`, and `tasksAfterUpdate` are JSON strings with the form:
+```json
+{
+reminder_time:
+
+Clicking the button in the UI runs the trigger:
+
+
+
+When users interact with the custom action trigger button, they can provide a text input that the trigger can process. To process the input of a custom action trigger, use the `text` param inside the code:
+
+```
+integration: 'Incident Tools'
+name: 'Custom trigger button'
+type: Trigger
+show_modal: true
+script:
+ code: |
+ import json
+ import argparse
+ import requests
+ import sys
+ parser = argparse.ArgumentParser()
+ parser.add_argument('--incidentsDetail', help='incident before update', required=False) #param inherited by
+ hook defined in the yaml
+ parser.add_argument('--token', help='JWT token , REQUIRED', required=True)
+ parser.add_argument('--incmanurl', help='IncMan URL , REQUIRED', required=True)
+ parser.add_argument('--text', help='text', required=False) #param inherited by hook defined in the yaml
+ args, unknown = parser.parse_known_args()
+ inc_det_after = json.loads(args.incidentsDetail)
+ incidentID = inc_det_after.get('id')
+ headers = {
+ 'Accept': 'application/json;charset=UTF-8',
+ 'Content-Type': 'application/x-www-form-urlencoded',
+ 'Authorization': 'Bearer ' + args.token
+ }
+ end_point = '{incmanurl}/api/v2/incidents/{incidentid}'.format(incmanurl=args.incmanurl, incidentid=incidentID)
+ session = requests.Session()
+ session.verify = False
+ additional_info = inc_det_after.get('additional_info')
+ new_text = "
+
+In the following example, a trigger ingests the JSON payload posted to the webhook endpoint, and writes its content in the description widget of a specific incident (ID 1743):
+
+```yaml
+type: Trigger
+script:
+  code: |
+    import json
+    import argparse
+    from datetime import datetime
+    import sys
+    import requests
+    import time
+    parser = argparse.ArgumentParser()
+    parser.add_argument('--payload', help='WebHook payload , REQUIRED', required=True)
+    parser.add_argument('--token', help='JWT token , REQUIRED', required=True)
+    parser.add_argument('--incmanurl', help='IncMan URL , REQUIRED', required=True)
+    args, unknown = parser.parse_known_args()
+    payload = json.dumps(args.payload)
+    incidentID = 1743
+    headers = {
+        'Accept': 'application/json;charset=UTF-8',
+        'Content-Type': 'application/x-www-form-urlencoded',
+        'Authorization': 'Bearer ' + args.token
+    }
+    end_point = '{incmanurl}/api/v2/incidents/{incidentid}'.format(incmanurl=args.incmanurl, incidentid=incidentID)
+    session = requests.Session()
+    session.verify = False
+    additional_info = json.loads(payload)
+    payload = {
+        "additional_info": additional_info,
+    }
+    incident = session.put(end_point, headers=headers, data=payload, proxies=None, timeout=(5, 60))
+    sys.stderr.write(str(incident.content))
+    try:
+        incident.raise_for_status()
+    except Exception as e:
+        sys.stderr.write("Error updating incident Severity: ")
+        sys.stderr.write(str(e))
+        # sys.stderr.write(str(json.dumps(args.triage_eventsDetail)))
+    exit(0)
+hook:
+  - webhook
+```
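+
+For example, an external system could post a JSON payload to the trigger's webhook endpoint as in the following sketch. The URL is a placeholder for the webhook endpoint configured in your environment; the trigger above receives the body in its `--payload` argument and writes it to incident 1743:
+
+```python
+import requests
+
+# Placeholder URL; use the webhook endpoint exposed for this trigger in your environment.
+webhook_url = 'https://<your-cloud-soar-host>/<webhook-endpoint>'
+
+# Example body; the trigger ingests whatever JSON is posted here.
+payload = {"summary": "Example event", "source": "external-system"}
+
+response = requests.post(webhook_url, json=payload, timeout=30)
+response.raise_for_status()
+```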
+
+For another example YAML file of a webhook trigger, see [Trigger webhook definition file](/docs/platform-services/automation-service/integration-framework/example-files-integration-framework/#trigger-webhook-definition-file).
\ No newline at end of file
diff --git a/docs/platform-services/automation-service/integration-framework/docker-integration-framework.md b/docs/platform-services/automation-service/integration-framework/docker-integration-framework.md
new file mode 100644
index 0000000000..7384b04cb0
--- /dev/null
+++ b/docs/platform-services/automation-service/integration-framework/docker-integration-framework.md
@@ -0,0 +1,102 @@
+---
+id: docker-integration-framework
+title: Use Docker with the Integration Framework for Cloud SOAR
+sidebar_label: Docker for Cloud SOAR
+description: Learn how to use Docker with the integration framework for Cloud SOAR.
+---
+
+import useBaseUrl from '@docusaurus/useBaseUrl';
+
+For Cloud SOAR, you can use the integration framework to execute all the actions of an integration in a container built from a custom Docker image. This is particularly useful, for example, if you want to improve actions by taking advantage of third-party libraries: you can install those libraries in the Docker container where the actions are executed, making them available to the interpreter of the action scripts. A custom Docker image also gives you many other ways to customize your integrations and actions.
+
+## Custom Docker image
+
+### Steps to create a custom Docker image
+
+1. Go to the **Integrations** page.
+1. Look for the integration for which you need to create a custom Docker image and click it.
+1. Next to the name of the integration, you will see two buttons. Click the one on the far right with the Docker logo.
+1. Type a name for your custom image in the **Docker image tag** field. This is a required field.
+1. When you are creating a new custom Docker image, the **Last update** field shows **Never edited before**. The text area below it allows you to write a Dockerfile with the instructions to build your custom image:
+1. Write your custom Dockerfile as you normally would. If you need tips on how to do this, refer to [Useful Docker commands](#useful-docker-commands) or see the official Docker documentation. Keep in mind that the following statements are not currently available, which means they will be ignored when building the image: `COPY`, `WORKDIR`, `EXPOSE`, `ADD`, `ENTRYPOINT`, `USER`, `ARG`, and `STOPSIGNAL`.
+1. Above the text area in the editor is a dropdown menu labeled **Valid Instructions**, which lists, with descriptions, the instructions you can use in your Dockerfile. If you choose an instruction from the menu, a new line is added to your Dockerfile with the keyword that starts the statement, so you can pick up from there. Using this dropdown menu is optional; you can also write your Dockerfile directly in the text area.
+1. As soon as you change something in your Dockerfile, a **Save** button appears next to the Docker editor button. Click it when you are ready to save your custom Dockerfile.
+
+Once you have saved a custom Dockerfile, the integration's actions will be executed in a container built from the corresponding custom Docker image.
+
+### Testing your custom Docker image
+
+We strongly suggest that you test your custom images as soon as you create or modify them. If you save a faulty custom Dockerfile, the actions from that integration will fail when they are triggered, because a working Docker image cannot be built.
+
+1. To test your custom image, click **TEST IMAGE** at the bottom right corner of the editor.
+1. If your custom Docker image was built without errors, a success message will pop up on your screen. If a proper image cannot be built from your custom Dockerfile, an error message will pop up with details on what went wrong. In that case, correct your Dockerfile and test it again until an image is built successfully. Alternatively, you can revert to the original Docker image used by the integration by clicking **RESET DEFAULT IMAGE** at the bottom of the editor.
+
+### Deleting your custom Docker image and reverting to the original one
+
+The **RESET DEFAULT IMAGE** button that appears at the bottom of the editor allows you to delete your custom Docker image and revert to the original Docker image used for that integration.
+
+As soon as you click it, the integration will again be executed in a container based on the original integration image. Note that your custom Dockerfile will be lost, so you will have to write it again if you want to switch back to a custom image later.
+
+### Checking whether an integration is using a custom Docker image
+
+If you have appropriate permissions, you can set up custom Docker images for any integration. There are several ways to check whether an integration is being executed in a container built from a custom Docker image.
+
+If you go to the Docker YAML editor of the integration and it is using a custom image, a comment above the `docker_repo_tag` field will tell you so:
+
+
+
+However, the best way to check whether an integration is using a custom image is to open the Docker editor for that integration (by clicking on the button with the Docker logo on it).
+
+Integrations using a custom image will have a **Docker image tag**, a **Last update** date, and some Dockerfile content.
+
+On the other hand, integrations that are not using a custom Docker image will have an empty Docker editor showing **Never edited before** in the **Last update** field:
+
+
+
+## Useful Docker commands
+
+The following commands may be useful for performing common Docker actions on your server. See the official Docker documentation at https://docs.docker.com/ for additional information.
+
+### Create Docker image (Python example)
+
+1. Create a file named `dockerfile` inside a dedicated directory:
+   ```
+   FROM python:2-slim
+   RUN pip install --trusted-host pypi.python.org requests suds
+   ```
+1. Inside this directory run:
+   ```
+   docker build -t <image_name> .
+   ```
+
+
+The JSON view will display the entire output of the integration action in JSON format:
+
+
+
+Following is the setting for a link type:
+
+```
+- display_name: 'CVSS'
+  value: 'cvss'
+  type: 'link'
+```
+
+
+
+### Add an output type for an action
+
+It's possible to specify a JSON path, or to use `rawOutput` to specify text output, to use as the `srcDoc` for an iframe sandbox (it is not possible to use JavaScript):
+
+```
+integration: 'Incident tools'
+name: 'intervallo date 3ededed'
+type: Enrichment
+script:
+  code: |
+    [....]
+    art = '''
+    [....]
+```
+
+
+
+
+
+
+The `image_base64_png(jpg)` field provides the result path from which to get the base64-encoded PNG or JPG image, for example:
+
+
+
+
+
+## Use multi-select in output
+
+Suppose an action returns a result like the following:
+
+```
+ [
+ {
+ "_id": "5fda1d0faa3f39c44361b84e",
+ "index": 0,
+ "days": days,
+ "guid": "900c39df-837f-4394-a463-f0dffdb5420e",
+ "isActive": False,
+ "balance": "$2,434.45",
+ "picture": "http://placehold.it/32x32",
+ "age": 37,
+ "eyeColor": "brown",
+ "name": "Lindsey Mcknight",
+ "gender": "male",
+ "company": "PORTALIS",
+ "email": "lindseymcknight@portalis.com",
+ "phone": "+1 (868) 490-3497",
+ "address": "566 Bainbridge Street, Waterloo, Nebraska, 1714",
+ "about": "Sunt quis culpa enim eiusmod ullamco tempor enim. Culpa nisi nostrud quis nisi commodo mollit mollit irure. Duis sunt reprehenderit duis labore dolor dolor ullamco Lorem eiusmod. Nulla nulla excepteur ipsum dolor qui reprehenderit laborum elit esse nulla do incididunt. Ea qui tempor sunt veniam magna do ea laborum qui ut. Veniam veniam ut consequat duis. Commodo incididunt duis culpa mollit eu.\r\n",
+ "registered": "2015-12-30T12:58:10 -01:00",
+ "latitude": -78.618655,
+ "longitude": -148.652818,
+ "tags": [
+ "et",
+ "do",
+ "ut",
+ "excepteur",
+ "dolore",
+ "cillum",
+ "laborum"
+ ],
+ "friends": [
+ {
+ "id": 0,
+ "name": "Herman Sharp"
+ },
+ {
+ "id": 1,
+ "name": "Foreman Berger"
+ },
+ {
+ "id": 2,
+ "name": "Loretta Blair"
+ }
+ ],
+ "greeting": "Hello, Lindsey Mcknight! You have 3 unread messages.",
+ "favoriteFruit": "apple"
+ },
+ {
+ "_id": "5fda1d0fa888a79dbbf27e40",
+ "index": 1,
+ "guid": "246533ba-31ca-4cd0-b307-e3c8ff450ff1",
+ "isActive": True,
+ "balance": "$3,469.55",
+ "picture": "http://placehold.it/32x32",
+ "age": 21,
+ "eyeColor": "brown",
+ "name": "Johnston Merritt",
+ "gender": "male",
+ "company": "TERRAGEN",
+ "email": "johnstonmerritt@terragen.com",
+ "phone": "+1 (960) 583-2954",
+ "address": "405 Oxford Walk, Sunnyside, Alaska, 4197",
+ "about": "Voluptate cillum deserunt veniam ullamco in culpa ad amet ut ea. Sit et reprehenderit deserunt reprehenderit consequat anim elit pariatur sint irure proident. Non sint velit mollit irure amet aute in ad. In amet magna consectetur esse dolor Lorem est proident.\r\n",
+ "registered": "2016-10-27T03:29:02 -02:00",
+ "latitude": 73.096704,
+ "longitude": 98.965585,
+ "tags": [
+ "consequat",
+ "adipisicing",
+ "esse",
+ "ad",
+ "laborum",
+ "pariatur",
+ "sunt"
+ ],
+ "friends": [
+ {
+ "id": 0,
+ "name": "Viola Bailey"
+ },
+ {
+ "id": 1,
+ "name": "Jodi Richardson"
+ },
+ {
+ "id": 2,
+ "name": "Strong Patel"
+ }
+ ],
+ "greeting": "Hello, Johnston Merritt! You have 6 unread messages.",
+ "favoriteFruit": "apple"
+ }
+ ]
+```
+With a result like the one shown above, you can add the following to the OIF YAML `output` section:
+
+```
+output:
+  - path: '[].{Name: name, ID: _id, Address: address}'
+```
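+
+To illustrate what this projection produces, here is a plain-Python equivalent applied to the sample result above (an illustration only, not framework code):
+
+```python
+# The two records below are abbreviated from the sample action result shown above.
+result = [
+    {"_id": "5fda1d0faa3f39c44361b84e", "name": "Lindsey Mcknight",
+     "address": "566 Bainbridge Street, Waterloo, Nebraska, 1714"},
+    {"_id": "5fda1d0fa888a79dbbf27e40", "name": "Johnston Merritt",
+     "address": "405 Oxford Walk, Sunnyside, Alaska, 4197"},
+]
+
+# Equivalent of the path '[].{Name: name, ID: _id, Address: address}'.
+projected = [
+    {"Name": item["name"], "ID": item["_id"], "Address": item["address"]}
+    for item in result
+]
+print(projected)
+```
+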
+If you then use that output as the placeholder in a `textarea`:
+
+
+
+You will get an HTML printout of the aggregated elements:
+
+
+
+## Pipe functions in YAML output
+
+With the same action used in [Use multi-select in output](#use-multi-select-in-output), it's possible to use two common pipe functions to process action output.
+
+Pipe function `join('separator')`:
+
+```
+output:
+  - path: '[].guid | join(,)'
+```
+
+
+
+The next action will then run once, with a single string created by joining the array elements with the specified separator:
+
+
+
+Pipe function `unique()`:
+
+```
+output:
+  - path: '[].tags.[] | unique()'
+```
+
+
+
+The array will be populated with de-duplicated elements:
+
+
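+
+To illustrate what the two pipe functions do, here is a plain-Python equivalent applied to the same sample result (an illustration only, not framework code):
+
+```python
+# The records below are abbreviated from the sample action result used above.
+result = [
+    {"guid": "900c39df-837f-4394-a463-f0dffdb5420e",
+     "tags": ["et", "do", "ut", "excepteur", "dolore", "cillum", "laborum"]},
+    {"guid": "246533ba-31ca-4cd0-b307-e3c8ff450ff1",
+     "tags": ["consequat", "adipisicing", "esse", "ad", "laborum", "pariatur", "sunt"]},
+]
+
+# Equivalent of '[].guid | join(,)': a single string built from the guid values.
+joined_guids = ",".join(item["guid"] for item in result)
+
+# Equivalent of '[].tags.[] | unique()': a flattened array with duplicates removed.
+all_tags = [tag for item in result for tag in item["tags"]]
+unique_tags = list(dict.fromkeys(all_tags))  # preserves first-seen order
+
+print(joined_guids)
+print(unique_tags)
+```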
diff --git a/sidebars.ts b/sidebars.ts
index a7f3317a3e..a45890c415 100644
--- a/sidebars.ts
+++ b/sidebars.ts
@@ -3184,7 +3184,19 @@ integrations: [
'platform-services/automation-service/automation-service-integrations',
'platform-services/automation-service/automation-service-audit-logging',
'platform-services/automation-service/automation-service-bridge',
- 'platform-services/automation-service/automation-service-integration-framework',
+ {
+ type: 'category',
+ label: 'Integration Framework',
+ collapsible: true,
+ collapsed: true,
+ link: {type: 'doc', id: 'platform-services/automation-service/integration-framework/index'},
+ items: [
+ 'platform-services/automation-service/integration-framework/about-integration-framework',
+ 'platform-services/automation-service/integration-framework/example-files-integration-framework',
+ 'platform-services/automation-service/integration-framework/integration-framework-output',
+ 'platform-services/automation-service/integration-framework/docker-integration-framework',
+ ],
+ },
],
},
],