diff --git a/explore-analyze/images/workflows-alerting-rule-action.png b/explore-analyze/images/workflows-alerting-rule-action.png
new file mode 100644
index 0000000000..25b8ec0939
Binary files /dev/null and b/explore-analyze/images/workflows-alerting-rule-action.png differ
diff --git a/explore-analyze/images/workflows-detection-rule-action.png b/explore-analyze/images/workflows-detection-rule-action.png
new file mode 100644
index 0000000000..ae25328b1c
Binary files /dev/null and b/explore-analyze/images/workflows-detection-rule-action.png differ
diff --git a/explore-analyze/images/workflows-editor.png b/explore-analyze/images/workflows-editor.png
new file mode 100644
index 0000000000..91ca8a4012
Binary files /dev/null and b/explore-analyze/images/workflows-editor.png differ
diff --git a/explore-analyze/images/workflows-page.png b/explore-analyze/images/workflows-page.png
new file mode 100644
index 0000000000..c30c8ca874
Binary files /dev/null and b/explore-analyze/images/workflows-page.png differ
diff --git a/explore-analyze/toc.yml b/explore-analyze/toc.yml
index 076c24b741..73d3394a55 100644
--- a/explore-analyze/toc.yml
+++ b/explore-analyze/toc.yml
@@ -402,4 +402,35 @@ toc:
- file: alerts-cases/cases/manage-cases.md
- file: alerts-cases/cases/manage-cases-settings.md
- file: alerts-cases/cases/cases-as-data.md
- - file: numeral-formatting.md
+ - file: workflows.md
+ children:
+ - file: workflows/setup.md
+ - file: workflows/get-started.md
+ - file: workflows/core-components.md
+ children:
+ - file: workflows/triggers.md
+ children:
+ - file: workflows/triggers/manual-triggers.md
+ - file: workflows/triggers/scheduled-triggers.md
+ - file: workflows/triggers/alert-triggers.md
+ - file: workflows/steps.md
+ children:
+ - file: workflows/steps/action-steps.md
+ children:
+ - file: workflows/steps/elasticsearch.md
+ - file: workflows/steps/kibana.md
+ - file: workflows/steps/external-systems-apps.md
+ - file: workflows/steps/flow-control-steps.md
+ children:
+ - file: workflows/steps/if.md
+ - file: workflows/steps/foreach.md
+ - file: workflows/steps/wait.md
+ - file: workflows/data.md
+ children:
+ - file: workflows/data/templating.md
+ - file: workflows/author-workflows.md
+ - file: workflows/monitor-troubleshoot.md
+ - file: workflows/manage-workflows.md
+ - hidden: workflows/use-cases.md
+ - file: workflows/templates.md
+ - file: numeral-formatting.md
\ No newline at end of file
diff --git a/explore-analyze/workflows.md b/explore-analyze/workflows.md
new file mode 100644
index 0000000000..cb19a8484a
--- /dev/null
+++ b/explore-analyze/workflows.md
@@ -0,0 +1,107 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about Elastic workflows.
+---
+
+# Workflows [workflows-overview]
+
+A workflow is a defined sequence of steps designed to achieve a specific outcome through automation. It is a reusable, versionable "recipe" that transforms inputs into actions.
+
+## Why use workflows [workflows-why]
+
+Insight into your data isn't enough. The ultimate value lies in action and outcomes. Workflows complete the journey from data to insights to automated outcomes. Your critical operational data already lives in the Elastic cluster: security events, infrastructure metrics, application logs, and business context. Workflows let you automate end-to-end processes to achieve outcomes directly where that data lives, without needing external automation tools.
+
+Workflows address common operational challenges, such as:
+
+* **Alert fatigue**: Automate responses to reduce manual triage.
+* **Understaffing**: Enable teams to do more with fewer resources.
+* **Manual, repetitive work**: Automate routine tasks consistently.
+* **Tool fragmentation**: Eliminate the need for separate external automation tools.
+
+Workflows can handle a wide range of tasks, from simple, repeatable steps to complex processes.
+
+## Who should use workflows [workflows-who]
+
+Workflows are for you if you want to cut down on manual effort, speed up response times, and make sure recurring situations are handled consistently.
+
+## Key concepts [workflows-concepts]
+
+Some key concepts to understand while working with workflows:
+
+* **Triggers**: The events or conditions that initiate a workflow. Refer to [](/explore-analyze/workflows/triggers.md) to learn more.
+* **Steps**: The individual units of logic or action that make up a workflow. Refer to [](/explore-analyze/workflows/steps.md) to learn more.
+* **Data**: How data flows through your workflow, including inputs, constants, context variables, step outputs, and Liquid templating for dynamic values. Refer to [](/explore-analyze/workflows/data.md) to learn more.
+
+## Workflow structure [workflow-structure]
+
+Workflows are defined in YAML. In the YAML editor, describe _what_ the workflow should do, and the platform handles execution.
+
+```yaml
+# ═══════════════════════════════════════════════════════════════
+# METADATA - Identifies and describes the workflow
+# ═══════════════════════════════════════════════════════════════
+name: My Workflow # Required: Unique identifier
+description: What this workflow does # Optional: Shown in UI
+enabled: true # Optional: Enable or disable execution
+tags: ["demo", "production"] # Optional: For organizing workflows
+
+# ═══════════════════════════════════════════════════════════════
+# CONSTANTS - Reusable values defined once, used throughout
+# ═══════════════════════════════════════════════════════════════
+consts:
+ indexName: "my-index"
+ environment: "production"
+ alertThreshold: 100
+ endpoints: # Can be objects/arrays
+ api: "https://api.example.com"
+ backup: "https://backup.example.com"
+
+# ═══════════════════════════════════════════════════════════════
+# INPUTS - Parameters passed when the workflow is triggered
+# ═══════════════════════════════════════════════════════════════
+inputs:
+ - name: environment
+ type: string
+ required: true
+ default: "staging"
+ description: "Target environment"
+ - name: dryRun
+ type: boolean
+ default: true
+
+# ═══════════════════════════════════════════════════════════════
+# TRIGGERS - How/when the workflow starts
+# ═══════════════════════════════════════════════════════════════
+triggers:
+ - type: manual # User clicks Run button
+ # - type: scheduled # Runs on a schedule
+ # with:
+  #     every: 1d
+ # - type: alert # Triggered by an alert
+
+# ═══════════════════════════════════════════════════════════════
+# STEPS - The actual workflow logic (executed in order)
+# ═══════════════════════════════════════════════════════════════
+steps:
+ - name: step_one
+ type: elasticsearch.search
+ with:
+ index: "{{consts.indexName}}" # Reference constants
+ query:
+ match_all: {}
+
+ - name: step_two
+ type: console
+ with:
+ message: |
+ Environment: {{inputs.environment}} # Reference inputs
+ Found: {{steps.step_one.output.hits.total.value}} # Reference step output
+
+```
+
+## Learn more
+
+- To create and run your first workflow, refer to [](/explore-analyze/workflows/get-started.md).
+- To learn how to use the YAML editor in {{kib}} to define and run your workflows, refer to [](/explore-analyze/workflows/author-workflows.md).
diff --git a/explore-analyze/workflows/author-workflows.md b/explore-analyze/workflows/author-workflows.md
new file mode 100644
index 0000000000..6a07b6f2b7
--- /dev/null
+++ b/explore-analyze/workflows/author-workflows.md
@@ -0,0 +1,35 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Reference guide for the workflow YAML editor interface.
+---
+
+# Author workflows [workflows-yaml-editor]
+
+The YAML editor is the primary interface for creating and editing workflows. This page describes the editor's components and features.
+
+::::{admonition} Requirements
+To use workflows, you must turn on the feature and ensure your role has the appropriate privileges. Refer to [](setup.md) for more information.
+
+You must also have the appropriate subscription. Refer to the subscription page for [Elastic Cloud](https://www.elastic.co/subscriptions/cloud) and [Elastic Stack/self-managed](https://www.elastic.co/subscriptions) for the breakdown of available features and their associated subscription tiers.
+::::
+
+
+:::{image} /explore-analyze/images/workflows-editor.png
+:alt: A view of the Workflows editor
+:screenshot:
+:::
+
+## Editor layout [workflows-editor-layout]
+
+The editor layout is composed of the following elements:
+
+| Component | Description |
+|-----------|-------------|
+| **Editor pane** | The main area for writing and editing workflows. To learn more about the expected workflow structure, refer to [](/explore-analyze/workflows.md). |
+| **Actions menu** | A quick-add menu for pre-formatted [triggers](triggers.md) and [step types](steps.md). |
+| **Save button** | Saves the current workflow. |
+| **Run button** | Manually runs the entire workflow or an individual step.<br>- Entire workflow: Click the **Run** icon {icon}`play` (next to **Save**).<br>- Individual step: Select the step in the editor pane, then click the **Run** icon {icon}`play`. |
+| **Executions tab** | Shows [execution history](monitor-troubleshoot.md) and real-time logs. |
+| **Validation logs** | Shows validation successes and failures. Some common validation errors include:<br>- Invalid YAML syntax because of incorrect indentation or formatting<br>- Missing a required field or property (for example, `name`, `type`)<br>- The step type is unknown or doesn't match a valid action<br>- Invalid template syntax because of a malformed template expression |
\ No newline at end of file
diff --git a/explore-analyze/workflows/core-components.md b/explore-analyze/workflows/core-components.md
new file mode 100644
index 0000000000..20cddf4a63
--- /dev/null
+++ b/explore-analyze/workflows/core-components.md
@@ -0,0 +1,34 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about the core components that make up Elastic workflows.
+---
+
+# Core components
+
+Workflows are composed of three core elements that make workflow automation possible: triggers, steps, and connectors. Together, these components define when workflows run, what they do, and what external systems they connect to.
+
+## Triggers
+
+Triggers define _when_ a workflow runs. A trigger is an event or condition that initiates a workflow, such as an alert firing or a scheduled time being reached. Every workflow begins with a trigger.
+
+Examples of triggers include:
+
+* A user runs a workflow manually
+* A specific time or interval is reached
+* A detection alert is generated
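+
+For example, a workflow that can be run on demand and also runs once a day might declare its triggers as follows. This is a minimal sketch based on the trigger types listed above:
+
+```yaml
+triggers:
+  - type: manual        # run on demand from the UI
+  - type: scheduled     # run on a fixed interval
+    with:
+      every: 1d
+```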
+
+For more information, refer to [](/explore-analyze/workflows/triggers.md).
+
+## Steps
+
+Steps define _what_ a workflow does. A step is an individual unit of logic or action within a workflow. Steps control how data moves, how decisions are made, and what results are produced. Workflows can contain one or more steps, executed in sequence.
+
+For more information, refer to [](/explore-analyze/workflows/steps.md).
+
+## {{connectors-ui}}
+
+{{connectors-ui}} define _where_ workflows can reach. A connector is the interface between {{kib}} and an external system, allowing workflows to act on or respond to events and services outside of {{kib}}.
+
+For more information, refer to [](/explore-analyze/workflows/steps/external-systems-apps.md#connector-based-actions).
diff --git a/explore-analyze/workflows/data.md b/explore-analyze/workflows/data.md
new file mode 100644
index 0000000000..83e7b6cb90
--- /dev/null
+++ b/explore-analyze/workflows/data.md
@@ -0,0 +1,199 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn how data flows through workflows, use dynamic templating, and handle errors gracefully.
+---
+
+# Data and error handling [workflows-data]
+
+A key feature of workflows is the ability to pass data between steps and handle failures gracefully. This page explains the mechanisms for controlling data flow and building resilient, fault-tolerant automations.
+
+## Data flow [workflows-data-flow]
+
+Every step in a workflow produces an output. By default, this output is added to a global `steps` object in the workflow's context, making it available to all subsequent steps.
+
+### Access step outputs [workflows-access-outputs]
+
+Use the following syntax to access the output of a specific step:
+
+```text
+steps.<step_name>.output
+```
+
+You can also access error information from a step:
+
+```text
+steps.<step_name>.error
+```
+
+### Example: Chain steps to move output data [workflows-chain-steps-example]
+
+This example demonstrates a common pattern: searching for data in one step and using the results in a later step. In this case, the workflow searches for a specific user's full name, then uses it to create a new security case.
+
+```yaml
+name: Create case for a specific user
+steps:
+ - name: find_user_by_id
+ type: elasticsearch.search
+ with:
+ index: "my-user-index"
+ query:
+ term:
+ user.id: "u-123"
+
+ - name: create_case_for_user
+ type: kibana.createCaseDefaultSpace
+ with:
+ title: "Investigate user u-123"
+ description: "A case has been opened for user {{steps.find_user_by_id.output.hits.hits[0]._source.user.fullName}}."
+ tags: ["user-investigation"]
+ connector:
+ id: "none"
+ name: "none"
+ type: ".none"
+```
+
+In this example:
+
+1. The `find_user_by_id` step searches an index for a document.
+2. The `create_case_for_user` step uses the output of the first step to enrich a new [{{elastic-sec}} case](../../solutions/security/investigate/cases.md).
+3. The `description` field accesses `steps.find_user_by_id.output.hits.hits[0]._source.user.fullName` to dynamically include the user's full name in the case description.
+
+## Error handling [workflows-error-handling]
+
+By default, if any step in a workflow fails, the entire workflow execution stops immediately. You can override this behavior using the `on-failure` block, which supports retry logic, fallback steps, and continuation options.
+
+### Configuration levels [workflows-on-failure-levels]
+
+You can configure `on-failure` at two levels:
+
+**Step-level** applies to a specific step:
+
+```yaml
+steps:
+ - name: api-call
+ type: http
+ on-failure:
+ retry:
+ max-attempts: 3
+ delay: "5s"
+```
+
+**Workflow-level** (configured under `settings`) applies to all steps as the default error-handling behavior:
+
+```yaml
+settings:
+ on-failure:
+ retry:
+ max-attempts: 2
+ delay: "1s"
+steps:
+ - name: api-call
+ type: http
+```
+
+:::{note}
+Step-level `on-failure` configuration always overrides workflow-level settings.
+:::
+
+### Retry [workflows-on-failure-retry]
+
+Retries the failed step a configurable number of times, with an optional delay between attempts.
+
+```yaml
+on-failure:
+ retry:
+ max-attempts: 3 # Required, minimum 1 (for example, "1", "2", "5")
+ delay: "5s" # Optional, duration format (for example, "5s", "1m", "2h")
+```
+
+If all retries are exhausted and no other `on-failure` handling is configured, the step fails and the workflow execution stops.
+
+### Fallback [workflows-on-failure-fallback]
+
+Executes alternative steps after the primary step fails and all retries are exhausted. In the following example, when the `delete_critical_document` step fails, the workflow executes two fallback steps: one sends a Slack notification through the `devops-alerts` connector, referencing the workflow name with `{{workflow.name}}`; the other logs the error details from the failed step using `{{steps.delete_critical_document.error}}`.
+
+```yaml
+on-failure:
+ fallback:
+ - name: notify_on_failure
+ type: slack
+ connector-id: "devops-alerts"
+ with:
+ message: "Failed to delete document in workflow '{{workflow.name}}'"
+ - name: log_failure
+ type: console
+ with:
+ message: "Document deletion failed, error: {{steps.delete_critical_document.error}}"
+```
+
+Within fallback steps, access error information from the failed primary step using `steps.<step_name>.error`.
+
+### Continue [workflows-on-failure-continue]
+
+Continues workflow execution even if a step fails. The failure is recorded, but does not interrupt the workflow.
+
+```yaml
+on-failure:
+ continue: true
+```
+
+### Combining options [workflows-on-failure-combining]
+
+You can combine multiple failure-handling options. They are processed in this order: retry → fallback → continue.
+
+In the following example:
+1. The step retries up to 2 times with a 1-second delay.
+2. If all retries fail, the fallback steps execute.
+3. The workflow continues regardless of the outcome.
+
+```yaml
+- name: create_ticket
+ type: jira
+ connector-id: "my-jira-project"
+ with:
+ projectKey: "PROJ"
+ summary: "New issue from workflow"
+ on-failure:
+ retry:
+ max-attempts: 2
+ delay: "1s"
+ fallback:
+ - name: notify_jira_failure
+ type: slack
+ connector-id: "devops-alerts"
+ with:
+ message: "Warning: Failed to create ticket. Continuing workflow."
+ continue: true
+```
+
+### Restrictions [workflows-on-failure-restrictions]
+
+- Flow-control steps (`if`, `foreach`) cannot have workflow-level `on-failure` configurations.
+- Fallback steps execute only after all retries have been exhausted.
+- When combined, failure-handling options are processed in this order: retry → fallback → continue.
+
+## Dynamic values with templating [workflows-dynamic-values]
+
+To inject dynamic values into your workflow steps, use the templating engine, which is based on the [Liquid templating language](https://liquidjs.com/) and allows you to:
+
+- **Reference step outputs**: Access data from previous steps using `steps.<step_name>.output`.
+- **Use constants**: Reference workflow-level constants with `consts.<constant_name>`.
+- **Apply filters**: Transform values with filters like `upcase`, `downcase`, and `date`.
+- **Add conditional logic**: Use `if`/`else` statements for dynamic content.
+- **Loop through data**: Iterate over arrays with `for` loops.
+
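+For example, a single `console` step can combine several of these capabilities. The following is a brief sketch; the `check_health` step and the `environment` constant are assumed to be defined elsewhere in the workflow:
+
+```yaml
+steps:
+  - name: report_status
+    type: console
+    with:
+      message: |
+        Environment: {{ consts.environment | upcase }}
+        {% if steps.check_health.output.hits.total.value > 0 %}
+        Issues found: {{ steps.check_health.output.hits.total.value }}
+        {% else %}
+        All clear.
+        {% endif %}
+```
+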
+For complete syntax details and examples, refer to [Templating engine](./data/templating.md).
+
+## Quick reference [workflows-data-quick-reference]
+
+By combining data flow, templating, and robust error handling, you can build complex, reliable automations that react to dynamic conditions and recover from unexpected failures.
+
+| Action | Syntax | Description |
+|---------|--------|-------------|
+| Step output | `steps.<step_name>.output` | Access the result of a previous step. |
+| Step error | `steps.<step_name>.error` | Access error details from a failed step. |
+| Retry on failure | `on-failure.retry` | Retry a failed step with optional delay. |
+| Fallback steps | `on-failure.fallback` | Define recovery actions when a step fails. |
+| Continue on failure | `on-failure.continue: true` | Allow the workflow to proceed after a failure. |
\ No newline at end of file
diff --git a/explore-analyze/workflows/data/templating.md b/explore-analyze/workflows/data/templating.md
new file mode 100644
index 0000000000..ba5e3778ac
--- /dev/null
+++ b/explore-analyze/workflows/data/templating.md
@@ -0,0 +1,297 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn how to use the Liquid templating engine to create dynamic workflows.
+---
+
+# Templating engine [workflows-templating]
+
+The workflow templating engine enables dynamic, type-safe template rendering using the [Liquid templating language](https://liquidjs.com/). It allows you to inject variables, apply transformations, and control data flow throughout your workflows.
+
+## Syntax overview [workflows-template-syntax]
+
+The templating engine supports several syntax patterns for different use cases:
+
+| Syntax | Purpose | Example |
+|--------|---------|---------|
+| Double curly braces | Insert values as strings | `"Hello, {{name}}"` |
+| Dollar-sign prefix | Preserve data types (arrays, objects, numbers) | `${{myArray}}` |
+| Percent tags | Control flow (conditionals, loops) | `{%if active%}...{%endif%}` |
+| Raw tags | Output literal curly braces | `{%raw%}{{}}{%endraw%}` |
+
+### String interpolation [workflows-string-interpolation]
+
+Use double curly braces for basic string interpolation. Variables and expressions inside the braces are evaluated and rendered as strings.
+
+```yaml
+message: "Hello {{user.name}}!" # Result: "Hello Alice"
+url: "https://api.example.com/users/{{user.id}}" # Result: "https://api.example.com/users/12"
+```
+
+### Type-preserving expressions [workflows-type-preserving]
+
+Use the dollar-sign prefix (`${{ }}`) when you need to preserve the original data type (array, object, number, boolean).
+
+```yaml
+# String syntax - converts to string
+tags: "{{inputs.tags}}" # Result: "[\"admin\", \"user\"]" (string)
+
+# Type-preserving syntax - keeps original type
+tags: "${{inputs.tags}}" # Result: ["admin", "user"] (actual array)
+```
+
+:::{important}
+The type-preserving syntax must occupy the entire string value. You cannot mix it with other text.
+
+✅ **Valid:**
+
+```yaml
+tags: "${{inputs.tags}}"
+```
+
+❌ **Invalid:**
+
+```yaml
+message: "Tags are: ${{inputs.tags}}"
+```
+:::
+
+| Feature | String syntax | Type-preserving syntax |
+|---------|---------------|------------------------|
+| Output type | Always string | Preserves original type |
+| Arrays | Stringified | Actual array |
+| Objects | Stringified | Actual object |
+| Booleans | `"true"` / `"false"` | `true` / `false` |
+| Numbers | `"123"` | `123` |
+
+### Control flow [workflows-control-flow]
+
+Liquid tags are control flow constructs that use the `{% %}` syntax. Unlike output expressions, tags execute logic without directly rendering a value.
+
+**Conditionals:**
+
+```yaml
+message: |
+ {% if user.role == 'admin' %}
+ Welcome, administrator!
+ {% else %}
+ Welcome, user!
+ {% endif %}
+```
+
+**Loops:**
+
+```yaml
+message: |
+ {% for item in items %}
+ - {{item.name}}
+ {% endfor %}
+```
+
+### Escaping template syntax [workflows-escaping]
+
+Use raw tags to output literal curly brace characters without rendering them:
+
+```yaml
+value: "{%raw%}{{_ingest.timestamp}}{%endraw%}" # Result: "{{_ingest.timestamp}}"
+```
+
+## Working with data [workflows-working-with-data]
+
+This section covers common patterns for accessing and transforming data in your workflows.
+
+### Reference inputs [workflows-ref-inputs]
+
+Reference input parameters defined in the workflow using `{{inputs.<input_name>}}`. Inputs are defined at the workflow level and can be provided when the workflow is triggered manually.
+
+```yaml
+inputs:
+ - name: environment
+ type: string
+ required: true
+ default: "staging"
+ - name: batchSize
+ type: number
+ default: 100
+
+triggers:
+ - type: manual
+
+steps:
+ - name: log_config
+ type: console
+ with:
+ message: |
+ Running with:
+ - Environment: {{inputs.environment}}
+ - Batch Size: {{inputs.batchSize}}
+```
+
+### Reference outputs [workflows-ref-step-outputs]
+
+Access output data from previous steps using `{{steps.<step_name>.output}}`:
+
+```yaml
+steps:
+ - name: search_users
+ type: elasticsearch.search
+ with:
+ index: "users"
+ query:
+ term:
+ status: "active"
+
+ - name: send_notification
+ type: slack
+ connector-id: "my-slack"
+ with:
+ message: "Found {{steps.search_users.output.hits.total.value}} active users"
+```
+
+### Reference constants [workflows-ref-constants]
+
+Reference workflow-level constants using `{{consts.<constant_name>}}`. Constants are defined at the workflow level and can be referenced anywhere in the workflow.
+
+```yaml
+consts:
+ indexName: "my-index"
+ environment: "production"
+
+steps:
+ - name: search_data
+ type: elasticsearch.search
+ with:
+ index: "{{consts.indexName}}"
+ query:
+ match:
+ env: "{{consts.environment}}"
+```
+
+### Apply filters [workflows-apply-filters]
+
+Transform values using filters with the pipe `|` character:
+
+```yaml
+message: |
+ User: {{user.name | upcase}}
+ Email: {{user.email | downcase}}
+ Created: {{user.created_at | date: "%Y-%m-%d"}}
+```
+
+### Preserve array and object types [workflows-preserve-types]
+
+When passing arrays or objects between steps, use the type-preserving syntax (`${{ }}`) to avoid stringification:
+
+```yaml
+steps:
+ - name: get_tags
+ type: elasticsearch.search
+ with:
+ index: "config"
+ query:
+ term:
+ type: "tags"
+
+ - name: create_document
+ type: elasticsearch.index
+ with:
+ index: "reports"
+ document:
+ # Preserves the array type, doesn't stringify it
+ tags: "${{steps.get_tags.output.hits.hits[0]._source.tags}}"
+```
+
+:::{important}
+The type-preserving syntax must occupy the entire string value. You cannot mix it with other text.
+
+✅ **Valid:**
+
+```yaml
+tags: "${{inputs.tags}}"
+```
+
+❌ **Invalid:**
+
+```yaml
+message: "Tags are: ${{inputs.tags}}"
+```
+:::
+
+### Use conditionals for dynamic content [workflows-conditionals-example]
+
+Add logic to customize output based on data:
+
+```yaml
+steps:
+ - name: send_message
+ type: slack
+ connector-id: "alerts"
+ with:
+ message: |
+ {% if steps.search.output.hits.total.value > 100 %}
+ ⚠️ HIGH ALERT: {{steps.search.output.hits.total.value}} events detected!
+ {% else %}
+ ✅ Normal: {{steps.search.output.hits.total.value}} events detected.
+ {% endif %}
+```
+
+### Loop through results [workflows-loops-example]
+
+Iterate over arrays to process multiple items:
+
+```yaml
+steps:
+ - name: summarize_results
+ type: console
+ with:
+ message: |
+ Found users:
+ {% for hit in steps.search_users.output.hits.hits %}
+ - {{hit._source.name}} ({{hit._source.email}})
+ {% endfor %}
+```
+
+## Template rendering behavior [workflows-template-rendering]
+
+The engine renders templates recursively through all data structures, processing nested objects and arrays.
+
+**Input:**
+
+```yaml
+message: "Hello {{user.name}}"
+config:
+ url: "{{api.url}}"
+tags: ["{{tag1}}", "{{tag2}}"]
+```
+
+**Rendered output:**
+
+```yaml
+message: "Hello Alice"
+config:
+ url: "https://api.example.com"
+tags: ["admin", "user"]
+```
+
+### Type handling [workflows-type-handling]
+
+| Type | Behavior |
+|------|----------|
+| Strings | Processed as templates: variables are interpolated, and filters are applied |
+| Numbers, Booleans, Null | Returned as-is |
+| Arrays | Each element is processed recursively |
+| Objects | Each property value is processed recursively (keys are not processed) |
+
+### Null and undefined handling [workflows-null-handling]
+
+| Case | Behavior |
+|------|----------|
+| Null values | Returned as-is |
+| Undefined variables | Returned as empty string in string syntax and as `undefined` in type-preserving syntax |
+| Missing context properties | Treated as undefined |
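+
+For example, if an optional input named `note` (hypothetical) is never provided, the two syntaxes behave differently:
+
+```yaml
+message: "Note: {{inputs.note}}"   # renders as "Note: " (empty string)
+note: "${{inputs.note}}"           # resolves to undefined
+```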
+
+## Learn more
+
+- [Liquid templating language](https://shopify.github.io/liquid/)
+- [LiquidJS documentation](https://liquidjs.com/)
\ No newline at end of file
diff --git a/explore-analyze/workflows/get-started.md b/explore-analyze/workflows/get-started.md
new file mode 100644
index 0000000000..1e976e7f62
--- /dev/null
+++ b/explore-analyze/workflows/get-started.md
@@ -0,0 +1,327 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn how to get started creating Elastic workflows.
+---
+
+# Get started with workflows [workflows-get-started]
+
+In this tutorial, you'll create a workflow that indexes and searches through national parks data. Along the way, you’ll learn the core concepts and capabilities of workflows.
+
+## Prerequisites [workflows-prerequisites]
+
+- To use workflows, turn on the Elastic Workflows (`workflows:ui:enabled`) [advanced setting](kibana://reference/advanced-settings.md#kibana-general-settings).
+- You must have the appropriate subscription. Refer to the subscription page for [Elastic Cloud](https://www.elastic.co/subscriptions/cloud) and [Elastic Stack/self-managed](https://www.elastic.co/subscriptions) for the breakdown of available features and their associated subscription tiers.
+- Access to workflows is controlled by [{{kib}} privileges](/deploy-manage/users-roles/cluster-or-deployment-auth/kibana-privileges.md). Ensure your role has `All` privileges for **Analytics > Workflows**, which allows you to create, edit, run, and manage workflows.
+
+## Tutorial [workflows-tutorial]
+
+:::::{stepper}
+
+::::{step} Go to Workflows
+
+To access the **Workflows** page, find **Workflows** in the navigation menu or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
+
+::::
+
+::::{step} Create a new workflow
+
+Click **Create a new workflow**. The YAML editor opens.
+
+
+::::
+
+::::{step} Define your workflow
+
+Remove the placeholder content and copy and paste the following YAML into the editor:
+
+```yaml
+name: 🏔️ National Parks Demo
+description: Creates an Elasticsearch index, loads sample national park data using bulk operations, searches for parks by category, and displays the results.
+enabled: true
+tags: ["demo", "getting-started"]
+consts:
+ indexName: national-parks
+triggers:
+ - type: manual
+steps:
+ - name: get_index
+ type: elasticsearch.indices.exists
+ with:
+ index: "{{ consts.indexName }}"
+ - name: check_if_index_exists
+ type: if
+ condition: 'steps.get_index.output : true'
+ steps:
+ - name: index_already_exists
+ type: console
+ with:
+ message: "index: {{ consts.indexName }} already exists. Will proceed to delete it and re-create"
+ - name: delete_index
+ type: elasticsearch.indices.delete
+ with:
+ index: "{{ consts.indexName }}"
+ else:
+ - name: no_index_found
+ type: console
+ with:
+ message: "index: {{ consts.indexName }} Not found. Will proceed to create"
+
+ - name: create_parks_index
+ type: elasticsearch.indices.create
+ with:
+ index: "{{ consts.indexName }}"
+ mappings:
+ properties:
+ name: { type: text }
+ category: { type: keyword }
+ description: { type: text }
+ - name: bulk_index_park_data
+ type: elasticsearch.bulk
+ with:
+ index: "{{ consts.indexName }}"
+ operations:
+ - name: "Yellowstone National Park"
+ category: "geothermal"
+ description: "America's first national park, established in 1872, famous for Old Faithful geyser and diverse wildlife including grizzly bears, wolves, and herds of bison and elk."
+
+ - name: "Grand Canyon National Park"
+ category: "canyon"
+ description: "Home to the immense Grand Canyon, a mile deep gorge carved by the Colorado River, revealing millions of years of geological history in its colorful rock layers."
+
+ - name: "Yosemite National Park"
+ category: "mountain"
+ description: "Known for its granite cliffs, waterfalls, clear streams, giant sequoia groves, and biological diversity. El Capitan and Half Dome are iconic rock formations."
+
+ - name: "Zion National Park"
+ category: "canyon"
+ description: "Utah's first national park featuring cream, pink, and red sandstone cliffs soaring into a blue sky. Famous for the Narrows wade through the Virgin River."
+
+ - name: "Rocky Mountain National Park"
+ category: "mountain"
+ description: "Features mountain environments, from wooded forests to mountain tundra, with over 150 riparian lakes and diverse wildlife at various elevations."
+ - name: search_park_data
+ type: elasticsearch.search
+ with:
+ index: "{{ consts.indexName }}"
+ size: 5
+ query:
+ term:
+ category: "canyon"
+ - name: log_results
+ type: console
+ with:
+ message: |-
+ Found {{ steps.search_park_data.output.hits.total.value }} parks in category "canyon".
+ - name: loop_over_results
+ type: foreach
+ foreach: "{{steps.search_park_data.output.hits.hits | json}}"
+ steps:
+ - name: process-item
+ type: console
+ with:
+ message: "{{foreach.item._source.name}}"
+```
+
+::::
+
+::::{step} Save your workflow
+
+Click **Save**. Your workflow is now ready to run.
+
+::::
+
+::::{step} Run your workflow
+
+Click the **Run** icon {icon}`play` (next to **Save**) to execute your workflow.
+
+::::
+
+::::{step} Monitor execution
+
+As your workflow runs, execution logs display in a panel next to your workflow. In the panel, you can find:
+
+* **Real-time execution logs**: Each step appears as it executes.
+* **Workflow status indicators**: Green for success, red for failures, and timestamps for duration.
+* **Expandable step details**: Click any step to see input, output, and timeline.
+
+::::
+
+::::{step} View execution history
+
+To examine past executions:
+
+1. Click the **Executions** tab.
+2. View a list of all workflow runs (including pending and in progress runs), along with their status and completion time.
+3. Click any execution to see its detailed logs.
+
+
+
+::::
+
+:::::
+
+## Understand what happened
+
+Let's examine each part of the workflow to understand how it works.
+
+:::::{stepper}
+
+::::{step} Workflow metadata
+
+```yaml
+name: 🏔️ National Parks Demo
+description: Creates an Elasticsearch index, loads sample national park data using bulk operations, searches for parks by category, and displays the results.
+enabled: true
+tags: ["demo", "getting-started"]
+```
+
+* **`name`**: A unique identifier for your workflow.
+* **`description`**: Explains the workflow's purpose.
+* **`enabled`**: Controls whether the workflow can be run.
+* **`tags`**: Labels for organizing and finding workflows.
+
+::::
+
+::::{step} Constants
+
+```yaml
+consts:
+  indexName: national-parks
+```
+
+* **`consts`**: Defines reusable values that can be referenced throughout the workflow.
+* Accessed using template syntax: `{{ consts.indexName }}`. This promotes consistency and makes the workflow easier to maintain.
+
+::::
+
+::::{step} Triggers
+
+```yaml
+triggers:
+ - type: manual
+```
+
+* **`triggers`**: Defines how the workflow starts.
+* **`type`**: Specifies the trigger type. Manual triggers require explicit user action (clicking the **Run** icon {icon}`play`) to start a workflow.
+
+::::
+
+::::{step} Create index
+
+```yaml
+- name: create_parks_index
+ type: elasticsearch.indices.create
+ with:
+ index: "{{ consts.indexName }}"
+ settings:
+ number_of_shards: 1
+ number_of_replicas: 0
+ mappings:
+ properties:
+ name: { type: text }
+ category: { type: keyword }
+ description: { type: text }
+```
+
+* **Step type**: This is an action step that directly interacts with {{es}}.
+* **Step purpose**: Establishes the data structure for the park information, ensuring fields are properly typed for searching and aggregation.
+* **Key elements**:
+ * Uses `elasticsearch.indices.create`, which is a built-in action that maps to the {{es}} Create Index API.
+ * Defines mappings to control how data is indexed (`text` for full-text search, `keyword` for exact matching).
+ * References the constant `indexName` for consistency.
+ * Sets index settings for optimal performance in this demo.
+
+::::
+
+::::{step} Bulk index documents
+
+```yaml
+- name: bulk_index_park_data
+ type: elasticsearch.bulk
+ with:
+ index: "{{ consts.indexName }}"
+ operations:
+ - name: "Yellowstone National Park"
+ category: "geothermal"
+ description: "America's first national park, established in 1872..."
+ - name: "Grand Canyon National Park"
+ category: "canyon"
+ description: "Home to the immense Grand Canyon..."
+ # ... additional parks
+```
+
+* **Step type**: Another internal action step using {{es}}'s bulk API.
+* **Step purpose**: Efficiently loads multiple documents in a single operation, populating the index with sample data.
+* **Key elements**:
+ * The `operations` array contains the documents to index.
+ * Each document becomes a searchable record in {{es}}.
+ * Uses the field names defined in the mappings (`name`, `category`, `description`).
+ * Each document becomes a searchable record with consistent field structure.
+ * This step demonstrates how to handle batch operations in workflows.
+
+::::
+
+::::{step} Search parks
+
+```yaml
+- name: search_park_data
+ type: elasticsearch.search
+ with:
+ index: "{{ consts.indexName }}"
+ size: 5
+ query:
+ term:
+ category: "canyon"
+```
+
+* **Step type**: Internal action step for querying {{es}}.
+* **Step purpose**: Retrieves specific data based on criteria, demonstrating how workflows can make decisions based on data.
+* **Key elements**:
+ * Searches for parks with category `"canyon"` (will find Grand Canyon and Zion).
+ * Results from `steps.search_park_data.output` are automatically available to subsequent steps.
+ * Limits results to 5 documents for manageable output.
+ * Shows how workflows can filter and process data dynamically.
+
+::::
+
+::::{step} Log results
+
+```yaml
+- name: log_results
+ type: console
+ with:
+ message: |-
+ Found {{ steps.search_park_data.output.hits.total.value }} parks in category "canyon".
+ Top results: {{ steps.search_park_data.output.hits.hits | json(2) }}
+```
+
+* **Step type**: A console step for output and debugging.
+* **Step purpose**: Presents the results in a human-readable format, demonstrating how to access and format data from previous steps.
+* **Key elements**:
+ * Template variables access the search results: `{{ steps.search_park_data.output }}`.
+ * The `| json(2)` filter formats JSON output with indentation.
+ * Uses the exact step name `search_park_data` to reference previous step output.
+ * Shows how data flows through the workflow and can be transformed.
+
+::::
+
+:::::
+
+## Key concepts demonstrated
+
+This workflow introduces several fundamental concepts:
+
+* **Action steps**: Built-in steps that interact with {{es}} and {{kib}} APIs.
+* **Data flow**: How information moves from step to step using outputs and template variables.
+* **Constants**: Reusable values that make workflows maintainable.
+* **Template syntax**: The `{{ }}` notation for dynamic values.
+* **Step chaining**: How each step builds on previous ones to create a complete process.
+
+## What's next
+
+Learn more about the workflow framework:
+* [**Triggers**](./triggers.md): Control when workflows run.
+* [**Steps**](./steps.md): Define how a workflow operates and the outcomes it can produce.
+* [**Data and error handling**](./data.md): Make the workflow resilient to failures and understand mechanisms for controlling data flow.
diff --git a/explore-analyze/workflows/manage-workflows.md b/explore-analyze/workflows/manage-workflows.md
new file mode 100644
index 0000000000..054dbc7e21
--- /dev/null
+++ b/explore-analyze/workflows/manage-workflows.md
@@ -0,0 +1,48 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn how to view, organize, and manage your workflows.
+---
+
+# Manage workflows [workflows-manage]
+
+The **Workflows** page allows you to view and manage all your workflows. From this page, you can create, edit, duplicate, and delete workflows, among other actions. To find the **Workflows** page, use the navigation menu or the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
+
+::::{admonition} Requirements
+To use workflows, you must turn on the feature and ensure your role has the appropriate privileges. Refer to [](setup.md) for more information.
+
+You must also have the appropriate subscription. Refer to the subscription page for [Elastic Cloud](https://www.elastic.co/subscriptions/cloud) and [Elastic Stack/self-managed](https://www.elastic.co/subscriptions) for the breakdown of available features and their associated subscription tiers.
+::::
+
+:::{image} /explore-analyze/images/workflows-page.png
+:alt: A view of the Workflows page
+:screenshot:
+:::
+
+## Available actions [workflow-available-actions]
+
+From the **Workflows** page, you can create new workflows, search and filter existing ones, manually trigger workflows, and more.
+
+### Create a workflow [workflow-create]
+
+Click **Create a new workflow** to open the YAML editor. Refer to [](/explore-analyze/workflows/author-workflows.md) to learn how to use the editor.
+
+### Search and filter [workflow-search-filter]
+
+Use the search bar to filter workflows by name, description, or tag. You can also use the **Enabled** filter to only show workflows that are turned on (enabled) or off (disabled), and the **Created By** filter to only show workflows created by the specified user.
+
+### Run a workflow [workflow-run]
+
+To run a workflow immediately, click the **Run** icon {icon}`play` for that workflow, or open the **All actions** menu ({icon}`boxes_vertical`) and click **Run**. The workflow runs right away, regardless of its configured triggers. To learn about monitoring workflow runs, refer to [](/explore-analyze/workflows/monitor-troubleshoot.md).
+
+### Edit a workflow [workflow-edit]
+
+Click the **Edit** icon to open the workflow in the YAML editor. Alternatively, open the **All actions** menu ({icon}`boxes_vertical`), and click **Edit**.
+
+### Turn a workflow on or off [workflow-enable-disable]
+
+Use the **Enabled** toggle to control whether a workflow can run:
+
+- **Enabled**: The workflow responds to its configured triggers.
+- **Disabled**: The workflow won't run, even if it's triggered.
\ No newline at end of file
diff --git a/explore-analyze/workflows/monitor-troubleshoot.md b/explore-analyze/workflows/monitor-troubleshoot.md
new file mode 100644
index 0000000000..12c12edee7
--- /dev/null
+++ b/explore-analyze/workflows/monitor-troubleshoot.md
@@ -0,0 +1,41 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn how to monitor Elastic workflows executions and troubleshoot errors.
+---
+
+# Monitor and troubleshoot workflows [workflows-monitor-troubleshoot]
+
+After you run a workflow, you can track its progress in real time, review past executions, and diagnose any failures. This page explains how to use the execution panel and the logs on the **Executions** tab to understand what happened during a workflow run.
+
+::::{admonition} Requirements
+To use workflows, you must turn on the feature and ensure your role has the appropriate privileges. Refer to [](setup.md) for more information.
+
+You must also have the appropriate subscription. Refer to the subscription page for [Elastic Cloud](https://www.elastic.co/subscriptions/cloud) and [Elastic Stack/self-managed](https://www.elastic.co/subscriptions) for the breakdown of available features and their associated subscription tiers.
+::::
+
+## Monitor execution [workflows-monitor-execution]
+
+When a workflow runs, the execution panel displays:
+
+- **Real-time logs**: Each step appears as it executes.
+- **Status indicators**: Green indicates success and red represents failure.
+- **Timestamps**: The duration of each step.
+- **Expandable details**: Click any step to examine details such as input parameters, output data, and execution timelines.
+
+## View execution history [workflows-execution-history]
+
+To review past runs, select the **Executions** tab, then click each run to see detailed logs. Workflow runs can be `Pending`, `In progress`, `Completed`, or `Failed`.
+
+## Troubleshoot errors [workflows-troubleshoot-errors]
+
+When a workflow fails, open the failed execution from the **Executions** tab, then find the step with the error indicator. Expand the step to view the error message and learn more about the root cause, such as the input that caused the failure. After fixing an error, save the workflow before running it again.
+
+Common issues that can cause failures:
+
+| Issue | Cause | Solution |
+|-------|-------|----------|
+| Syntax error | Invalid YAML | Check indentation and formatting. |
+| Step failed | Action error | Review step configuration and inputs. |
+| Missing variable | Undefined reference | Verify variable names and data flow. |
\ No newline at end of file
diff --git a/explore-analyze/workflows/setup.md b/explore-analyze/workflows/setup.md
new file mode 100644
index 0000000000..5998711eee
--- /dev/null
+++ b/explore-analyze/workflows/setup.md
@@ -0,0 +1,36 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn how to enable workflows and configure role-based access.
+---
+
+# Set up workflows [workflows-setup]
+
+To use workflows, you must turn on the feature and ensure your role has the appropriate privileges. You must also have the appropriate subscription. Refer to the subscription page for [Elastic Cloud](https://www.elastic.co/subscriptions/cloud) and [Elastic Stack/self-managed](https://www.elastic.co/subscriptions) for the breakdown of available features and their associated subscription tiers.
+
+## Enable workflows [workflows-enable]
+
+The workflows feature is turned off by default. To turn it on:
+
+1. Go to the **Advanced Settings** management page using the navigation menu or the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
+2. Search for `workflows:ui:enabled`.
+3. Toggle the setting on.
+4. Click **Save changes** to turn on workflows in your space, then reload the page.
+
+The **Workflows** page then appears in the main navigation menu, and you can find it using the global search field.
+
+## Manage access to workflows [workflows-role-access]
+
+Access to workflows is controlled by [{{kib}} privileges](/deploy-manage/users-roles/cluster-or-deployment-auth/kibana-privileges.md). The following table describes privileges required to create, edit, run, and manage workflows.
+
+| Action | Required privilege |
+|--------|-------------------|
+| Access the **Workflows** page | `All` or `Read` for **Analytics > Workflows** |
+| Fully manage workflows | `All` for **Analytics > Workflows** |
+| Grant access to specific workflow actions | Set sub-feature privileges for **Analytics > Workflows** |
+
+## What's next [workflows-what-next]
+
+- Create and run your first workflow. Refer to [](get-started.md) to learn more.
+- Understand how to use the YAML editor in {{kib}} to define and run workflows. Refer to [](author-workflows.md) to learn more.
diff --git a/explore-analyze/workflows/steps.md b/explore-analyze/workflows/steps.md
new file mode 100644
index 0000000000..8ef4e4e3da
--- /dev/null
+++ b/explore-analyze/workflows/steps.md
@@ -0,0 +1,59 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about workflow steps, the building blocks that define how workflows operate and produce outcomes.
+---
+
+# Steps
+
+Workflow steps are the fundamental building blocks of automation. Each step represents a single unit of logic, action, transformation, or reasoning. Together, they define how a workflow operates and what outcomes it can produce. Steps are chained together to move data, coordinate logic, and drive results.
+
+Workflow steps are grouped into the following categories based on their function within the automation.
+
+## Action steps
+
+Action steps carry out operations in internal or external systems to produce real-world outcomes. For example, they can:
+
+* Interact with Elastic features across solutions, including common operations like:
+ * Querying data from {{es}} or data streams
+ * Indexing new documents or updating existing fields
+ * Closing or updating cases
+ * Enriching alerts with additional context
+ * Modifying dashboards or saved objects
+* Trigger actions in external systems using APIs, integrations, or service connectors
+* Send messages, alerts, or notifications to systems such as Slack or email
+* Invoke other workflows
+
+These actions are available as pre-built operations, so you don't need to configure API endpoints or manage authentication details. You select the action you want to perform and provide the required parameters.
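+
+For example, an action step that queries {{es}} only needs a step name, the action type, and its parameters in the `with` block. This is a minimal sketch; the index pattern and query are illustrative:
+
+```yaml
+steps:
+  - name: find_recent_errors
+    type: elasticsearch.search
+    with:
+      index: "logs-*"          # illustrative index pattern
+      query:
+        match:
+          log.level: "error"
+```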
+
+Refer to [](/explore-analyze/workflows/steps/action-steps.md) for more information.
+
+
+## Flow control steps
+
+Flow control steps define how a workflow runs. They control the order, structure, and branching logic of execution. This includes:
+
+* **Conditional logic**: Execute certain steps only when conditions are met
+* **Pauses and waits**: Introduce delays or time-based holds
+* **Early exits**: Skip or halt execution when needed
+
+These steps make workflows dynamic and responsive, allowing them to adapt in real time to data and conditions.
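+
+For example, an `if` step can branch a workflow based on a previous step's output. The following brief sketch mirrors the pattern used in the getting-started tutorial:
+
+```yaml
+steps:
+  - name: check_index
+    type: elasticsearch.indices.exists
+    with:
+      index: "my-index"
+  - name: branch_on_result
+    type: if
+    condition: 'steps.check_index.output : true'
+    steps:
+      - name: index_exists
+        type: console
+        with:
+          message: "Index my-index already exists"
+    else:
+      - name: index_missing
+        type: console
+        with:
+          message: "Index my-index was not found"
+```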
+
+Refer to [](/explore-analyze/workflows/steps/flow-control-steps.md) for more information.
+
+## AI steps
+
+AI steps introduce reasoning and language understanding into workflows. Use AI steps to process natural language, make context-aware decisions, or operate through agents:
+
+* Summarize or interpret information using a large language model
+* Extract key insights from unstructured data
+* Send prompts to an AI connector using the `ai.prompt` step
+* Call a built-in or custom Elastic AI agent using the `ai.agent` step
+* Integrate with LLM providers such as OpenAI and Gemini
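+
+For example, an AI step that sends a prompt to a configured AI connector might look like the following sketch. This is illustrative only: the `connector-id` and `prompt` parameters, and the referenced `get_alert` step, are assumptions rather than documented syntax.
+
+```yaml
+steps:
+  - name: summarize_alert
+    type: ai.prompt
+    connector-id: "my-openai-connector"   # assumed connector name
+    with:
+      prompt: "Summarize this alert in two sentences: {{ steps.get_alert.output }}"
+```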
+
+### {{agent-builder}} integration
+
+In addition to calling Elastic AI agents from within workflows, agents built with {{agent-builder}} can also trigger workflows. To enable this, create a custom workflow tool type and assign it to an agent. The agent can then trigger the workflow from a conversation.
+
+Refer to [](/explore-analyze/ai-features/agent-builder/tools/workflow-tools.md) and [](/explore-analyze/ai-features/agent-builder/agents-and-workflows.md) for more information.
diff --git a/explore-analyze/workflows/steps/action-steps.md b/explore-analyze/workflows/steps/action-steps.md
new file mode 100644
index 0000000000..3fe594381b
--- /dev/null
+++ b/explore-analyze/workflows/steps/action-steps.md
@@ -0,0 +1,44 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about action steps that perform tasks in your workflows.
+---
+
+# Action steps
+
+Action steps are the building blocks that perform tasks in your workflows. They are the operations that do the work, such as searching data, calling an API, sending a notification, or interacting with external systems.
+
+Action steps are organized into the following categories.
+
+## {{es}}
+
+{{es}} actions provide native integration with {{es}} APIs. These actions are automatically authenticated and offer a simplified interface for common operations. Use {{es}} actions to:
+
+* Search and query data
+* Index new documents
+* Update or delete existing documents
+* Manage indices and data streams
+
+Refer to [](/explore-analyze/workflows/steps/elasticsearch.md) for more information.
+
+## {{kib}}
+
+{{kib}} actions provide native integration with {{kib}} APIs. Like {{es}} actions, they are automatically authenticated and simplify common operations. Use {{kib}} actions to:
+
+* Create or update cases
+* Manage alerts
+* Interact with saved objects and other {{kib}} features
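+
+For example, a {{kib}} action step can open a case directly from a workflow. This brief sketch uses the `kibana.createCaseDefaultSpace` action with the same parameters shown in [Data and error handling](/explore-analyze/workflows/data.md):
+
+```yaml
+steps:
+  - name: open_case
+    type: kibana.createCaseDefaultSpace
+    with:
+      title: "Review suspicious activity"
+      description: "Opened automatically by workflow '{{ workflow.name }}'"
+      tags: ["automation"]
+      connector:
+        id: "none"
+        name: "none"
+        type: ".none"
+```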
+
+Refer to [](/explore-analyze/workflows/steps/kibana.md) for more information.
+
+## External systems and apps
+
+External actions allow your workflows to communicate with third-party systems using connectors. Use external actions to:
+
+* Send notifications to Slack or email
+* Create incidents in ServiceNow
+* Create issues in Jira
+* Call any external API using HTTP requests
+
+Refer to [](/explore-analyze/workflows/steps/external-systems-apps.md) for more information.
diff --git a/explore-analyze/workflows/steps/elasticsearch.md b/explore-analyze/workflows/steps/elasticsearch.md
new file mode 100644
index 0000000000..4ca20e1514
--- /dev/null
+++ b/explore-analyze/workflows/steps/elasticsearch.md
@@ -0,0 +1,167 @@
+---
+navigation_title: Elasticsearch
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about Elasticsearch action steps for searching, indexing, and managing data in workflows.
+---
+
+# {{es}} action steps
+
+{{es}} actions are built-in steps that allow your workflows to interact directly with {{es}} APIs. You can search, index, update, and delete documents, manage indices, and perform any other operation supported by the {{es}} REST API.
+
+All {{es}} actions are automatically authenticated using the permissions or API key of the user executing the workflow.
+
+There are two ways to use {{es}} actions:
+
+* [Named actions](#named-actions): Structured actions that map directly to specific {{es}} API endpoints
+* [Generic request actions](#generic-request-actions): Actions that provide full control over the HTTP request for advanced use cases
+
+## Named actions
+
+Named actions provide a structured way to call specific {{es}} endpoints. The action type maps directly to the {{es}} API.
+
+To view the available named actions, click **Actions menu** and select **{{es}}**. For operations that are not available as a named action, use the [generic request action](#generic-request-actions).
+
+The following table shows some examples:
+
+| Action type | {{es}} operation |
+|-------------|--------------|
+| `elasticsearch.search` | `POST /<index>/_search` ([Run a search]({{es-apis}}operation/operation-search)) |
+| `elasticsearch.delete` | `DELETE /<index>/_doc/<id>` ([Delete a document]({{es-apis}}operation/operation-delete)) |
+| `elasticsearch.indices.create` | `PUT /<index>` ([Create an index]({{es-apis}}operation/operation-indices-create)) |
+
+The parameters you provide in the `with` block are passed as the body or query parameters of the API request. The following examples demonstrate common use cases.
+
+### Example: Search for documents
+
+The `elasticsearch.search` action searches for documents in the specified index. The `query` parameter is passed directly to the [Run a search API]({{es-apis}}operation/operation-search).
+
+```yaml
+steps:
+ - name: search_for_alerts
+ type: elasticsearch.search
+ with:
+ index: ".alerts-security.attack.discovery*"
+ query:
+ bool:
+ filter:
+ - term:
+ kibana.alert.severity: "critical"
+```
+
+### Example: Delete a document
+
+The `elasticsearch.delete` action deletes a single document by its ID. The `index` and `id` parameters are used to construct the API path.
+
+```yaml
+steps:
+ - name: delete_a_doc
+ type: elasticsearch.delete
+ with:
+ index: "my-index"
+ id: "document_id_123"
+```
+
+### Example: Bulk indexing
+
+The `elasticsearch.bulk` action performs multiple indexing or delete operations in a single request. The `body` parameter must be a string containing the bulk operations in newline-delimited JSON (NDJSON) format. Each operation requires an action/metadata line followed by an optional source document line.
+
+```yaml
+steps:
+ - name: bulk_index_data
+ type: elasticsearch.bulk
+ with:
+ index: "national-parks-data"
+ body: |
+ { "index": { "_id": "1" } } <1>
+ { "name": "Yellowstone National Park", "category": "geothermal" } <2>
+ { "index": { "_id": "2" } } <1>
+ { "name": "Grand Canyon National Park", "category": "canyon" } <2>
+```
+1. **Action/metadata line**: Specifies the action and document ID
+2. **Source document line**: The document data
+
+## Generic request actions
+
+For advanced use cases or for accessing [{{es}} APIs]({{es-apis}}) that do not have a named action, use the generic `elasticsearch.request` type. This gives you full control over the HTTP request.
+
+::::{note}
+We recommend using named actions whenever possible. They are more readable and provide a stable interface for common operations.
+::::
+
+Use the following parameters in the `with` block to configure the request:
+
+| Parameter | Required | Description |
+|-----------|----------|-------------|
+| `method` | No (defaults to `GET`) | The HTTP method (`GET`, `POST`, `PUT`, or `DELETE`) |
+| `path` | Yes | The API endpoint path (for example, `/_search`, `/_cluster/health`) |
+| `body` | No | The JSON request body |
+| `query` | No | An object representing URL query string parameters |
+
+### Example: Get cluster health
+
+This example uses the generic request to call the `GET /_cluster/health` endpoint ([Get cluster health]({{es-apis}}operation/operation-cluster-health)).
+
+```yaml
+steps:
+ - name: get_cluster_health
+ type: elasticsearch.request
+ with:
+ method: GET
+ path: /_cluster/health
+```
+
+### Example: Delete documents by query
+
+This example uses the generic request to call the `POST /<index>/_delete_by_query` endpoint ([Delete documents]({{es-apis}}operation/operation-delete-by-query)).
+
+```yaml
+steps:
+ - name: delete_old_documents
+ type: elasticsearch.request
+ with:
+ method: POST
+ path: /my-index/_delete_by_query
+ body:
+ query:
+ range:
+ "@timestamp":
+ lt: "now-30d"
+```
+
+## Combine actions
+
+The following example demonstrates how to combine multiple {{es}} actions in a workflow. It searches for documents and then iterates over the results to delete each one.
+
+```yaml
+name: Search and Delete Documents
+triggers:
+ - type: manual
+steps:
+ - name: search_for_docs
+ type: elasticsearch.search
+ with:
+ index: ".alerts-security.attack.discovery.alerts-default"
+ query:
+ term:
+ host.name: "compromised-host"
+
+ - name: delete_found_docs
+ type: foreach
+ # The search results are in steps.search_for_docs.output
+ foreach: steps.search_for_docs.output.hits.hits
+ steps:
+ - name: delete_each_doc
+ type: elasticsearch.delete
+ with:
+ # The 'item' variable holds the current document from the loop
+ index: "{{ item._index }}"
+ id: "{{ item._id }}"
+```
+
+Key concepts in this example:
+
+* [Data flow](/explore-analyze/workflows/data.md#workflows-data-flow): The output of the `search_for_docs` step is available to subsequent steps at `steps.search_for_docs.output`.
+* [Foreach loop](/explore-analyze/workflows/steps/foreach.md): The `foreach` step iterates over the `hits.hits` array from the search results.
+* [Item variable](/explore-analyze/workflows/data/templating.md): Inside the loop, the `foreach.item` variable holds the current document being processed, allowing you to access its fields such as `foreach.item._index` and `foreach.item._id`.
diff --git a/explore-analyze/workflows/steps/external-systems-apps.md b/explore-analyze/workflows/steps/external-systems-apps.md
new file mode 100644
index 0000000000..7a0451c7b5
--- /dev/null
+++ b/explore-analyze/workflows/steps/external-systems-apps.md
@@ -0,0 +1,107 @@
+---
+navigation_title: External systems and apps
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about action steps for interacting with external systems such as Slack or Jira.
+---
+
+# External systems and apps steps
+
+External systems actions allow your workflows to communicate with third-party services and custom endpoints. You can interact with external systems in the following ways:
+
+* [Connector-based actions](#connector-based-actions): Use pre-configured connectors to integrate with services such as Slack and {{jira}}
+* [HTTP actions](#http-actions): Make direct HTTP requests to any API endpoint
+
+## Connector-based actions
+
+Connector-based actions use {{kib}}'s centralized {{connectors-ui}} framework. Before using them, you must first [configure a connector](/deploy-manage/manage-connectors.md).
+
+The step `type` is a keyword for the service (for example, `slack` or `jira`). You must also provide a `connector-id` at the same level as `type`.
+
+To view the available connectors, click **Actions menu** and select **External Systems & Apps**.
+
+### Identify a connector
+
+The `connector-id` field accepts one of the following:
+
+* The unique name you gave the connector (for example, `"my-slack-connector"`). This is the recommended method for readability.
+* The connector's raw ID (for example, `"d6b62e80-ff9b-11ee-8678-0f2b2c0c3c68"`).
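+
+For example, a minimal sketch that references a connector by its raw ID (reusing the example ID above) rather than by name:
+
+```yaml
+steps:
+  - name: notify_by_raw_id
+    type: slack
+    connector-id: "d6b62e80-ff9b-11ee-8678-0f2b2c0c3c68"
+    with:
+      message: "Workflow '{{ workflow.name }}' completed."
+```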
+
+### Example: Send a Slack notification
+
+This example uses a pre-configured Slack connector named `"security-alerts-channel"`.
+
+```yaml
+steps:
+ - name: notify_security_channel
+ type: slack
+ connector-id: "security-alerts-channel"
+ with:
+ message: "High-priority alert: {{ event.name }}. Please investigate immediately."
+```
+
+### Example: Create a {{jira}} issue
+
+This example uses a {{jira}} connector named `"engineering-project"`.
+
+```yaml
+steps:
+ - name: create_jira_ticket
+ type: jira
+ connector-id: "engineering-project"
+ with:
+ projectKey: "ENG"
+ issueType: "Task"
+ summary: "Automated Task: Review '{{ event.name }}'"
+ description: "Workflow '{{ workflow.name }}' requires manual review for a potential issue."
+```
+
+## HTTP actions
+
+The native `http` action is a built-in HTTP client that does not require a pre-configured connector. Use it for one-off requests to public or internal APIs.
+
+Use the following parameters in the `with` block to configure the request:
+
+| Parameter | Required | Description |
+|-----------|----------|-------------|
+| `url` | Yes | The full URL of the endpoint to call |
+| `method` | No (defaults to `GET`) | The HTTP method (`GET`, `POST`, `PUT`, or `DELETE`) |
+| `headers` | No | An object with key-value pairs for HTTP headers |
+| `body` | No | The request body (typically a JSON object) |
+
+::::{admonition} Known limitation
+The native `http` action does not have access to a centralized secret store for managing authentication credentials. If your endpoint requires authentication, you must include the credentials directly in the `headers` block.
+
+:::{dropdown} Click to show syntax example
+```yaml
+steps:
+ - name: call_secure_api
+ type: http
+ with:
+ url: "https://api.thirdparty.com/v1/data"
+ method: "GET"
+ headers:
+ Authorization: "Bearer my-secret-api-token"
+```
+:::
+::::
+
+### Example: Call a custom webhook
+
+This example makes a POST request to a custom automation endpoint, passing data from the workflow context.
+
+```yaml
+steps:
+ - name: trigger_custom_automation
+ type: http
+ with:
+ url: "https://hooks.example.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX"
+ method: "POST"
+ headers:
+ Content-Type: "application/json"
+ body:
+ event_id: "{{ event.id }}"
+ message: "Workflow action triggered by '{{ workflow.name }}'"
+```
+
diff --git a/explore-analyze/workflows/steps/flow-control-steps.md b/explore-analyze/workflows/steps/flow-control-steps.md
new file mode 100644
index 0000000000..59d79eeaa0
--- /dev/null
+++ b/explore-analyze/workflows/steps/flow-control-steps.md
@@ -0,0 +1,63 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about flow control steps for controlling workflow execution order and logic.
+---
+
+# Flow control steps
+
+Flow control steps allow you to add logic, conditionals, and loops to your workflows, making them dynamic and responsive to data. Use them to run different steps based on conditions, process items in bulk, or control timing.
+
+The following flow control steps are available:
+
+* **Conditional execution** (`if`): Run different steps based on boolean or {{kib}} Query Language (KQL) expressions
+* **Loops and iteration** (`foreach`): Iterate over arrays or collections
+* **Execution control** (`wait`): Pause step execution for a specified duration
+
+## If
+
+The `if` step evaluates a boolean or KQL expression and runs different steps based on whether the condition is true or false.
+
+```yaml
+steps:
+ - name: conditionalStep
+ type: if
+ condition:
+ steps:
+ # Steps to run if condition is true
+ else:
+ # Steps to run if condition is false (optional)
+```
+
+Refer to [](/explore-analyze/workflows/steps/if.md) for more information.
+
+## Foreach
+
+The `foreach` step iterates over an array, running a set of steps for each item in the collection.
+
+```yaml
+steps:
+ - name: loopStep
+ type: foreach
+ foreach:
+ steps:
+ # Steps to run for each item
+ # Current item is available as 'foreach.item'
+```
+
+Refer to [](/explore-analyze/workflows/steps/foreach.md) for more information.
+
+## Wait
+
+The `wait` step pauses workflow execution for a specified duration before continuing to the next step.
+
+```yaml
+steps:
+ - name: waitStep
+ type: wait
+ with:
+ duration: "5s"
+```
+
+Refer to [](/explore-analyze/workflows/steps/wait.md) for more information.
diff --git a/explore-analyze/workflows/steps/foreach.md b/explore-analyze/workflows/steps/foreach.md
new file mode 100644
index 0000000000..951a1eecd0
--- /dev/null
+++ b/explore-analyze/workflows/steps/foreach.md
@@ -0,0 +1,150 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about the foreach step for iterating over data in workflows.
+---
+
+# Foreach
+
+The `foreach` step iterates over an array and runs its nested steps once for each item in the array.
+
+Use the following parameters to configure a `foreach` step:
+
+| Parameter | Required | Description |
+|-----------|----------|-------------|
+| `name` | Yes | Unique step identifier |
+| `type` | Yes | Step type - must be `foreach` |
+| `foreach` | Yes | A template or JSON expression that evaluates to an array |
+| `steps` | Yes | An array of steps to run for each iteration |
+
+```yaml
+steps:
+ - name: loopStep
+ type: foreach
+ foreach:
+ steps:
+ # Steps to run for each item
+ # Current item is available as 'foreach.item'
+```
+
+::::{note}
+Inside the loop, the current item is always available as `foreach.item`. You cannot customize this variable name.
+::::
+
+The `foreach` field supports the following expression types:
+
+* [Template expressions](#template-expressions)
+* [JSON strings](#json-strings)
+* [JSON strings with templates](#json-strings-with-templates)
+
+## Template expressions
+
+Use `{{ }}` or `${{ }}` syntax when the array comes from context variables such as step outputs, inputs, or constants. Both syntaxes behave identically for `foreach`:
+
+```yaml
+foreach: "{{ steps.getData.output.items }}"
+foreach: "${{ steps.getData.output.items }}"
+```
+
+## JSON strings
+
+Use a plain JSON array string for static arrays known at definition time:
+
+```yaml
+foreach: '["item1", "item2", "item3"]'
+```
+
+## JSON strings with templates
+
+Use a JSON string containing `{{ }}` template expressions for dynamically built arrays with a known structure:
+
+```yaml
+foreach: '[{{ steps.getCount }}, {{ steps.getCount | plus: 1 }}]'
+```
+
+::::{note}
+Avoid using plain property paths without template syntax (for example, `foreach: 'consts.items'`). Use `foreach: "{{ consts.items }}"` instead.
+::::
+
+## Context variables
+
+The workflow engine automatically provides the following variables during `foreach` iteration. To use these variables, reference them in your step parameters with `{{ }}` syntax:
+
+| Variable | Description |
+|----------|-------------|
+| `foreach.item` | Current item in the iteration |
+| `foreach.index` | Zero-based index of the current iteration |
+| `foreach.total` | Total number of items in the array |
+| `foreach.items` | Complete array being iterated over |
+
+Example:
+
+```yaml
+message: "Processing {{ foreach.item.name }} ({{ foreach.index | plus: 1 }}/{{ foreach.total }})"
+```
+
+### Access parent context
+
+Nested `foreach` loops can access parent context using step references:
+
+```yaml
+steps:
+ - name: outer-foreach
+ type: foreach
+ foreach: "{{ outerItems }}"
+ steps:
+ - name: inner-foreach
+ type: foreach
+ foreach: "{{ innerItems }}"
+ steps:
+ - name: log-both
+ type: console
+ with:
+ message: "Outer: {{ steps.outer-foreach.index }}, Inner: {{ foreach.index }}"
+```
+
+### Access keys with dots
+
+Template expressions support bracket notation for keys that contain dots or other special characters:
+
+```yaml
+"{{ foreach.item['service.name'] }}"
+```
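+
+For example, a minimal sketch, assuming a previous step named `searchServices` returned documents whose `_source` contains a dotted `service.name` field:
+
+```yaml
+steps:
+  - name: listServices
+    type: foreach
+    foreach: "{{ steps.searchServices.output.hits.hits }}"
+    steps:
+      - name: logServiceName
+        type: console
+        with:
+          message: "Service: {{ foreach.item._source['service.name'] }}"
+```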
+
+
+## Example: Process search results
+
+This example searches for documents and enriches each result with metadata:
+
+```yaml
+name: National Parks Enrichment
+description: Enrich each park with additional data
+steps:
+ - name: searchAllParks
+ type: elasticsearch.search
+ with:
+ index: national-parks-index
+ size: 100
+ query:
+ match_all: {}
+
+ - name: enrichEachPark
+ type: foreach
+ foreach: "{{ steps.searchAllParks.output.hits.hits }}"
+ steps:
+ - name: logProcessing
+ type: console
+ with:
+ message: "Processing park: {{ foreach.item._source.title }}"
+
+ - name: addMetadata
+ type: elasticsearch.update
+ with:
+ index: national-parks-index
+ id: "{{ foreach.item._id }}"
+ doc:
+ last_processed: "{{ execution.startedAt }}"
+ workflow_run: "{{ execution.id }}"
+ category_uppercase: "{{ foreach.item._source.category | upcase }}"
+```
diff --git a/explore-analyze/workflows/steps/if.md b/explore-analyze/workflows/steps/if.md
new file mode 100644
index 0000000000..ee844b5293
--- /dev/null
+++ b/explore-analyze/workflows/steps/if.md
@@ -0,0 +1,188 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about the if step for conditional logic in workflows.
+---
+
+# If
+
+The `if` step evaluates a boolean or {{kib}} Query Language (KQL) expression and runs different steps based on whether the condition is true or false.
+
+Use the following parameters to configure an `if` step:
+
+| Parameter | Required | Description |
+|-----------|----------|-------------|
+| `name` | Yes | Unique step identifier |
+| `type` | Yes | Step type - must be `if` |
+| `condition` | Yes | A boolean or KQL expression to evaluate |
+| `steps` | Yes | An array of steps to run if the condition is true |
+| `else` | No | An array of steps to run if the condition is false |
+
+```yaml
+steps:
+ - name: conditionalStep
+ type: if
+ condition:
+ steps:
+ # Steps to run if condition is true
+ else:
+ # Steps to run if condition is false (optional)
+```
+
+The `condition` field supports the following expression types:
+
+* [Boolean expressions](#boolean-expressions)
+* [KQL expressions](#kql-expressions)
+
+## Boolean expressions
+
+Use `${{ }}` syntax when you want the expression to evaluate directly to a boolean value:
+
+```yaml
+steps:
+ - name: check-enabled
+ type: if
+ condition: "${{ inputs.isEnabled }}"
+ steps:
+ - name: process-enabled
+ type: http
+ else:
+ - name: log-disabled
+ type: console
+```
+
+If the expression evaluates to `undefined`, it defaults to `false`.
+
+## KQL expressions
+
+Use a string-based condition to evaluate the value as a KQL expression. You can use `{{ }}` templating to inject dynamic values:
+
+```yaml
+steps:
+ - name: check-status
+ type: if
+ condition: "{{ steps.fetchData.output.status }}: completed"
+ steps:
+ - name: process-data
+ type: http
+```
+
+### Supported KQL features
+
+The `if` step supports the following KQL features:
+
+#### Equality checks
+
+```yaml
+condition: "status: active"
+condition: "user.role: admin"
+condition: "isActive: true"
+condition: "count: 42"
+condition: "users[0].name: Alice" # Array index access
+```
+
+#### Range operators
+
+```yaml
+condition: "count >= 100"
+condition: "count <= 1000"
+condition: "count > 50"
+condition: "count < 200"
+condition: "count >= 100 and count <= 1000"
+```
+
+#### Wildcard matching
+
+```yaml
+condition: "fieldName:*" # Field exists
+condition: "user.name: John*" # Starts with
+condition: "user.name: *Doe" # Ends with
+condition: "txt: *ipsum*" # Contains
+condition: "user.name: J*n Doe" # Pattern
+```
+
+#### Logical operators
+
+```yaml
+condition: "status: active and isEnabled: true" # And
+condition: "status: active or status: pending" # Or
+condition: "not status: inactive" # Not
+condition: "status: active and (role: admin or role: moderator)" # Nested
+```
+
+#### Property path access
+
+```yaml
+condition: "user.info.name: John Doe" # Nested property
+condition: "steps.fetchData.output.status: completed" # Deep nesting
+condition: "users[0].name: Alice" # Array access
+condition: "users.0.name: Alice" # Alternative syntax
+```
+
+### Example: Check severity
+
+This example runs different steps based on the event severity:
+
+```yaml
+steps:
+ - name: checkSeverity
+ type: if
+    condition: "event.severity: 'critical'"
+ steps:
+ - name: handleCritical
+ type: console
+ with:
+ message: "Critical alert!"
+ else:
+ - name: handleNormal
+ type: console
+ with:
+ message: "Normal severity"
+```
+
+### Example: Check search results count
+
+This example checks the number of search results and processes them differently based on the count:
+
+```yaml
+name: National Parks Conditional Processing
+steps:
+ - name: searchParks
+ type: elasticsearch.search
+ with:
+ index: national-parks-index
+ size: 100
+
+ - name: checkResultCount
+ type: if
+ condition: "steps.searchParks.output.hits.total.value > 5"
+ steps:
+ - name: processLargeDataset
+ type: foreach
+ foreach: "{{ steps.searchParks.output.hits.hits }}"
+ steps:
+ - name: processPark
+ type: console
+ with:
+ message: "Processing park: {{ foreach.item._source.title }}"
+ else:
+ - name: handleSmallDataset
+ type: console
+ with:
+ message: "Only {{ steps.searchParks.output.hits.total.value }} parks found - manual review needed"
+```
+
+### Example: Complex KQL condition
+
+This example uses multiple logical operators to check a combination of conditions:
+
+```yaml
+steps:
+ - name: check-complex
+ type: if
+ condition: "status: active and (count >= 100 or role: admin)"
+ steps:
+ - name: process-authorized
+ type: http
+```
diff --git a/explore-analyze/workflows/steps/kibana.md b/explore-analyze/workflows/steps/kibana.md
new file mode 100644
index 0000000000..67ff855942
--- /dev/null
+++ b/explore-analyze/workflows/steps/kibana.md
@@ -0,0 +1,88 @@
+---
+navigation_title: Kibana
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about Kibana action steps for automating tasks such as creating cases and managing alerts in workflows.
+---
+
+# {{kib}} action steps
+
+{{kib}} actions are built-in steps that allow your workflows to interact with {{kib}} APIs. You can automate tasks such as creating cases, updating alerts, or interacting with other {{kib}} features.
+
+All {{kib}} actions are automatically authenticated using the permissions or API key of the user executing the workflow.
+
+There are two ways to use {{kib}} actions:
+
+* [Named actions](#named-actions): Common {{kib}} operations accessible through a simplified, high-level interface
+* [Generic request actions](#generic-request-actions): Actions that provide full control over the HTTP request for advanced use cases
+
+## Named actions
+
+Named actions provide a simplified, high-level interface for common {{kib}} operations. Each action type corresponds to a specific {{kib}} function.
+
+To view the available named actions, click **Actions menu** and select **{{kib}}**. For operations that are not available as a named action, use the [generic request action](#generic-request-actions).
+
+The following example demonstrates a common use case.
+
+### Example: Create a case
+
+The `kibana.createCaseDefaultSpace` action creates a new case in the default {{kib}} space. The parameters in the `with` block are specific to this action.
+
+```yaml
+steps:
+ - name: create_a_case
+ type: kibana.createCaseDefaultSpace
+ with:
+ title: "Suspicious Login Detected"
+ description: "Automated case created by workflow. Host '{{ event.alerts[0].host.name }}' exhibited unusual activity."
+ tags: ["workflow", "automated-response"]
+ severity: "critical"
+ connector:
+ id: "none"
+ name: "none"
+ type: ".none"
+```
+
+## Generic request actions
+
+The generic `kibana.request` type gives you full control over the HTTP request. Use it for:
+
+* Accessing [{{kib}} APIs]({{kib-apis}}) that do not have a named action
+* Advanced use cases that require specific headers or query parameters not exposed by a named action
+
+::::{note}
+We recommend using named actions whenever possible. They are more readable and provide a stable interface for common operations.
+::::
+
+Use the following parameters in the `with` block to configure the request:
+
+| Parameter | Required | Description |
+|-----------|----------|-------------|
+| `method` | No (defaults to `GET`) | The HTTP method (`GET`, `POST`, `PUT`, or `DELETE`) |
+| `path` | Yes | The API endpoint path, starting with `/api/` or `/internal/` |
+| `body` | No | The JSON request body |
+| `query` | No | An object representing URL query string parameters |
+| `headers` | No | Custom HTTP headers to include in the request. `kbn-xsrf` and `Content-Type` are added automatically |
+
+::::{note}
+You do not need to pass an `Authorization` header. The workflow engine automatically attaches the correct authentication headers based on the execution context. Do not manage or pass API keys or secrets in the `headers` block.
+::::
+
+### Example: Unisolate an endpoint
+
+This example uses the generic request action to call the Security endpoint management API to unisolate a host ([Release an isolated endpoint]({{kib-apis}}operation/operation-endpointunisolateaction)).
+
+```yaml
+steps:
+ - name: unisolate_endpoint_with_case
+ type: kibana.request
+ with:
+ method: POST
+ path: /api/endpoint/action/unisolate
+ body:
+ endpoint_ids: ["{{event.alerts[0].elastic.agent.id}}"]
+ comment: "Unisolating endpoint as part of automated cleanup."
+```
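+
+To pass URL query string parameters, use the `query` object. The following sketch assumes you want to list up to ten open cases through the Cases find endpoint (`GET /api/cases/_find`):
+
+```yaml
+steps:
+  - name: find_open_cases
+    type: kibana.request
+    with:
+      method: GET
+      path: /api/cases/_find
+      query:
+        status: "open"
+        perPage: "10"
+```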
+
+
diff --git a/explore-analyze/workflows/steps/wait.md b/explore-analyze/workflows/steps/wait.md
new file mode 100644
index 0000000000..ba9f2a9788
--- /dev/null
+++ b/explore-analyze/workflows/steps/wait.md
@@ -0,0 +1,97 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about the wait step for adding delays in workflows.
+---
+
+# Wait
+
+The `wait` step pauses workflow execution for a specified duration before continuing to the next step.
+
+Use the following parameters to configure a `wait` step:
+
+| Parameter | Required | Description |
+|-----------|----------|-------------|
+| `name` | Yes | Unique step identifier |
+| `type` | Yes | Step type - must be `wait` |
+| `with.duration` | Yes | Duration to wait before continuing (for example, `"5s"`) |
+
+```yaml
+steps:
+ - name: waitStep
+ type: wait
+ with:
+ duration: "5s"
+```
+
+## Duration format
+
+The supported units are:
+
+* Weeks: `w`
+* Days: `d`
+* Hours: `h`
+* Minutes: `m`
+* Seconds: `s`
+* Milliseconds: `ms`
+
+Duration strings must follow these format rules:
+
+* Units must be in descending order: `1w2d3h4m5s6ms`
+* Each unit can only appear once
+* No spaces between number and unit
+* Positive integer values only (no decimals, commas, negative values, or zero)
+
+```yaml
+duration: "1w"
+duration: "2d12h"
+duration: "1d"
+duration: "1h30m"
+duration: "1h"
+duration: "5m30s"
+duration: "2m"
+duration: "30s"
+duration: "2s500ms"
+duration: "500ms"
+duration: "1w3d5h20m10s"
+```
+
+## Examples
+
+Wait for 10 seconds:
+
+```yaml
+steps:
+ - name: delay
+ type: wait
+ with:
+ duration: "10s"
+```
+
+Wait for one minute after the API call completes:
+
+```yaml
+steps:
+ - name: api-call
+ type: http
+ on-failure:
+ retry:
+ max-attempts: 3
+ delay: "5s"
+
+ - name: wait-before-next
+ type: wait
+ with:
+ duration: "1m"
+```
+
+Wait for one day:
+
+```yaml
+steps:
+ - name: wait-one-day
+ type: wait
+ with:
+ duration: "1d"
+```
diff --git a/explore-analyze/workflows/templates.md b/explore-analyze/workflows/templates.md
new file mode 100644
index 0000000000..f196dfec5d
--- /dev/null
+++ b/explore-analyze/workflows/templates.md
@@ -0,0 +1,17 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Explore pre-built workflow templates to jumpstart your automation.
+---
+
+# Workflow templates [workflows-templates]
+
+Workflow templates are pre-built workflows that you can use as a starting point. Instead of building automations from scratch, pick a template and customize it to fit your needs. For example, templates are useful for:
+
+- Automating a common task
+- Learning how workflows work
+- Saving time with a ready-made example of a common automation pattern
+
+## Access templates [workflows-templates-access]
+
+[Browse](https://github.com/elastic/workflows/) available templates and examples to find one that matches your use case. Refer to the repository README to get started using templates.
\ No newline at end of file
diff --git a/explore-analyze/workflows/triggers.md b/explore-analyze/workflows/triggers.md
new file mode 100644
index 0000000000..22c1127e05
--- /dev/null
+++ b/explore-analyze/workflows/triggers.md
@@ -0,0 +1,105 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about workflow triggers and how to create and configure them.
+---
+
+# Triggers
+
+Triggers determine when your workflows start executing. Every workflow must have at least one trigger defined.
+
+A trigger is an event or condition that initiates a workflow. Without a trigger, a workflow remains dormant. Triggers connect workflows to real-world signals, schedules, or user actions.
+
+Triggers also provide initial context to the workflow. For example, a workflow triggered by an alert carries the alert's metadata, entities, and source events. This context shapes how the workflow executes.
+
+## Trigger types
+
+The following types of triggers are available:
+
+* [Manual triggers](#manual-triggers)
+* [Scheduled triggers](#scheduled-triggers)
+* [Alert triggers](#alert-triggers)
+
+### Manual triggers
+
+Manual triggers run workflows on-demand through the UI or API. They require explicit user action to start the workflow.
+
+Use manual triggers for:
+
+* Testing and development
+* One-off data processing tasks
+* Administrative actions
+* Workflows that require a human decision to start
+
+Manual trigger example:
+
+```yaml
+triggers:
+ - type: manual
+```
+
+Refer to [](/explore-analyze/workflows/triggers/manual-triggers.md) for more information.
+
+### Scheduled triggers
+
+Scheduled triggers run workflows automatically at specific times or intervals. You can configure schedules using:
+
+* Intervals: Run every _x_ minutes, hours, or days
+* RRule expressions: Run at specific times (for example, daily at 2 AM)
+
+Use scheduled triggers for:
+
+* Daily reports
+* Regular data cleanup
+* Periodic health checks
+* Scheduled data synchronization
+
+Scheduled trigger example:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ every: 5m
+```
+
+Refer to [](/explore-analyze/workflows/triggers/scheduled-triggers.md) for more information.
+
+### Alert triggers
+
+Alert triggers run workflows automatically when a detection or alerting rule generates an alert. The workflow receives the full alert context, including all fields and values.
+
+Use alert triggers for:
+
+* Alert enrichment and triage
+* Automated incident response
+* Case creation and assignment
+* Notification routing based on alert severity
+
+Alert trigger example:
+
+```yaml
+triggers:
+ - type: alert
+```
+
+Refer to [](/explore-analyze/workflows/triggers/alert-triggers.md) for more information.
+
+## Trigger context
+
+Each trigger type provides different data to the workflow context through the `event` field:
+
+* **Manual**: User information and any parameters passed
+* **Scheduled**: Execution time and schedule information
+* **Alert**: Complete alert data including fields, severity, and rule information
+
+Access trigger data in your workflow using template variables:
+
+```yaml
+steps:
+ - name: logTriggerInfo
+ type: console
+ with:
+ message: "Workflow started at {{ execution.startedAt }}"
+ details: "Event data: {{ event | json(2) }}"
+```
\ No newline at end of file
diff --git a/explore-analyze/workflows/triggers/alert-triggers.md b/explore-analyze/workflows/triggers/alert-triggers.md
new file mode 100644
index 0000000000..8ba94e13ac
--- /dev/null
+++ b/explore-analyze/workflows/triggers/alert-triggers.md
@@ -0,0 +1,77 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Understand alert triggers and how to create and configure them.
+---
+
+# Alert triggers
+
+Alert triggers run workflows automatically when detection or alerting rules generate an alert. Use alert triggers for alert enrichment, automated incident response, case creation, or notification routing.
+
+When a rule generates an alert that triggers your workflow, the trigger provides rich context data to the workflow through the `event` field.
+
+To set up an alert trigger, follow these steps:
+
+:::::{stepper}
+
+::::{step} Define an alert trigger
+Create a workflow with an alert trigger:
+
+```yaml
+name: Security Alert Response
+description: Enriches and triages security alerts
+enabled: true
+triggers:
+ - type: alert
+steps:
+ ....
+```
+::::
+
+::::{step} Configure the alert rule
+After creating your workflow, configure your alert rule to trigger it.
+
+::::{tab-set}
+
+:::{tab-item} Alerting rules
+1. Go to **{{rules-ui}}** in **{{stack-manage-app}}** or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
+2. Find or create the alerting rule you want to trigger the workflow.
+3. In the rule settings, under **Actions**, select **Add action**.
+4. Select **Workflows**.
+5. Select your workflow from the dropdown or create a new one. You can only select enabled workflows.
+6. Under **Action frequency**, choose whether to run separate workflows for each generated alert.
+7. (Optional) Add multiple workflows by selecting **Add action** again.
+8. Create or save the rule.
+
+:::{image} /explore-analyze/images/workflows-alerting-rule-action.png
+:alt: Alerting rule settings showing a workflow selected as an action
+:screenshot:
+:::
+
+:::
+
+:::{tab-item} Security detection rules
+1. Go to **Detection rules (SIEM)** in the navigation menu or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md).
+2. Find or create the detection rule you want to trigger the workflow.
+3. In the rule settings, under **Actions**, select **Workflows**.
+4. Select your workflow from the dropdown or create a new one. You can only select enabled workflows.
+5. Under **Action frequency**, choose whether to run separate workflows for each generated alert.
+6. (Optional) Add multiple workflows by selecting **Add action**.
+7. Create or save the rule.
+
+:::{image} /explore-analyze/images/workflows-detection-rule-action.png
+:alt: Detection rule settings showing a workflow selected as an action
+:screenshot:
+:::
+
+:::
+
+::::
+
+::::
+
+:::::
+
+When the configured rule generates an alert, your workflow automatically executes with the alert context.
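+
+Inside the workflow, the alert data is available through the `event` field. For example, a minimal sketch (assuming the triggering alert includes a `host.name` field) that logs part of the alert context:
+
+```yaml
+steps:
+  - name: log_alert_context
+    type: console
+    with:
+      message: "Alert received for host '{{ event.alerts[0].host.name }}'"
+```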
+
diff --git a/explore-analyze/workflows/triggers/manual-triggers.md b/explore-analyze/workflows/triggers/manual-triggers.md
new file mode 100644
index 0000000000..90ea990e02
--- /dev/null
+++ b/explore-analyze/workflows/triggers/manual-triggers.md
@@ -0,0 +1,62 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Understand manual triggers and how to create and configure them.
+---
+
+# Manual triggers
+
+Manual triggers run workflows on-demand through the UI or API. They require explicit user action to start a workflow. Use manual triggers for testing, one-off tasks, administrative actions, or workflows that require a human decision to start.
+
+To define a manual trigger, use the following syntax:
+
+```yaml
+triggers:
+ - type: manual
+```
+
+This allows you to run a workflow manually by:
+
+* Clicking **Run** in the Workflows UI
+* Calling the workflow execution API, either directly or from an external system
+
+## Input parameters
+
+Manual triggers can accept input parameters, which you can reference in any step. When you define inputs at the workflow level, users are prompted to provide values when they run the workflow.
+
+```yaml
+name: Manual Processing Workflow
+inputs:
+ - name: environment
+ type: string
+ required: true
+ default: "staging"
+ description: "Target environment for processing"
+
+ - name: batchSize
+ type: number
+ required: false
+ default: 100
+ description: "Number of records to process"
+
+ - name: dryRun
+ type: boolean
+ required: false
+ default: true
+ description: "Run in test mode without making changes"
+
+triggers:
+ - type: manual
+
+steps:
+ - name: validateInputs
+ type: console
+ with:
+ message: |
+ Starting workflow with:
+ - Environment: {{ inputs.environment }}
+ - Batch Size: {{ inputs.batchSize }}
+ - Dry Run: {{ inputs.dryRun }}
+```
+
diff --git a/explore-analyze/workflows/triggers/scheduled-triggers.md b/explore-analyze/workflows/triggers/scheduled-triggers.md
new file mode 100644
index 0000000000..baf7ebf70d
--- /dev/null
+++ b/explore-analyze/workflows/triggers/scheduled-triggers.md
@@ -0,0 +1,231 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Understand scheduled triggers and how to create and configure them.
+---
+
+# Scheduled triggers
+
+Scheduled triggers run workflows automatically at specific times or intervals, without requiring manual intervention. Use scheduled triggers for recurring tasks like reports, data cleanup, or periodic health checks.
+
+You can configure scheduled triggers using:
+
+* **Interval-based scheduling**: Run on a recurring interval (every _x_ minutes, hours, or days)
+* **Recurrence rule (RRule) expressions**: Run at specific times in the specified timezone (for example, daily at 2 AM EST)
+
+## Interval-based scheduling
+
+Interval-based scheduling runs a workflow repeatedly at a fixed interval.
+
+The following example shows the basic syntax for an interval-based scheduled trigger:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ every:
+```
+
+The supported units are:
+
+* Seconds: `s` (minimum supported value: `30s`)
+* Minutes: `m`
+* Hours: `h`
+* Days: `d`
+
+### Examples [interval-examples]
+
+Every 5 minutes:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ every: 5m
+```
+
+Every hour:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ every: 1h
+```
+
+Every day:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ every: 1d
+```
+
+Every week:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ every: 7d
+```
+
+## RRule-based scheduling
+
+RRule-based scheduling runs a workflow at specific times using recurrence rules. This option supports daily, weekly, and monthly frequencies with timezone awareness.
+
+The following example shows the basic syntax for an RRule-based scheduled trigger:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: DAILY
+ interval: 1
+ tzid: UTC
+ dtstart: 2024-01-15T09:00:00Z
+ byhour: []
+ byminute: []
+ byweekday: []
+ bymonthday: []
+```
+
+### RRule fields
+
+The following table describes the available fields for configuring RRule-based scheduled triggers:
+
+| Field | Required | Description | Values |
+| --- | --- | --- | --- |
+| `freq` | Yes | Frequency type | `DAILY`, `WEEKLY`, or `MONTHLY` |
+| `interval` | Yes | Interval between occurrences | Positive integer (for example, `2` with `freq: WEEKLY` runs every 2 weeks) |
+| `tzid` | Yes | Timezone identifier | For example, `UTC`, `America/New_York`, `Europe/London` |
+| `dtstart` | No | Start date | ISO format (for example, `2024-01-15T09:00:00Z`) |
+| `byhour` | No | Hours to run | Array of integers `0`-`23` |
+| `byminute` | No | Minutes to run | Array of integers `0`-`59` |
+| `byweekday` | Required when `freq` is `WEEKLY` | Days of the week | Array of weekdays: `MO`, `TU`, `WE`, `TH`, `FR`, `SA`, `SU` |
+| `bymonthday` | Required when `freq` is `MONTHLY` | Days of the month | Array of integers `1`-`31`. Use negative values to count from the end of the month (for example, `-1` for the last day of the month) |
+
+### Examples [rrule-examples]
+
+Daily at multiple times (6 AM, 12 PM, 6 PM) UTC:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: DAILY
+ interval: 1
+ tzid: UTC
+ byhour: [6, 12, 18]
+ byminute: [0]
+```
+
+Daily with a custom start date at 9 AM UTC:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: DAILY
+ interval: 1
+ tzid: UTC
+ dtstart: 2024-01-15T09:00:00Z
+ byhour: [9]
+ byminute: [0]
+```
+
+Every weekday at 8 AM and 5 PM EST:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: DAILY
+ interval: 1
+ tzid: America/New_York
+ byweekday: [MO, TU, WE, TH, FR]
+ byhour: [8, 17]
+ byminute: [0]
+```
+
+Weekly - every Tuesday at 10:30 AM UTC:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: WEEKLY
+ interval: 1
+ tzid: UTC
+ byweekday: [TU]
+ byhour: [10]
+ byminute: [30]
+```
+
+Every 2 weeks on Monday at 9 AM UTC:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: WEEKLY
+ interval: 2
+ tzid: UTC
+ byweekday: [MO]
+ byhour: [9]
+ byminute: [0]
+```
+
+Monthly on the 1st and 15th at 10:30 AM UTC:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: MONTHLY
+ interval: 1
+ tzid: UTC
+ bymonthday: [1, 15]
+ byhour: [10]
+ byminute: [30]
+```
+
+Monthly on the last day of the month at 11 PM UTC:
+
+```yaml
+triggers:
+ - type: scheduled
+ with:
+ rrule:
+ freq: MONTHLY
+ interval: 1
+ tzid: UTC
+ bymonthday: [-1]
+ byhour: [23]
+ byminute: [0]
+```
+
diff --git a/explore-analyze/workflows/use-cases.md b/explore-analyze/workflows/use-cases.md
new file mode 100644
index 0000000000..5bc30b7261
--- /dev/null
+++ b/explore-analyze/workflows/use-cases.md
@@ -0,0 +1,8 @@
+---
+applies_to:
+ stack: preview 9.3
+ serverless: preview
+description: Learn about common workflow use cases for search, observability, and security.
+---
+
+# Use cases
\ No newline at end of file