Commit 2c71d14

Remove preview attributes
1 parent 4a5c1e5 commit 2c71d14

File tree: 1 file changed (+3, -7 lines changed)

articles/logic-apps/parse-document-chunk-text.md

Lines changed: 3 additions & 7 deletions
@@ -6,19 +6,15 @@ ms.suite: integration
 ms.collection: ce-skilling-ai-copilot
 ms.reviewer: estfan, azla
 ms.topic: how-to
-ms.date: 02/11/2025
+ms.date: 08/14/2025
 ms.update-cycle: 180-days
-# Customer intent: As a developer using Azure Logic Apps, I want to parse a document or chunk text that I want to use with Azure AI operations for my workflow in Azure Logic Apps.
+# Customer intent: As an integration developer using Azure Logic Apps, I want to parse a document or chunk text that I want to use with Azure AI operations for my workflow in Azure Logic Apps.
 ---
 
-# Parse or chunk content for workflows in Azure Logic Apps (Preview)
+# Parse or chunk content for workflows in Azure Logic Apps
 
 [!INCLUDE [logic-apps-sku-consumption-standard](../../includes/logic-apps-sku-consumption-standard.md)]
 
-> [!IMPORTANT]
-> This capability is in preview and is subject to the
-> [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
 Sometimes you have to convert content into tokens, which are words or chunks of characters, or divide a large document into smaller pieces before you can use this content with some actions. For example, the **Azure AI Search** or **Azure OpenAI** actions expect tokenized input and can handle only a limited number of tokens.
 
 For these scenarios, use the **Data Operations** actions named **Parse a document** and **Chunk text** in your logic app workflow. These actions respectively transform content, such as a PDF document, CSV file, Excel file, and so on, into tokenized string output and then split the string into pieces, based on the number of tokens. You can then reference and use these outputs with subsequent actions in your workflow.
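The split-by-token-count behavior that the **Chunk text** action performs can be sketched outside Logic Apps. The following is a minimal illustration only, not the action's implementation: it treats whitespace-separated words as tokens, whereas the real tokenizers behind services such as Azure OpenAI split on subword units, and the function name and parameters here are hypothetical.

```python
def chunk_text(text: str, max_tokens: int = 5) -> list[str]:
    """Split text into chunks of at most max_tokens tokens.

    Illustration only: a "token" here is a whitespace-separated word.
    Production tokenizers (for example, those used by Azure OpenAI)
    split on subword units, so real token counts differ.
    """
    tokens = text.split()
    # Step through the token list in windows of max_tokens,
    # rejoining each window into one chunk string.
    return [
        " ".join(tokens[i:i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]

chunks = chunk_text("one two three four five six seven", max_tokens=3)
print(chunks)  # → ['one two three', 'four five six', 'seven']
```

Each chunk stays within the token budget, so a downstream action with a limited input size can process the pieces one at a time, for example inside a loop.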
