Commit 6a2d6e9

fix
1 parent 84dae31 commit 6a2d6e9

1 file changed (+33 -33 lines)

articles/machine-learning/prompt-flow/how-to-develop-flow.md

Lines changed: 33 additions & 33 deletions

@@ -1,5 +1,5 @@
 ---
-title: Develop a prompt flow
+title: Develop prompt flow
 titleSuffix: Azure Machine Learning
 description: Learn how to develop a prompt flow and a chat flow in Azure Machine Learning studio.
 services: machine-learning

@@ -14,7 +14,7 @@ ms.author: lagayhar
 ms.reviewer: jinzhong
 ms.date: 10/15/2024
 ---
-# Develop a prompt flow
+# Develop prompt flow

 Prompt flow is a development tool that streamlines the development cycle of AI applications that are powered by Large Language Models (LLMs). In this article, you learn how to create and develop a prompt flow and a chat flow in Azure Machine Learning studio.

@@ -24,7 +24,7 @@ As the momentum for LLM-based AI applications grows, prompt flow provides a comp
 - Easily test, debug, and iterate your flows.
 - Create prompt variants and compare their performance.

-## Create and develop a prompt flow
+## Create and develop your prompt flow

 To create a prompt flow, select **Prompt flow** in the Azure Machine Learning studio left navigation, and then select **Create** on the **Prompt flow** page.

@@ -76,13 +76,13 @@ You can set node **Inputs** and **Outputs** in the following ways:

 After you finish composing a prompt or Python script, select **Validate and parse input** for the system to automatically parse the node input based on the prompt template and Python function input.

-You can link nodes by referencing node output. For example, you can reference the LLM node output in the Python node input so the Python node consumes the LLM node output. In the **Graph** view you can see the two nodes linked together.
+You can link nodes by referencing node output. For example, you can reference the LLM node output in the Python node input so the Python node consumes the LLM node output. In the **Graph** view, you can see the two nodes linked together.

 #### LLM nodes

 For an LLM node for Azure OpenAI, you need to select **Connection**, **Api**, and **deployment_name**, and set the **Prompt**. You use the connection to securely store and manage secret keys or other sensitive credentials required for interacting with Azure OpenAI.

-If you don't already have a connection, create it before you add an LLM node, and make sure the Azure OpenAI resource has a **chat** or **completion** deployment. For more information, see [Set up a connection](get-started-with-prompt-flow.md#set-up-a-connection) and [Create a resource and deploy a model using Azure OpenAI](/azure/cognitive-services/openai/how-to/create-resource).
+If you don't already have a connection, create it before you add an LLM node, and make sure the Azure OpenAI resource has a **chat** or **completion** deployment. For more information, see [Set up a connection](get-started-prompt-flow.md#set-up-a-connection) and [Create a resource and deploy a model using Azure OpenAI](/azure/cognitive-services/openai/how-to/create-resource).

 #### Python nodes

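For context on the node linking and Python nodes that this hunk touches: a Python node is just a decorated function whose parameters surface as the node's **Inputs**. The following is a minimal illustrative sketch, assuming the `promptflow` package's `@tool` decorator; the function name `postprocess` and the input name `llm_output` are hypothetical.

```python
# Minimal illustrative sketch of a Python node (assumes the promptflow package).
# The decorated function is the node; its parameters become the node's Inputs.
from promptflow import tool


@tool
def postprocess(llm_output: str) -> str:
    """Hypothetical Python node that consumes an LLM node's output."""
    # Example post-processing: trim surrounding whitespace before returning
    # the value as this node's output.
    return llm_output.strip()
```

In the flow, the `llm_output` input of such a node would typically reference the LLM node's output (a `${<llm node name>.output}` style reference), which is what links the two nodes in the **Graph** view.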

@@ -159,34 +159,34 @@ To help you manage chat history, `chat_history` in the **Inputs** section is res

 Chat history is structured as a list of inputs and outputs. All interactions in the chat box, including user chat inputs, generated chat outputs, and other flow inputs and outputs, are automatically stored in chat history.

-```json
-[
-  {
-    "inputs": {
-      "<flow input 1>": "xxxxxxxxxxxxxxx",
-      "<flow input 2>": "xxxxxxxxxxxxxxx",
-      "<flow input N>": "xxxxxxxxxxxxxxx"
-    },
-    "outputs": {
-      "<flow output 1>": "xxxxxxxxxxxx",
-      "<flow output 2>": "xxxxxxxxxxxxx",
-      "<flow output M>": "xxxxxxxxxxxxx"
-    }
-  },
-  {
-    "inputs": {
-      "<flow input 1>": "xxxxxxxxxxxxxxx",
-      "<flow input 2>": "xxxxxxxxxxxxxxx",
-      "<flow input N>": "xxxxxxxxxxxxxxx"
-    },
-    "outputs": {
-      "<flow output 1>": "xxxxxxxxxxxx",
-      "<flow output 2>": "xxxxxxxxxxxxx",
-      "<flow output M>": "xxxxxxxxxxxxx"
-    }
-  }
-]
-```
+```json
+[
+  {
+    "inputs": {
+      "<flow input 1>": "xxxxxxxxxxxxxxx",
+      "<flow input 2>": "xxxxxxxxxxxxxxx",
+      "<flow input N>": "xxxxxxxxxxxxxxx"
+    },
+    "outputs": {
+      "<flow output 1>": "xxxxxxxxxxxx",
+      "<flow output 2>": "xxxxxxxxxxxxx",
+      "<flow output M>": "xxxxxxxxxxxxx"
+    }
+  },
+  {
+    "inputs": {
+      "<flow input 1>": "xxxxxxxxxxxxxxx",
+      "<flow input 2>": "xxxxxxxxxxxxxxx",
+      "<flow input N>": "xxxxxxxxxxxxxxx"
+    },
+    "outputs": {
+      "<flow output 1>": "xxxxxxxxxxxx",
+      "<flow output 2>": "xxxxxxxxxxxxx",
+      "<flow output M>": "xxxxxxxxxxxxx"
+    }
+  }
+]
+```
 > [!NOTE]
 > When you conduct tests in the **Chat** box, you automatically save chat history. For batch runs, you must include chat history within the batch run dataset. If there's no chat history available, set the `chat_history` to an empty list `[]` within the batch run dataset.

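To make the batch-run note above more concrete, here is a minimal sketch of building a JSON Lines batch run dataset in which a row with no prior conversation sets `chat_history` to an empty list `[]`. The flow input name `question`, the output name `answer`, and the file name are assumptions for illustration, not taken from the article.

```python
# Sketch: write a JSON Lines dataset for a batch run of a chat flow.
# Rows with no previous turns set chat_history to an empty list [].
import json

rows = [
    # Hypothetical flow input "question"; no prior conversation for this row.
    {"question": "What is prompt flow?", "chat_history": []},
    # A row carrying one earlier turn, using the inputs/outputs structure shown above.
    {
        "question": "How do I test it?",
        "chat_history": [
            {
                "inputs": {"question": "What is prompt flow?"},
                "outputs": {"answer": "A development tool for LLM-based applications."},
            }
        ],
    },
]

with open("batch_run_data.jsonl", "w", encoding="utf-8") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")
```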

@@ -217,6 +217,6 @@ The **Chat** box provides an interactive way to test your chat flow by simulatin

 ## Related content

-- [Batch run using more data and evaluate the flow performance](how-to-bulk-test-evaluate-flow.md)
+- [Submit batch run and evaluate a flow](how-to-bulk-test-evaluate-flow.md)
 - [Tune prompts using variants](how-to-tune-prompts-using-variants.md)
-- [Deploy a flow](how-to-deploy-for-real-time-inference.md)
+- [Deploy a flow as a managed online endpoint for real-time inference](how-to-deploy-for-real-time-inference.md)
