This tutorial uses the sample **Q&A on Your Data** RAG prompt flow. This flow contains a **lookup** node that uses the vector index lookup tool to search the indexed docs for content related to the input question. The indexed docs are stored in the workspace storage blob.
1. On the **Connections** tab of the Azure Machine Learning studio **Prompt flow** page, [set up a connection](prompt-flow/get-started-prompt-flow.md#set-up-connection) to your Azure OpenAI resource if you don't already have one.
1. Select **Create** on the Azure Machine Learning studio **Prompt flow** page, and on the **Create a new flow** screen, select **Clone** on the **Q&A on Your Data** tile to clone the prompt flow.
The cloned flow opens in the authoring interface.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/my-flow.png" alt-text="Screenshot of bring your own data QnA in the Azure Machine Learning studio." lightbox = "./media/how-to-retrieval-augmented-generation-cloud-to-local/my-flow.png":::
1. In the **lookup** step of your cloned flow, populate the **mlindex_content** input with your vector index information.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/embed-question.png" alt-text="Screenshot of lookup node in studio showing inputs.":::
1. Populate the **answer_the_question_with_context** step with your **Connection** and **Deployment** information for the **chat** API.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/my-cloud-connection.png" alt-text="Screenshot of answer_the_question_with_context node in studio showing inputs.":::
1. Make sure the example flow runs correctly, and save it.
The rest of this article details how to use the VS Code Prompt flow extension to edit the flow. If you don't want to use the Prompt flow extension, you can open the unzipped folder in any integrated development environment (IDE) and use the CLI to edit the files. For more information, see the [Prompt flow quick start](https://microsoft.github.io/promptflow/how-to-guides/quick-start.html#quick-start).
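If you prefer the CLI route, the flow in the unzipped folder can be exercised with the `pf` command-line tool. A minimal session might look like the following sketch; the test question is only an example, and the commands assume you run them from inside the flow folder:

```shell
# Install the prompt flow SDK/CLI, the built-in tools, and the vector DB
# tool package that the lookup node needs when it runs locally.
pip install promptflow promptflow-tools promptflow-vectordb

# From inside the unzipped flow folder, run the flow once with a test question.
pf flow test --flow . --inputs question="How to use SDK V2?"
```

Running `pf flow test` executes the flow locally and prints the node outputs, which is the CLI equivalent of the VS Code extension steps that follow.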
1. In VS Code with the Prompt flow extension enabled, open the unzipped prompt flow folder.
1. Select the **Prompt flow** icon in the left menu to open the Prompt flow management pane.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/vs-code-extension-toolbar.png" alt-text="Screenshot of the prompt flow VS Code extension icon in the VS Code left menu.":::
### Create the connections
To use the vector index lookup tool locally, you need to create the same connections locally that the flow uses in the cloud.
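For example, an Azure OpenAI connection can be defined in a YAML file and registered with the `pf` CLI. The field names below follow the prompt flow connection format; the connection name, key, and endpoint are placeholders you replace with your own values:

```yaml
# azure_openai.yaml - a local Azure OpenAI connection definition (all values are placeholders)
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/AzureOpenAIConnection.schema.json
name: azure_open_ai_connection
type: azure_open_ai
api_key: "<your-api-key>"
api_base: "https://<your-resource-name>.openai.azure.com/"
api_type: azure
api_version: "2023-07-01-preview"
```

Register it with `pf connection create --file azure_openai.yaml`. Using the same connection name that the cloud flow references lets the nodes resolve the connection without further edits.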
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/visual-editor.png" alt-text="Screenshot of the flow dag yaml file with the visual editor highlighted in VS Code." lightbox = "./media/how-to-retrieval-augmented-generation-cloud-to-local/visual-editor.png":::
1. In the visual editor version of *flow.dag.yaml*, scroll to the **lookup** node, which consumes the vector index lookup tool in this flow. Check the path you specified for your indexed docs. All publicly accessible paths are supported.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/search-blob.png" alt-text="Screenshot of search question from indexed docs node in VS Code showing the inputs.":::
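As a rough sketch, the **lookup** node entry in *flow.dag.yaml* has the following shape. The tool identifier comes from the `promptflow-vectordb` package; the path value, the upstream node name, and `top_k` are illustrative placeholders, not values from your flow:

```yaml
# Illustrative shape of the lookup node in flow.dag.yaml (placeholder values)
- name: lookup
  type: python
  source:
    type: package
    tool: promptflow_vectordb.tool.vector_index_lookup.VectorIndexLookup.search
  inputs:
    path: "<publicly accessible path to your vector index>"
    query: ${embed_the_question.output}  # embedding of the input question
    top_k: 2
```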
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/generate-node.png" alt-text="Screenshot of the Python code file with the vector tool package name highlighted.":::
1. Scroll to the **answer_the_question_with_context** node and make sure its connection is the same as the local connection you created. Check the **deployment_name**, which specifies the model deployment this node uses.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/answer-connection.png" alt-text="Screenshot of answer the question with context node with the connection highlighted.":::
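Sketched in *flow.dag.yaml*, the node looks roughly like this; the deployment name, prompt file, and upstream node name are illustrative placeholders:

```yaml
# Illustrative answer_the_question_with_context node (placeholder values)
- name: answer_the_question_with_context
  type: llm
  source:
    type: code
    path: answer_the_question_with_context.jinja2
  inputs:
    deployment_name: <your-chat-model-deployment>
    prompt_text: ${generate_prompt_context.output}
  connection: azure_open_ai_connection  # must match your local connection name
  api: chat
```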
### Test and run the flow
Scroll to the top of the flow and fill in the **Inputs** value with a single question for this test run, such as **How to use SDK V2?**, and then select the **Run** icon to run the flow.
:::image type="content" source="./media/how-to-retrieval-augmented-generation-cloud-to-local/flow-run.png" alt-text="Screenshot of the flow dag YAML file showing inputs and highlighting value of the question input and run button.":::
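Beyond a single test question, the same flow can be submitted as a batch run over a JSONL file of questions with the `pf` CLI. The data file name and column name here are placeholders for your own test data:

```shell
# Submit a batch run: each line of questions.jsonl is one {"question": "..."} record.
pf run create --flow . --data ./questions.jsonl \
  --column-mapping question='${data.question}' --stream
```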
## Related content
- [Get started with prompt flow](prompt-flow/get-started-prompt-flow.md)
- [Create and deploy an Azure OpenAI Service resource](/azure/ai-services/openai/how-to/create-resource)
- [Create a vector index in an Azure Machine Learning prompt flow (preview)](how-to-create-vector-index.md)
- [Integrate prompt flow with LLM-based application DevOps](prompt-flow/how-to-integrate-with-llm-app-devops.md)