articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md (1 addition, 2 deletions)
@@ -42,7 +42,7 @@ Workspace managed virtual network is the recommended way to support network isol

   az ml workspace provision-network --subscription <sub_id> -g <resource_group_name> -n <workspace_name>
   ```

- 2. Add workspace MSI as `Storage File Data Privileged Contributor`and `Storage Table Data Contributor`to storage account linked with workspace.
+ 2. Add workspace MSI as `Storage File Data Privileged Contributor` to storage account linked with workspace.

   2.1 Go to Azure portal, find the workspace.
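Step 2 in the hunk above is described as a portal operation. A minimal CLI sketch of the same role assignment follows, assuming the workspace uses a system-assigned managed identity (exposed as `identity.principal_id` in the `az ml workspace show` output); the resource names are placeholders.

```bash
# Placeholders; substitute your resource group, workspace, and linked storage account names.
principal_id=$(az ml workspace show -g <resource_group_name> -n <workspace_name> \
  --query identity.principal_id -o tsv)
storage_id=$(az storage account show -g <resource_group_name> -n <storage_account_name> \
  --query id -o tsv)

# Grant the workspace managed identity the role prompt flow needs on the linked storage account.
az role assignment create \
  --assignee-object-id "$principal_id" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage File Data Privileged Contributor" \
  --scope "$storage_id"
```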
@@ -66,7 +66,6 @@ Workspace managed virtual network is the recommended way to support network isol

   :::image type="content" source="./media/how-to-secure-prompt-flow/managed-identity-workspace.png" alt-text="Diagram showing how to assign storage file data privileged contributor role to workspace managed identity." lightbox = "./media/how-to-secure-prompt-flow/managed-identity-workspace.png":::

   > [!NOTE]
-  > You need follow the same process to assign `Storage Table Data Contributor` role to workspace managed identity.
   > This operation might take several minutes to take effect.

3. If you want to communicate with [private Azure Cognitive Services](../../ai-services/cognitive-services-virtual-networks.md), you need to add related user defined outbound rules to related resource. The Azure Machine Learning workspace creates private endpoint in the related resource with auto approve. If the status is stuck in pending, go to related resource to approve the private endpoint manually.
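When the private endpoint created by the managed network stays in a pending state, the approval can also be done from the CLI instead of the portal. A minimal sketch, assuming the target is an Azure AI services (Cognitive Services) account and using placeholder resource IDs:

```bash
# Placeholder resource ID of the Cognitive Services account the workspace connects to.
resource_id="/subscriptions/<sub_id>/resourceGroups/<resource_group_name>/providers/Microsoft.CognitiveServices/accounts/<account_name>"

# List private endpoint connections on the target resource to find the pending one.
az network private-endpoint-connection list --id "$resource_id" -o table

# Approve the pending connection created by the workspace managed virtual network.
az network private-endpoint-connection approve \
  --id "<private_endpoint_connection_id_from_the_previous_output>" \
  --description "Approved for Azure Machine Learning managed network"
```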
articles/machine-learning/prompt-flow/tools-reference/troubleshoot-guidance.md (20 additions, 2 deletions)
@@ -1,7 +1,7 @@
  ---
  title: Troubleshoot guidance
  titleSuffix: Azure Machine Learning
- description: This article addresses frequent questions about tool usage.
+ description: This article addresses frequent questions about prompt flow usage.
  services: machine-learning
  ms.service: machine-learning
  ms.subservice: prompt-flow
@@ -16,7 +16,7 @@ ms.date: 09/05/2023

  # Troubleshoot guidance

- This article addresses frequent questions about tool usage.
+ This article addresses frequent questions about prompt flow usage.

  ## "Package tool isn't found" error occurs when you update the flow for a code-first experience
@@ -165,3 +165,21 @@ Follow these steps to find Python packages installed in runtime:
  - Run the flow. Then you can find `packages.txt`in the flow folder.

    :::image type="content" source="../media/faq/list-packages.png" alt-text="Screenshot that shows finding Python packages installed in runtime." lightbox ="../media/faq/list-packages.png":::
+
+ ## Flow run related issues
+
+ ### How to find the raw inputs and outputs of the LLM tool for further investigation?
+
+ On the prompt flow page for a successful run, you can find the raw inputs and outputs of the LLM tool in the output section. Select the `view full output` button to view the full output.
+
+ :::image type="content" source="../media/faq/view-full-output.png" alt-text="Screenshot that shows view full output on LLM node." lightbox ="../media/faq/view-full-output.png":::
+
+ The `Trace` section shows each request and response to the LLM tool. You can check the raw message sent to the LLM model and the raw response from the LLM model.
+
+ :::image type="content" source="../media/faq/trace-llm-tool.png" alt-text="Screenshot that shows the raw request sent to the LLM model and the response from the LLM model." lightbox ="../media/faq/trace-llm-tool.png":::
+
+ ## How to fix a 429 error from Azure OpenAI?
+
+ If you encounter a 429 error from Azure OpenAI, it means you have reached the rate limit of Azure OpenAI. You can check the error message in the output section of the LLM node. Learn more about the [Azure OpenAI rate limit](../../../ai-services/openai/quotas-limits.md).
+
+ :::image type="content" source="../media/faq/429-rate-limit.png" alt-text="Screenshot that shows 429 rate limit error from Azure OpenAI." lightbox ="../media/faq/429-rate-limit.png":::
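A common mitigation for 429 responses is to retry with backoff until the throttling window passes; reducing concurrency or requesting a higher quota is the longer-term fix. A minimal shell sketch, assuming a chat-completions deployment and placeholder endpoint, deployment name, and API version:

```bash
# Placeholders; substitute your resource endpoint, deployment name, and a supported api-version.
endpoint="https://<your-resource>.openai.azure.com"
deployment="<your-deployment-name>"
api_version="<api-version>"

for attempt in 1 2 3 4 5; do
  # Capture only the HTTP status code; the response body goes to response.json.
  status=$(curl -s -o response.json -w "%{http_code}" \
    -H "api-key: $AZURE_OPENAI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"messages":[{"role":"user","content":"ping"}]}' \
    "$endpoint/openai/deployments/$deployment/chat/completions?api-version=$api_version")

  if [ "$status" != "429" ]; then
    break  # success or a non-throttling error; stop retrying
  fi

  # Exponential backoff before the next attempt.
  sleep $((2 ** attempt))
done
```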