
Commit 8020c8b

committed
fixed according PR review
1 parent 8fb9bb8 commit 8020c8b


3 files changed: +5 −5 lines changed


articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md

Lines changed: 1 addition & 1 deletion
@@ -91,7 +91,7 @@ Workspace managed virtual network is the recommended way to support network isol
 
 ## Known limitations
 
-- Workspace hub / lean workspace and AI studio don't support bring your own virtual network.
+- AI studio doesn't support bring your own virtual network; it only supports workspace managed virtual network.
 - Managed online endpoint only supports workspace with managed virtual network. If you want to use your own virtual network, you might need one workspace for prompt flow authoring with your virtual network and another workspace for prompt flow deployment using managed online endpoint with workspace managed virtual network.
 
 ## Next steps

articles/machine-learning/prompt-flow/tools-reference/troubleshoot-guidance.md

Lines changed: 4 additions & 4 deletions
@@ -170,16 +170,16 @@ Follow these steps to find Python packages installed in runtime:
 
 
 ### How to find the raw inputs and outputs of the LLM tool for further investigation?
 
-In prompt flow flow page with successful run, you can find the raw inputs and outputs of LLM tool in the output section. Click the `view full output` button to view full output.
+In prompt flow, on the flow page for a successful run and on the run detail page, you can find the raw inputs and outputs of the LLM tool in the output section. Select the `view full output` button to view the full output.
 
 :::image type="content" source="../media/faq/view-full-output.png" alt-text="Screenshot that shows view full output on LLM node." lightbox = "../media/faq/view-full-output.png":::
 
-`Trace` section include the each request and response to LLM tool, you can check raw message send to LLM model and raw response from LLM model.
+The `Trace` section includes each request and response to the LLM tool. You can check the raw message sent to the LLM model and the raw response from the LLM model.
 
-:::image type="content" source="../media/faq/trace-llm-tool.png" alt-text="Screenshot that shows raw request send to LLM model and response from LLM model." lightbox = "../media/faq/trace-llm-tool.png":::
+:::image type="content" source="../media/faq/trace-large-language-model-tool.png" alt-text="Screenshot that shows the raw request sent to the LLM model and the response from the LLM model." lightbox = "../media/faq/trace-large-language-model-tool.png":::
 
 ## How to fix a 409 error from Azure OpenAI?
 
-You may encounter 409 error from Azure OpenAI, it means you have reached the rate limit of Azure OpenAI. You can check the error message in the output section of LLM node. Lean more about [Azure OpenAI rate limit](../../../ai-services/openai/quotas-limits.md).
+You might encounter a 409 error from Azure OpenAI, which means you have reached the Azure OpenAI rate limit. You can check the error message in the output section of the LLM node. Learn more about [Azure OpenAI rate limits](../../../ai-services/openai/quotas-limits.md).
 
 :::image type="content" source="../media/faq/429-rate-limit.png" alt-text="Screenshot that shows the 429 rate limit error from Azure OpenAI." lightbox = "../media/faq/429-rate-limit.png":::
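The rate-limit guidance in the troubleshooting change above is usually handled in client code with a retry loop and exponential backoff. A minimal sketch (not part of this commit; `RateLimitError`, `call_with_backoff`, and `fake_model_call` are hypothetical stand-ins, not Azure SDK names):

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for the SDK's rate-limit exception (hypothetical)."""

def call_with_backoff(fn, max_retries=5, base_delay=1.0):
    """Retry fn() with exponential backoff plus jitter whenever the
    service reports that the rate limit was reached."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            # Wait base_delay, 2*base_delay, 4*base_delay, ... with jitter.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Demo: a fake model call that is throttled twice, then succeeds.
calls = {"n": 0}

def fake_model_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("rate limit reached")
    return "completion"

print(call_with_backoff(fake_model_call, base_delay=0.01))  # prints: completion
```

In a real flow you would wrap the actual Azure OpenAI request in place of `fake_model_call`, and tune `max_retries` and `base_delay` to your quota.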
