Commit a9a0338

Merge pull request #267644 from cloga/lochen/self-help-update
Update according to self-help recommendation
2 parents f559be9 + 98c5ba3 commit a9a0338

File tree

4 files changed: +23 −3 lines changed


articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md

Lines changed: 1 addition & 1 deletion
@@ -86,7 +86,7 @@ Workspace managed virtual network is the recommended way to support network isol
- If you have strict outbound rules, make sure you have opened the [Required public internet access](../how-to-secure-workspace-vnet.md#required-public-internet-access).
- Add the workspace MSI as `Storage File Data Privileged Contributor` to the storage account linked with the workspace. Follow step 2 in [Secure prompt flow with workspace managed virtual network](#secure-prompt-flow-with-workspace-managed-virtual-network).
- Meanwhile, you can follow [private Azure Cognitive Services](../../ai-services/cognitive-services-virtual-networks.md) to make them private.
-- If you want to deploy prompt flow in a workspace secured by your own virtual network, you can deploy it to an AKS cluster in the same virtual network. You can follow [Secure Azure Kubernetes Service inferencing environment](../how-to-secure-kubernetes-inferencing-environment.md) to secure your AKS cluster.
+- If you want to deploy prompt flow in a workspace secured by your own virtual network, you can deploy it to an AKS cluster in the same virtual network. You can follow [Secure Azure Kubernetes Service inferencing environment](../how-to-secure-kubernetes-inferencing-environment.md) to secure your AKS cluster. Learn more about [how to deploy prompt flow to an AKS cluster via code](./how-to-deploy-to-code.md).
- You can either create a private endpoint to the same virtual network or use virtual network peering to let them communicate with each other.
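The MSI role-assignment step above can be sketched with the Azure CLI. This is a non-runnable sketch: every ID and resource name below is a placeholder you must replace with your own values.

```bash
# Grant the workspace managed identity the storage role required by
# prompt flow (placeholder values throughout).
az role assignment create \
  --assignee-object-id "<workspace-msi-principal-id>" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage File Data Privileged Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```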
## Known limitations

articles/machine-learning/prompt-flow/tools-reference/llm-tool.md

Lines changed: 2 additions & 0 deletions
@@ -25,6 +25,8 @@ Prompt flow provides a few different large language model APIs:

> [!NOTE]
> We removed the `embedding` option from the LLM tool API setting. You can use an embedding API with the [embedding tool](embedding-tool.md).
+> Only key-based authentication is supported for the Azure OpenAI connection.
+> Don't use non-ASCII characters in the resource group name of your Azure OpenAI resource; prompt flow doesn't support this case.
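As a quick local sanity check for the non-ASCII restriction in the note above, you can validate a resource group name before creating the connection. The helper name is invented for illustration and is not part of the prompt flow SDK.

```python
# Hypothetical helper (not from prompt flow): verify that an Azure
# resource group name contains only ASCII characters, since prompt flow
# doesn't support non-ASCII names for the Azure OpenAI resource group.
def is_ascii_resource_group(name: str) -> bool:
    return name.isascii()

print(is_ascii_resource_group("my-openai-rg"))   # True
print(is_ascii_resource_group("openai-资源组"))  # False
```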
## Prerequisites

articles/machine-learning/prompt-flow/tools-reference/troubleshoot-guidance.md

Lines changed: 16 additions & 1 deletion
@@ -181,6 +181,21 @@ Follow these steps to find Python packages installed in compute instance runtime

:::image type="content" source="../media/faq/list-packages.png" alt-text="Screenshot that shows finding Python packages installed in runtime." lightbox = "../media/faq/list-packages.png":::

+### Runtime start failures when using a custom environment
+
+#### Compute instance (CI) runtime start failure when using a custom environment
+
+To use prompt flow as a runtime on a compute instance, you need to use the base image provided by prompt flow. If you want to add extra packages to the base image, follow [Customize environment with Docker context for runtime](../how-to-customize-environment-runtime.md) to create a new environment, and then use it to create the CI runtime.
+
+If you get `UserError: FlowRuntime on compute instance is not ready`, sign in to the terminal of the compute instance and run `journalctl -u c3-progenitor.service` to check the logs.
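The Docker-context customization above can be sketched as a Dockerfile. The base image tag and the extra package are illustrative assumptions; use the exact image reference given in the linked article.

```dockerfile
# Illustrative Docker context for a custom runtime environment.
# Build on the prompt flow base image; an arbitrary base image won't work.
FROM mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest

# Add extra packages on top of the base image. Don't install or pin
# promptflow / promptflow-tools here; they ship with the base image.
RUN pip install --no-cache-dir pandas
```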
+#### Automatic runtime start failure with requirements.txt or a custom base image
+
+Automatic runtime supports using `requirements.txt` or a custom base image in `flow.dag.yaml` to customize the image. We recommend `requirements.txt` for the common case; it uses `pip install -r requirements.txt` to install the packages. If you have dependencies beyond Python packages, follow [Customize environment with Docker context for runtime](../how-to-customize-environment-runtime.md) to build a new image on top of the prompt flow base image, and then use it in `flow.dag.yaml`. Learn more in [Update an automatic runtime (preview) on a flow page](../how-to-create-manage-runtime.md#update-an-automatic-runtime-preview-on-a-flow-page).
+
+- You can't use an arbitrary base image to create a runtime; you need to use the base image provided by prompt flow.
+- Don't pin the versions of `promptflow` and `promptflow-tools` in `requirements.txt`, because they're already included in the runtime base image. Using old versions of `promptflow` and `promptflow-tools` may cause unexpected behavior.
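A minimal sketch of the `flow.dag.yaml` customization described above, assuming a flow folder with a `requirements.txt` next to `flow.dag.yaml`; treat the fragment as illustrative of the flow DAG schema rather than authoritative.

```yaml
# flow.dag.yaml (fragment): let the automatic runtime install extra
# Python packages. Don't pin promptflow or promptflow-tools here.
environment:
  python_requirements_txt: requirements.txt

# Or, when you need more than Python packages, reference a custom image
# built on top of the prompt flow base image instead:
# environment:
#   image: <your-custom-image-built-on-the-promptflow-base-image>
```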
## Flow run related issues

### How to find the raw inputs and outputs of the LLM tool for further investigation?
@@ -197,4 +212,4 @@ In prompt flow, on flow page with successful run and run detail page, you can fi

You may encounter a 429 error from Azure OpenAI; it means you have reached the Azure OpenAI rate limit. You can check the error message in the output section of the LLM node. Learn more about the [Azure OpenAI rate limit](../../../ai-services/openai/quotas-limits.md).

-:::image type="content" source="../media/faq/429-rate-limit.png" alt-text="Screenshot that shows 429 rate limit error from Azure OpenAI." lightbox = "../media/faq/429-rate-limit.png":::
+:::image type="content" source="../media/faq/429-rate-limit.png" alt-text="Screenshot that shows 429 rate limit error from Azure OpenAI." lightbox = "../media/faq/429-rate-limit.png":::

articles/machine-learning/prompt-flow/tools-reference/vector-index-lookup-tool.md

Lines changed: 4 additions & 1 deletion
@@ -85,5 +85,8 @@ The following example is for a JSON format response returned by the tool, which
    }
  }
]
-
```
+
+## Deploying to an online endpoint
+
+When you deploy a flow containing the vector index lookup tool to an online endpoint, there's an extra step to set up permissions. During deployment through the web pages, there's a choice between system-assigned and user-assigned identity types. Either way, use the Azure portal (or similar functionality) to add the "AzureML Data Scientist" role for the Azure Machine Learning workspace to the identity assigned to the endpoint.
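The role assignment described above can also be sketched with the Azure CLI. This is a non-runnable sketch: the principal ID and scope are placeholders for your endpoint identity and workspace.

```bash
# Grant the endpoint's identity (system- or user-assigned) the
# AzureML Data Scientist role on the workspace (placeholder values).
az role assignment create \
  --assignee-object-id "<endpoint-identity-principal-id>" \
  --assignee-principal-type ServicePrincipal \
  --role "AzureML Data Scientist" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.MachineLearningServices/workspaces/<workspace>"
```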
