**articles/machine-learning/prompt-flow/how-to-secure-prompt-flow.md** (1 addition, 1 deletion)
@@ -86,7 +86,7 @@ Workspace managed virtual network is the recommended way to support network isolation.
- If you have a strict outbound rule, make sure you have opened the [Required public internet access](../how-to-secure-workspace-vnet.md#required-public-internet-access).
- Add the workspace MSI as `Storage File Data Privileged Contributor` to the storage account linked with the workspace. Follow step 2 in [Secure prompt flow with workspace managed virtual network](#secure-prompt-flow-with-workspace-managed-virtual-network).
- Meanwhile, you can follow [private Azure Cognitive Services](../../ai-services/cognitive-services-virtual-networks.md) to make them private.
- If you want to deploy prompt flow in a workspace secured by your own virtual network, you can deploy it to an AKS cluster in the same virtual network. Follow [Secure Azure Kubernetes Service inferencing environment](../how-to-secure-kubernetes-inferencing-environment.md) to secure your AKS cluster. Learn more about [how to deploy prompt flow to an AKS cluster via code](./how-to-deploy-to-code.md).
- You can either create a private endpoint to the same virtual network or use virtual network peering to let them communicate with each other.
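
As a sketch of the peering option: the two virtual networks can be peered with the Azure CLI. All names and IDs below are placeholders, and peering must be created in both directions (run the mirror command from the AKS cluster's virtual network as well):

```shell
# Peer the workspace virtual network with the AKS cluster's virtual network.
# Resource names and IDs are placeholders; substitute your own.
az network vnet peering create \
  --name workspace-to-aks \
  --resource-group my-rg \
  --vnet-name workspace-vnet \
  --remote-vnet "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Network/virtualNetworks/aks-vnet" \
  --allow-vnet-access
```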
**articles/machine-learning/prompt-flow/tools-reference/troubleshoot-guidance.md** (16 additions, 1 deletion)
@@ -181,6 +181,21 @@ Follow these steps to find Python packages installed in compute instance runtime
:::image type="content" source="../media/faq/list-packages.png" alt-text="Screenshot that shows finding Python packages installed in runtime." lightbox="../media/faq/list-packages.png":::
### Runtime start failures using custom environment
#### Compute instance (CI) runtime start failure using a custom environment
To use prompt flow as a runtime on a compute instance, you need to use the base image provided by prompt flow. If you want to add extra packages on top of the base image, follow [Customize environment with Docker context for runtime](../how-to-customize-environment-runtime.md) to create a new environment, then use it to create the compute instance runtime.
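
For example, the Docker context for such a custom environment might look like the following minimal sketch. The base image tag and the extra package are assumptions; confirm the current image name in the linked article:

```dockerfile
# Build on top of the prompt flow runtime base image
# (tag is an assumption; check the customize-environment article).
FROM mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest

# Add the extra packages your flow needs on top of the base image.
RUN pip install --no-cache-dir langchain
```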
If you get `UserError: FlowRuntime on compute instance is not ready`, sign in to the terminal of the compute instance and run `journalctl -u c3-progenitor.service` to check the logs.
#### Automatic runtime start failure with requirements.txt or custom base image
Automatic runtime supports using `requirements.txt` or a custom base image in `flow.dag.yaml` to customize the image. We recommend `requirements.txt` for the common case; it uses `pip install -r requirements.txt` to install the packages. If you have dependencies beyond Python packages, follow [Customize environment with Docker context for runtime](../how-to-customize-environment-runtime.md) to build a new image on top of the prompt flow base image, then use it in `flow.dag.yaml`. Learn more in [Update an automatic runtime on a flow page](../how-to-create-manage-runtime.md#update-an-automatic-runtime-preview-on-a-flow-page).
- You can't use an arbitrary base image to create a runtime; you need to use the base image provided by prompt flow.
- Don't pin the versions of `promptflow` and `promptflow-tools` in `requirements.txt`, because they're already included in the runtime base image. Using old versions of `promptflow` and `promptflow-tools` may cause unexpected behavior.
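
To make the two customization routes concrete, the `environment` section of `flow.dag.yaml` can be sketched roughly as follows. The registry, image name, and tag are placeholders:

```yaml
# In flow.dag.yaml: install extra Python packages on runtime start.
environment:
  python_requirements_txt: requirements.txt

# Or, for dependencies beyond Python packages, reference a custom image
# built on top of the prompt flow base image instead:
# environment:
#   image: <your-registry>/<your-image>:<tag>
```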
## Flow run related issues
### How to find the raw inputs and outputs of the LLM tool for further investigation?
@@ -197,4 +212,4 @@ In prompt flow, on the flow page with a successful run and on the run detail page, you can find …
You may encounter a 429 error from Azure OpenAI, which means you have reached the Azure OpenAI rate limit. You can check the error message in the output section of the LLM node. Learn more about [Azure OpenAI rate limits](../../../ai-services/openai/quotas-limits.md).
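
Beyond reading the error message, a common client-side mitigation for 429s is retrying with exponential backoff. The sketch below is illustrative and not part of the prompt flow runtime; `RateLimitError` is a stand-in for whatever 429 exception your Azure OpenAI client raises:

```python
import itertools
import time


class RateLimitError(Exception):
    """Stand-in for the 429 (rate limit) error an Azure OpenAI client raises."""


def call_with_backoff(fn, max_retries=5, base_delay=0.01):
    """Call fn, retrying on RateLimitError with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the 429 to the caller
            # Wait base_delay * 2^attempt before the next try.
            time.sleep(base_delay * (2 ** attempt))


# Simulate an endpoint that returns 429 twice, then succeeds.
attempts = itertools.count()

def flaky_completion():
    if next(attempts) < 2:
        raise RateLimitError("429: rate limit exceeded")
    return "completion text"

print(call_with_backoff(flaky_completion))  # -> completion text
```

In practice you would also honor the `Retry-After` header when the service returns one.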
:::image type="content" source="../media/faq/429-rate-limit.png" alt-text="Screenshot that shows 429 rate limit error from Azure OpenAI." lightbox="../media/faq/429-rate-limit.png":::
**articles/machine-learning/prompt-flow/tools-reference/vector-index-lookup-tool.md** (4 additions, 1 deletion)
@@ -85,5 +85,8 @@ The following example is for a JSON format response returned by the tool, which
    }
  }
]
```
## Deploying to an online endpoint
When you deploy a flow containing the vector index lookup tool to an online endpoint, there's an extra step to set up permissions. During deployment through the web pages, there's a choice between system-assigned and user-assigned identity types. Either way, use the Azure portal (or similar functionality) to assign the "AzureML Data Scientist" role on the Azure Machine Learning workspace to the identity assigned to the endpoint.
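
If you prefer scripting that role assignment, the Azure CLI equivalent looks roughly like the following. The principal ID and scope are placeholders for your endpoint identity and workspace:

```shell
# Grant the endpoint's managed identity the "AzureML Data Scientist" role
# on the workspace. IDs below are placeholders; substitute your own.
az role assignment create \
  --assignee "<endpoint-managed-identity-principal-id>" \
  --role "AzureML Data Scientist" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.MachineLearningServices/workspaces/<workspace-name>"
```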